Sai Goud Durgappagari, Vijay Kolagani, Dr. Yingcai Xiao


Leap Motion Sai Goud Durgappagari, Vijay Kolagani, Dr. Yingcai Xiao https://www.leapmotion.com/

What is Leap? The Leap Motion controller, also known as The Leap, was launched in 2012 by Leap Motion, Inc. It is an iPod-sized USB peripheral and supports custom gestures.

Components Two global shutter image sensors Infrared LEDs USB 3.0 controller

How does it work? The Leap Motion device acts as a camera, continuously taking stereoscopic images of the area in front of the sensor. It can track movements down to 1/100th of a millimeter. The images are scanned for hands, fingers, and pointables.

Leap Motion Programming: NUI (Natural User Interface), OO-EDP (Object-Oriented Event-Driven Programming)

Programming Leap Motion https://developer.leapmotion.com An NUI input device Tracks only hands and fingers Can be integrated with VR systems Over 200K programmers

Programming Leap Motion The Leap Motion Controller tracks hands and fingers and reports position, velocity, and orientation with low latency and good accuracy. The controller can be mounted on a VR headset or used on a tabletop. The Leap Motion controller system consists of a hardware device and a software component which runs as a service or daemon on the host computer. The software component analyzes images produced by the hardware and sends tracking information to applications. The Leap Motion Unity plugin connects to this service to get data.

Hand Tracking The Leap Motion controller uses optical sensors and infrared light. The sensors' field of view is about 150 degrees, and the interaction zone extends approximately 0.03 to 0.6 meters above the device. Detection and tracking work best when the controller has a clear, high-contrast, silhouette view of the hands and fingers. The Leap Motion software combines its sensor data with an internal model of the human hand to help cope with challenging tracking conditions.
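The field-of-view and range figures above define an approximate cone-shaped interaction volume. The sketch below makes that geometry concrete; the helper is purely illustrative (it is not part of the Leap SDK), using the 150-degree FOV and 0.03–0.6 m range from the slide.

```java
// Illustrative sketch: is a point (in meters, relative to the controller)
// inside the approximate interaction volume? Not a real Leap SDK call.
public class InteractionVolume {
    static boolean isTrackable(double x, double y, double z) {
        // Height range above the device, per the slide: 0.03 m to 0.6 m.
        if (y < 0.03 || y > 0.6) return false;
        // 150-degree total field of view => at most 75 degrees off vertical.
        double lateral = Math.sqrt(x * x + z * z);
        double angleDeg = Math.toDegrees(Math.atan2(lateral, y));
        return angleDeg <= 75.0;
    }

    public static void main(String[] args) {
        System.out.println(isTrackable(0.0, 0.3, 0.0)); // directly above, mid-range: true
        System.out.println(isTrackable(0.0, 1.0, 0.0)); // too high: false
    }
}
```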

SDK V1.0 Download the SDK here: https://drive.google.com/file/d/1m33yLgpmNSqF0B7SJvSCxvuYmq9vcAnq/view A sample program showing how to set up C# Windows Forms and Windows Presentation Foundation projects using Visual Studio: https://developer.leapmotion.com/documentation/csharp/devguide/Project_Setup.html

Leap Motion Programming: Data Structures

Tracking Model

Tracking Model
Frame: the Frame object is the root and provides access to all other tracked entities.
Hand: the Hand object describes the position and orientation of a hand, tracks its motion between frames, and contains lists of the fingers associated with that hand.
Arm: Arm objects describe the position, direction, and orientation of the arm to which a hand is attached. Arms can only be accessed through the Hand object.
Pointable, Finger: Pointable objects define the basic characteristics common to fingers. The Finger class extends Pointable with additional information specific to fingers.
Bone: Bone objects represent the position and orientation of a bone.
Image: Image objects provide the raw sensor data and calibration grid for the Leap Motion cameras.
Utility classes: Vector and Matrix.
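The containment hierarchy above can be sketched with plain classes: a Frame owns Hands, and a Hand owns its Fingers and Arm, so everything is reached by walking down from the Frame. These are simplified stand-ins with made-up field names, not the real SDK types.

```java
import java.util.ArrayList;
import java.util.List;

// Simplified stand-ins for the SDK's tracking classes, showing only
// the ownership structure. Field names are illustrative.
class Finger { String name; Finger(String n) { name = n; } }
class Arm { }
class Hand {
    Arm arm = new Arm();                      // an Arm is reached via its Hand
    List<Finger> fingers = new ArrayList<>();
}
class Frame {
    List<Hand> hands = new ArrayList<>();     // the Frame is the root
}

public class TrackingModelSketch {
    public static void main(String[] args) {
        Frame frame = new Frame();
        Hand hand = new Hand();
        hand.fingers.add(new Finger("index"));
        frame.hands.add(hand);
        // Walk down from the root to reach tracked entities:
        System.out.println(frame.hands.get(0).fingers.size()); // prints 1
    }
}
```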

Leap Motion Programming: Algorithms

Steps to OO-EDP
1. Initialization: set up the UI and prepare the event loop. Sometimes the platform takes care of all of this; you just need to create the objects for the events/handlers.
2. Registration: register the event handlers for the events of interest. Not needed if the names of the event handlers are hardcoded to the events.
3. Event loop: start the event loop, usually through a platform-provided API. Some platforms automatically start the event loop after your code finishes in the main function.
4. Event handlers: process the event; re-register the event handler if needed; invoke post-processing functions, directly or indirectly. The latter is done by creating another event which is handled by the post-processing functions.
5. Cleanup: release resources.
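The steps above can be sketched with a minimal hand-rolled dispatch loop. Real platforms (Swing, Unity, the Leap service) provide the loop for you; the event names and handlers below are made up for illustration.

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Queue;
import java.util.function.Consumer;

// Minimal sketch of the OO-EDP steps: register handlers, run the loop,
// dispatch each event to its handler, then clean up.
public class EventLoopSketch {
    static Map<String, Consumer<List<String>>> handlers = new HashMap<>();

    static List<String> run(List<String> pending) {
        List<String> log = new ArrayList<>();
        // 1-2. Initialization and registration: bind handlers to event names.
        handlers.put("connect", l -> l.add("Connected"));
        handlers.put("frame",   l -> l.add("New frame"));
        // 3. Event loop: dispatch until the queue drains.
        Queue<String> events = new ArrayDeque<>(pending);
        while (!events.isEmpty()) {
            String e = events.poll();
            // 4. Event handlers process each event as it arrives.
            handlers.getOrDefault(e, l -> { }).accept(log);
        }
        // 5. Cleanup: release resources.
        handlers.clear();
        return log;
    }

    public static void main(String[] args) {
        System.out.println(run(List.of("connect", "frame")));
        // prints [Connected, New frame]
    }
}
```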

Keys to OO-EDP
Events: NUI events are usually composite and described by vendor-defined classes (UDTs).
Registration: registration APIs are usually similar to those for callback functions. Sometimes registration is hardcoded in the handlers' names.
Event handlers: usually single-argument functions taking an event object as input. Sometimes there is a second argument identifying the sender (i.e., the event generator).

Tracking API for Applications without Unity3D: System Architecture for Native Application Development

Tracking API without Unity3D: System Architecture for the WebSocket Interface

Tracking API without Unity3D
To connect to the Leap Motion device, create a Controller object: Controller controller = new Controller();
Get Frame objects containing tracking data by calling the Controller.frame() function.
Foreground and background applications: the Leap Motion service/daemon only sends tracking data to the application that has the operating-system input focus.
Listening for Controller events: the Controller object dispatches a number of events using the Listener mechanism. To handle these events, extend the Listener class and implement the callback functions.

Tracking API without Unity3D
import com.leapmotion.leap.*;

public class LeapEventListener extends Listener {
    public void onFrame (Controller controller) {
        System.out.println("New Frame");
    }
    public void onInit (Controller controller) {
        System.out.println("Initialized");
    }
    public void onConnect (Controller controller) {
        System.out.println("Connected");
    }

Tracking API without Unity3D
    public void onDisconnect (Controller controller) {
        System.out.println("Disconnected");
    }
    public void onFocusGained (Controller controller) {
        System.out.println("Focus gained");
    }
    public void onFocusLost (Controller controller) {
        System.out.println("Focus lost");
    }
}

Tracking API without Unity3D
Controller controller = new Controller();
LeapEventListener listener = new LeapEventListener();
controller.addListener(listener);

A SampleListener class to define event handler functions:
class SampleListener
{
    public void OnServiceConnect(object sender, ConnectionEventArgs args)
    {
        Console.WriteLine("Service Connected");
    }
    public void OnConnect(object sender, DeviceEventArgs args)
    {
        Console.WriteLine("Connected");
    }
    public void OnFrame(object sender, FrameEventArgs args)
    {
        Console.WriteLine("Frame Available.");
    }
}

Properties of the Frame object:
public void OnFrame(object sender, FrameEventArgs args)
{
    // Get the most recent frame
    Frame frame = args.frame;
    Console.WriteLine(
        "Frame id: {0}, timestamp: {1}, hands: {2}",
        frame.Id, frame.Timestamp, frame.Hands.Count
    );
    foreach (Hand hand in frame.Hands)
    {
        Console.WriteLine("  Hand id: {0}, palm position: {1}, fingers: {2}",
            hand.Id, hand.PalmPosition, hand.Fingers.Count);
        // Get the hand's normal vector and direction
        Vector normal = hand.PalmNormal;
        Vector direction = hand.Direction;
        // Calculate the hand's pitch, roll, and yaw angles
        Console.WriteLine(
            "  Hand pitch: {0} degrees, roll: {1} degrees, yaw: {2} degrees",
            direction.Pitch * 180.0f / (float)Math.PI,
            normal.Roll * 180.0f / (float)Math.PI,
            direction.Yaw * 180.0f / (float)Math.PI
        );
    }
}

Leap Motion Programming: NUI OO-EDP with Unity3D

Setup in Unity3D Download the latest asset package from: https://developer.leapmotion.com/downloads/unity Open or create a project. Select the Unity Assets > Import Package > Custom Package menu command. Locate the downloaded asset package and click Open. The assets are imported into your project. Example scenes can be found in the LeapMotion/Scenes folder. Reference: https://www.leapmotion.com/

Tracking API for Unity3D The classes in the Leap.Unity namespace interact directly with GameObjects and other UnityEngine components. These scripts take tracking data and put it to use in Unity. LeapServiceProvider provides access to the tracking data and images, as well as access to the Leap.Controller object. The LeapServiceProvider transforms the tracking data so that the coordinates are relative to the transform of the Unity GameObject to which it is attached. LeapHandController provides access to the active hand graphics and physics objects.

Coordinate Systems Unity3D uses a left-handed coordinate system; the Leap Motion API uses a right-handed one. Unity3D default unit: meters. Leap Motion default unit: millimeters.
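The conversion implied by these two differences can be made explicit: flipping the z axis switches handedness, and dividing by 1000 converts millimeters to meters. The Unity plugin does this for you via LeapServiceProvider; this helper is only a sketch of the underlying math.

```java
// Sketch: convert a Leap Motion point (right-handed, millimeters)
// into Unity's convention (left-handed, meters). Illustrative only;
// the Unity plugin performs this transformation internally.
public class LeapToUnity {
    static double[] convert(double xMm, double yMm, double zMm) {
        return new double[] {
            xMm / 1000.0,   // mm -> m
            yMm / 1000.0,   // mm -> m
            -zMm / 1000.0   // flip z to switch handedness, mm -> m
        };
    }

    public static void main(String[] args) {
        double[] p = convert(100.0, 200.0, 50.0);
        System.out.println(p[0] + ", " + p[1] + ", " + p[2]);
        // prints 0.1, 0.2, -0.05
    }
}
```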

Unity Coordinate System (left-handed) Reference: https://www.leapmotion.com/

Demo http://www.cs.uakron.edu/~xiao/hci/EDP-Unity5-LeapMotion.zip EDP-Unity5-LeapMotion/Assets/Player.cs EDP-Unity5-LeapMotion/Assets/LeapMotion/DemoResources/HandCycler.cs

Leap Motion: Recent Advancements with VR Hand Tracking in Virtual Reality Virtual reality is a sensory input and stereo output system that transports the user into a virtual world. Current technology uses head-mounted displays as the primary means of sensory input. These displays use position and orientation tracking of various levels of sophistication to immerse the user in the virtual world and provide a sense of physical presence. Hand tracking can amplify that sense of presence.

Unity Coordinate System for VR Reference: https://www.leapmotion.com/

References
https://www.leapmotion.com/
https://developer.leapmotion.com/
https://developer.leapmotion.com/documentation/java/index.html
https://developer.leapmotion.com/documentation/unity/unity/Unity_Overview.html