Kinect H4x Gesture Recognition and Playback Tools (+Inspiration)
SDK Version 1.0 - Out TODAY
What's New?
- Ability to control which user(s) are under full skeletal tracking.
- "Near mode" enables interaction as close as 40cm from the device, and includes "too far" and "too close" depth indicators.
- Better everything.
- Mother of All Kinect Demos: the Kinect Explorer sample app shows off all the features (camera tilt, audio beam angles, etc.).
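A minimal configuration sketch for the two new features above, assuming the Kinect for Windows SDK 1.0 property and method names as we remember them (AppChoosesSkeletons, ChooseSkeletons, DepthRange.Near); double-check against the official docs, and note that the tracking id parameter is a placeholder you would pull from a SkeletonFrame.

using Microsoft.Kinect;

static class Sdk10Features
{
    // trackingId is a placeholder: grab a real TrackingId from a SkeletonFrame.
    public static void Configure(KinectSensor sensor, int trackingId)
    {
        // Streams must be enabled before they can be tweaked.
        sensor.DepthStream.Enable(DepthImageFormat.Resolution640x480Fps30);
        sensor.SkeletonStream.Enable();

        // "Near mode": accept depth readings starting roughly 40cm from the device.
        sensor.DepthStream.Range = DepthRange.Near;

        // Take control of which user(s) get full skeletal tracking.
        sensor.SkeletonStream.AppChoosesSkeletons = true;
        sensor.SkeletonStream.ChooseSkeletons(trackingId);

        sensor.Start();
    }
}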
Gesture Recognition: Dynamic Time Warping
A sequence-matching algorithm that can adapt to sequences that vary in speed and time (think Levenshtein distances, but generalized to matching any sort of input to stored data).
It measures the similarity between two sequences based on a cost function of how much it needs to "warp" the points forward/backward in time to have them line up.
In Kinect-land, this means an algorithm that can take streaming joint data and quickly find the closest match to a gesture 'on record'.
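For intuition, here is a minimal, self-contained DTW distance in C#. It is not the sample's DtwGestureRecognizer, just the core dynamic program this kind of recognizer is built on: each frame is a double[] of features, and the cost matrix lets a frame match, skip, or repeat its counterpart, whichever warping is cheapest.

using System;

static class Dtw
{
    // Distance between two sequences of per-frame feature vectors
    // (e.g. flattened 2D joint positions). Lower = more similar. O(n*m).
    public static double Distance(double[][] a, double[][] b)
    {
        int n = a.Length, m = b.Length;
        var cost = new double[n + 1, m + 1];
        for (int i = 0; i <= n; i++)
            for (int j = 0; j <= m; j++)
                cost[i, j] = double.PositiveInfinity;
        cost[0, 0] = 0;

        for (int i = 1; i <= n; i++)
        {
            for (int j = 1; j <= m; j++)
            {
                double d = Euclidean(a[i - 1], b[j - 1]);
                // Warp by matching, skipping, or repeating a frame, whichever is cheapest.
                cost[i, j] = d + Math.Min(cost[i - 1, j - 1],
                                          Math.Min(cost[i - 1, j], cost[i, j - 1]));
            }
        }
        return cost[n, m];
    }

    static double Euclidean(double[] x, double[] y)
    {
        double sum = 0;
        for (int k = 0; k < x.Length; k++)
            sum += (x[k] - y[k]) * (x[k] - y[k]);
        return Math.Sqrt(sum);
    }
}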
There's an App for that -- DEMO
But how does it work? And how can I work it into my own project? Three classes:
- Skeleton2DDataExtract.cs -- takes in Kinect SDK joint data, spits out normalized 2D skeleton points.
- Skeleton2DdataCoordEventArgs.cs -- defines the event args that get emitted from the Skeleton2DDataExtract event handler once a frame has been processed.
- DtwGestureRecognizer.cs -- parses the 2D skeleton data; call Recognize() to match against loaded gestures (see the code for a loading/saving example).
Recognizing A Gesture -- Code
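The demo code itself is not reproduced here. As a self-contained stand-in, below is a sketch of the recognition loop built on the Dtw.Distance helper from the earlier slide rather than the sample's DtwGestureRecognizer: keep a sliding window of recent frames and report the stored template with the lowest warped distance. The window size and threshold are made-up values you would tune for your own gesture set.

using System.Collections.Generic;
using System.Linq;

class SimpleRecognizer
{
    // Stored gestures: name -> recorded sequence of flattened 2D joint frames.
    readonly Dictionary<string, double[][]> _templates = new Dictionary<string, double[][]>();
    readonly List<double[]> _window = new List<double[]>();   // most recent live frames
    const int WindowSize = 32;       // assumed window length
    const double Threshold = 2.0;    // assumed "close enough" cutoff; tune per gesture set

    public void AddOrUpdate(string name, double[][] sequence)
    {
        _templates[name] = sequence;
    }

    // Feed one frame of normalized 2D skeleton data per SkeletonFrameReady event.
    // Returns the name of the best-matching gesture, or null if nothing is close enough.
    public string OnFrame(double[] observationPoint)
    {
        _window.Add(observationPoint);
        if (_window.Count > WindowSize) _window.RemoveAt(0);
        if (_window.Count < WindowSize) return null;          // not enough history yet

        double[][] live = _window.ToArray();
        var best = _templates
            .Select(t => new { t.Key, Cost = Dtw.Distance(live, t.Value) })
            .OrderBy(x => x.Cost)
            .FirstOrDefault();

        return (best != null && best.Cost < Threshold) ? best.Key : null;
    }
}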
Really Advanced Hacks The DtwGestureRecognizer can flexibly match any vectorized data stream. We happened to use skeleton data in our example, but it should be fairly simple to incorporate a depth stream or color stream. Just ensure that each of the sequence objects you pass into AddOrUpdate and Recognize is an array of doubles (e.g. double[] observationPoint).
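As an illustration of the "any vectorized stream" point, here is one (made-up) way to squash a depth frame into a fixed-length double[] so the same matcher can consume it; the 8-bin depth histogram is purely illustrative and assumes the array already holds plain millimetre values (strip any packed player-index bits first).

using System;
using System.Linq;

static class DepthFeatures
{
    public static double[] ToObservation(short[] depthMillimeters)
    {
        var bins = new double[8];
        foreach (short mm in depthMillimeters)
        {
            if (mm <= 0) continue;                          // skip unknown pixels
            int bin = Math.Min(bins.Length - 1, mm / 500);  // 0.5m-wide bins
            bins[bin]++;
        }
        // Normalize so the feature does not depend on frame resolution.
        double total = bins.Sum();
        if (total > 0)
            for (int i = 0; i < bins.Length; i++) bins[i] /= total;
        return bins;
    }
}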
Working on P4 Collaboratively: Recording and Replaying Skeleton Data
Because many teams share a single Kinect, it is useful to be able to record a sequence of skeleton data, write it to a file, and replay it back through a dummy nui.SkeletonFrameReady handler. Fortunately, the Kinect Toolbox (not to be confused with the Coding4Fun Kinect Toolkit) lets us do just that. Get it at: http://kinecttoolbox.codeplex.com/
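If you would rather not take the dependency, the same idea can be hand-rolled. The sketch below is not the Kinect Toolbox API (prefer the toolbox's own recorder/replay types and its docs); it just writes one line per frame in a made-up "timestampMs;x,y,z;x,y,z;..." format and plays the file back through whatever handler you pass in, at roughly the recorded intervals.

using System;
using System.Collections.Generic;
using System.Globalization;
using System.IO;
using System.Linq;
using System.Threading;

static class SkeletonTape
{
    // Append one frame: a timestamp plus each joint's coordinates.
    public static void Append(string path, long timestampMs, IEnumerable<double[]> joints)
    {
        string line = timestampMs + ";" + string.Join(";",
            joints.Select(j => string.Join(",",
                j.Select(v => v.ToString(CultureInfo.InvariantCulture)))));
        File.AppendAllText(path, line + Environment.NewLine);
    }

    // Replay the file, invoking the handler at roughly the recorded intervals --
    // a stand-in for a dummy SkeletonFrameReady event.
    public static void Replay(string path, Action<double[][]> frameReady)
    {
        long previous = -1;
        foreach (string line in File.ReadLines(path))
        {
            string[] parts = line.Split(';');
            long timestamp = long.Parse(parts[0], CultureInfo.InvariantCulture);
            double[][] joints = parts.Skip(1)
                .Select(p => p.Split(',')
                    .Select(v => double.Parse(v, CultureInfo.InvariantCulture)).ToArray())
                .ToArray();

            if (previous >= 0)
                Thread.Sleep((int)Math.Max(0, timestamp - previous));
            previous = timestamp;

            frameReady(joints);   // feed the same code path the live handler would use
        }
    }
}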
Demo -- Sample Code Walkthrough -- Saving and Playing Back Gestures
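The walkthrough itself is not reproduced here. As a rough stand-in for the sample's gesture save/load code, here is a sketch that persists recorded gesture sequences (pairing with the SimpleRecognizer sketch above) so teammates can share them without a Kinect attached; the file layout ("@name" header, one comma-separated frame per line, blank line between gestures) is made up.

using System.Collections.Generic;
using System.Globalization;
using System.IO;
using System.Linq;

static class GestureFile
{
    public static void Save(string path, Dictionary<string, double[][]> gestures)
    {
        using (var w = new StreamWriter(path))
        {
            foreach (var g in gestures)
            {
                w.WriteLine("@" + g.Key);
                foreach (double[] frame in g.Value)
                    w.WriteLine(string.Join(",",
                        frame.Select(v => v.ToString(CultureInfo.InvariantCulture))));
                w.WriteLine();   // blank line ends the gesture
            }
        }
    }

    public static Dictionary<string, double[][]> Load(string path)
    {
        var gestures = new Dictionary<string, double[][]>();
        string name = null;
        var frames = new List<double[]>();
        // Trailing "" guarantees the final gesture gets flushed.
        foreach (string line in File.ReadLines(path).Concat(new[] { "" }))
        {
            if (line.StartsWith("@")) { name = line.Substring(1); frames.Clear(); }
            else if (line.Length == 0 && name != null) { gestures[name] = frames.ToArray(); name = null; }
            else if (name != null)
                frames.Add(line.Split(',')
                    .Select(v => double.Parse(v, CultureInfo.InvariantCulture)).ToArray());
        }
        return gestures;
    }
}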
Candescent NUI (Demo)
The Cool News: hand + finger tracking!
The Bad News: a behemoth, undocumented code library. Project available at http://candescentnui.codeplex.com
The Kinect SDK used to provide depth values only from 800mm out, while the finger-tracking code only works in the range of 0.8-1m, so using this project in conjunction with the Kinect SDK was difficult. UPDATE: the new SDK 1.0 provides depth values from 400mm out, so it is compatible with Candescent.
Alternative: OpenNI + NITE uses the raw point cloud to make best-guess tracking estimates < 800mm away. Good community and documentation at OpenNI.org / OpenKinect.org.
Anant - Shape Game Walkthrough
Inspirational Project: Deixis Application to Children's Education Games
Main Idea: Ask children to point and verbally identify ("this one!") a subset of objects (numbers, colors, animals).
Description + Video: http://www.cs.utoronto.ca/~uzmakhan/HCIProject.html
Inspirational Hacks: Gesture-Enabled Garden Hose
Main Idea: Use servos (simple motors) in conjunction with Netduinos (network-enabled microcontrollers) to control the servo via Kinect gestures: http://channel9.msdn.com/coding4fun/kinect/Kinect-to-Netduino-Cross-post
In Practice: http://channel9.msdn.com/coding4fun/kinect/A-Kinect--Netduino-controlled-squirt-gun
http://www.youtube.com/watch?v=FWINsKcP8oQ
Inspirational Hacks Pt. 2 Gesture Based Electronic Music Performance http://vimeo.com/channels/pulseproject#30301433
Inspirational Hacks Pt. 3 EDEN: Interactive Ecosystem Simulation Software http://vimeo.com/31940579 Main Idea: Create a topographical landscape on the iPad, fill it with (simulated) water, project it onto a sandscape via depth data with an overhead Kinect + projector. Play with the sand to change the climate and topography to terraform your own sandscape.
More Kinect
Telepresence - http://www.youtube.com/watch?v=ecMOX8_GeRY
Home Security Camera - http://www.youtube.com/watch?v=UfGOR1Eg_qQ
Living Paintings - http://www.youtube.com/watch?v=UjDaHMKwQl4
Visually Impaired Navigation Tool - http://www.youtube.com/watch?v=l6QY-eb6NoQ
ZigFu: a single bundle that installs NITE, OpenNI, and the PrimeSense Sensor driver, everything you need to work outside of the official SDK: http://zigfu.com/devtools.html