
Presentation transcript:

daenet
- Established in 1998
- Gold Certified Partner; WPC Partner of the Year 2005 and 2010
- Regular “Inner Circle” / “Core Partner” with Microsoft
- Featured in the joint Microsoft-Gartner newsletter and in the Gartner Magic Quadrant
- More than 15 years of experience in the integration and architecture space
- V-TSP
- Integration consulting, coaching, implementation, outsourcing
- Scrum Master, Trainer, Coach
- Passionate about innovation: community, user group, presentations
- Geos: Western Europe (Germany, HQ), Americas (USA), South-eastern Europe

A new kind of camera that can see in 3D, plus:
- A traditional video camera
- A 4-element microphone array
- A powerful SDK that can recognize that a person is in front of the sensor, and what position their body is in
- A software-controllable directional microphone and speech-recognition technology
- And lots more

Training | Retail | Healthcare | Therapy | Education

Power Light | RGB Camera | IR Emitters | Microphone Array

- Color (1920x1080) at 30 FPS (15 FPS in low light)
- Infrared at 30 FPS
- Depth at 30 FPS
- Body-index pixel mask at 30 FPS
- Person detection and joint tracking, up to 6 people, at 30 FPS
- Floor plane detection
- Transformation between coordinate systems
- Event-driven and polling-based APIs

J. Sell and P. O’Connor, “The Xbox One System on a Chip and Kinect Sensor,” IEEE Micro, 2014
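The “transformation between coordinate systems” listed above boils down to pinhole-camera math: the SDK exposes it through its coordinate mapper, but the back-projection from a depth pixel to a 3D camera-space point can be sketched as follows. The intrinsics `fx`, `fy`, `cx`, `cy` below are illustrative assumptions, not the sensor’s actual calibration:

```python
def depth_to_camera_space(u, v, depth_mm, fx, fy, cx, cy):
    """Back-project depth pixel (u, v) with depth in millimetres into
    a 3D camera-space point in metres, using the pinhole model."""
    z = depth_mm / 1000.0
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return (x, y, z)

# Illustrative (assumed) intrinsics for a 512x424 depth image; a point
# at the principal pixel maps straight down the optical axis:
point = depth_to_camera_space(256, 212, 2000, fx=365.0, fy=365.0,
                              cx=256.0, cy=212.0)
print(point)  # (0.0, 0.0, 2.0)
```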

Audio | Color | Infrared | Depth | BodyIndex | Body

Engagement | Targeting | Press | Panning/Zoom

- Engagement for global commands: “Xbox” (Xbox One w/ Kinect); N/A (Windows 8.0 or 8.1 w/ Kinect)
- Engagement for app commands: “Xbox Select” (Xbox One w/ Kinect); “Kinect” or … (Windows 8.0 or 8.1 w/ Kinect)
- “In experience” commands: app-defined grammar

- Infrared: 13 MB/s
- Depth: 13 MB/s
- BodyFrame / BodyIndex
- Color: 120 MB/s
- Audio: 32 KB/s
Legend: Record/Play | Record Only
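Given the per-stream rates above, a quick back-of-the-envelope sketch shows why recordings with the color stream enabled grow so fast (the function name and stream keys are illustrative, not part of the SDK):

```python
# Per-stream data rates from the slide, in MB/s.
RATES_MB_S = {"color": 120, "infrared": 13, "depth": 13}

def recording_size_mb(seconds, streams):
    """Rough size of a recording that captures the given streams."""
    return sum(RATES_MB_S[s] for s in streams) * seconds

# A 60-second clip with depth + infrared only (color excluded):
size = recording_size_mb(60, ["depth", "infrared"])
print(size, "MB")  # 1560 MB
```

With color added, the same minute would be roughly 9 GB, which is why color is often left out of long captures.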

V2.0

Gesture detection: heuristic, or machine learning (ML) with Gesture Builder (G.B.)

Inputs: floor normal, spine vector, upper-leg vector, and the angle between the floor normal and each bone.
- Standing: spine less than 45˚ from the floor normal; upper legs less than 45˚ from the floor normal
- Sitting: spine less than 45˚ from the floor normal; upper legs more than 45˚ from the floor normal
- Lying: spine more than 45˚ from the floor normal; upper legs more than 45˚ from the floor normal
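The standing/sitting/lying rules above reduce to comparing two bone angles against 45˚. A minimal sketch, assuming an upward floor normal of (0, 1, 0); in practice the SDK’s detected floor plane would supply the normal:

```python
import math

def angle_to_floor_normal(bone, floor_normal=(0.0, 1.0, 0.0)):
    """Angle in degrees between a bone vector and the floor normal."""
    dot = sum(b * n for b, n in zip(bone, floor_normal))
    norms = (math.sqrt(sum(b * b for b in bone))
             * math.sqrt(sum(n * n for n in floor_normal)))
    return math.degrees(math.acos(dot / norms))

def classify_posture(spine_vec, upper_leg_vec, threshold_deg=45.0):
    """The slide's heuristic: compare spine and upper-leg angles
    against the floor normal."""
    spine_upright = angle_to_floor_normal(spine_vec) < threshold_deg
    legs_upright = angle_to_floor_normal(upper_leg_vec) < threshold_deg
    if spine_upright and legs_upright:
        return "standing"
    if spine_upright:
        return "sitting"
    return "lying"

# Upright spine, near-horizontal thighs:
print(classify_posture((0, 1, 0), (1, 0.1, 0)))  # sitting
```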

In your application:
1. Record example gestures using Kinect Studio
2. Tag gestures using Gesture Builder
3. Build and analyze gestures using Gesture Builder
4. Preview gestures in Gesture Builder
5. Call the gesture detector for tracked bodies
Rinse and repeat.
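Step 5, calling the gesture detector for tracked bodies, usually involves debouncing the per-frame confidence the detector emits so that one physical gesture fires exactly one event. A hypothetical smoothing sketch (the `GestureTrigger` class and its thresholds are assumptions for illustration, not part of the SDK):

```python
from collections import deque

class GestureTrigger:
    """Fire once when the windowed mean confidence crosses a
    threshold, and re-arm once it drops back below."""

    def __init__(self, window=5, threshold=0.8):
        self._scores = deque(maxlen=window)
        self._threshold = threshold
        self._active = False

    def update(self, confidence):
        """Feed one frame's confidence in [0, 1]; return True only
        on the rising edge of a detection."""
        self._scores.append(confidence)
        mean = sum(self._scores) / len(self._scores)
        if not self._active and mean >= self._threshold:
            self._active = True
            return True
        if self._active and mean < self._threshold:
            self._active = False
        return False

# One detection fires for a burst of high-confidence frames:
trigger = GestureTrigger(window=2, threshold=0.8)
events = [trigger.update(c) for c in [0.5, 0.9, 0.95, 0.95, 0.2]]
print(events)  # [False, False, True, False, False]
```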

Fusion | Face | HD Face

- Integrate functionality into the controlling application
- Visualize work per joint for linear and angular motion
- Display velocity and acceleration vectors
- Display energy-expenditure rate and total energy expenditure
- Display activity level
- Display fall-detection state
- Display detected posture
- Center of mass
- Enable/disable “jitter” filters
- Enable logging
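Velocity and acceleration vectors like those listed above can be estimated from successive joint positions by finite differences. A sketch (real implementations would typically filter jitter before differentiating):

```python
def finite_differences(positions, dt):
    """Estimate per-frame velocity and acceleration of a joint from
    sampled 3D positions using first/second finite differences."""
    vel = [tuple((b - a) / dt for a, b in zip(p0, p1))
           for p0, p1 in zip(positions, positions[1:])]
    acc = [tuple((b - a) / dt for a, b in zip(v0, v1))
           for v0, v1 in zip(vel, vel[1:])]
    return vel, acc

# A joint moving 0.3 m per frame along x, sampled at 30 Hz, should
# show ~9 m/s velocity and ~zero acceleration:
pos = [(0.0, 1.0, 2.0), (0.3, 1.0, 2.0), (0.6, 1.0, 2.0)]
vel, acc = finite_differences(pos, dt=1 / 30)
```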

Idea: compare Kinect-based energy-expenditure (EE) estimation with a third-party EE estimation method.
- Perform physical exercise with two different models of a BodyMedia SenseWear armband worn at the same time
- Armband sensors: accelerometer, heat flux, near-body ambient temperature, galvanic skin response, skin temperature
- Test subject: male, 43 yrs, 178 cm, 72.6 kg, non-smoker, right-handed

- Run in place
- Pause
- Intense squats and jumps
- Lighter but mixed-intensity running, jumps/squats
- Warmup: running in place
- Cooldown

Andreas Erben +1 (206) (179) HackAndi.com

1. Process function: provides the “six degrees of freedom” transform; can be rendered and used to begin the fusion
2. Create reconstruction: pass in a set of parameters defining the environment and processing type; outputs the reconstruction
3. Other considerations: color can be enabled to be lifted from the scan; experimental pose finders (6DoF trackers) are also included to play with

Fusion | Face | HD Face

- Detection: outputs a bounding box around the face; can be visualized in color or IR
- Alignment: identifies 5 facial landmarks on the face; operation performed in color or IR
- Orientation: returns a quaternion of the head joint with respect to the sensor; a quaternion prevents gimbal lock
- Expressions: provides classifiers for happy, left/right eye open, engagement, mouth open, and mouth moving

Alignment and detection perform the operation in IR, but results are converted to color using the coordinate mapper for output.

Fusion | Face | HD Face
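For display, the head-orientation quaternion is often converted to pitch/yaw/roll angles. A standard conversion sketch (the axis conventions here are an assumption and must be matched to your coordinate system before use):

```python
import math

def quaternion_to_euler(w, x, y, z):
    """Convert a unit quaternion to (pitch, yaw, roll) in degrees.
    Axis conventions are illustrative; match them to your system."""
    pitch = math.degrees(math.atan2(2 * (w * x + y * z),
                                    1 - 2 * (x * x + y * y)))
    s = max(-1.0, min(1.0, 2 * (w * y - z * x)))  # clamp for asin
    yaw = math.degrees(math.asin(s))
    roll = math.degrees(math.atan2(2 * (w * z + x * y),
                                   1 - 2 * (y * y + z * z)))
    return pitch, yaw, roll

# Identity quaternion means the head faces the sensor squarely:
print(quaternion_to_euler(1.0, 0.0, 0.0, 0.0))  # (0.0, 0.0, 0.0)
```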

1. Face Model Builder: one of a cluster of classes that implement the capture interaction; an interactive API that provides collection status and evaluation of the frames you’ve collected
2. Face Model: a set of 94 shape units, plus scale, hair color, and skin color; all adjustments are set against an “average face” mesh that is deformed by the shape units acquired during the face capture
3. Use the mesh: the mesh is analogous to most animation meshes, for easy application to other rigs; mesh topology is the same for all faces and is represented in a standard way (numTriangles, numVertices, etc.)

UX is a non-trivial consideration in a good implementation of the Face Model Builder.

Fusion | Face | HD Face
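The “average face deformed by shape units” model described above is a linear blendshape: each output vertex is the average-face vertex plus a weighted sum of per-vertex displacement bases. A toy-sized sketch (the real model has 94 shape units; the dimensions and helper name here are illustrative):

```python
def deform_mesh(average, shape_units, weights):
    """Linear blendshape: vertex = average + sum_i w_i * basis_i.
    `average` is a list of (x, y, z) vertices; each entry of
    `shape_units` is a per-vertex displacement basis of the same
    shape as `average`."""
    out = []
    for v_idx, base in enumerate(average):
        vert = list(base)
        for basis, w in zip(shape_units, weights):
            for axis in range(3):
                vert[axis] += w * basis[v_idx][axis]
        out.append(tuple(vert))
    return out

# Toy example: 2 vertices, 1 shape unit pushing vertex 0 along +x.
avg = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
su = [[(1.0, 0.0, 0.0), (0.0, 0.0, 0.0)]]
print(deform_mesh(avg, su, weights=[0.5]))
# [(0.5, 0.0, 0.0), (1.0, 0.0, 0.0)]
```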

Per-joint heuristic:
- Interval to fall down: sum of max downward force per joint over a threshold; important joints (spine, shoulders, hips) close to the floor
- Interval to recover: important joint locations above a threshold height

Head-based heuristic:
- Interval to fall down: head position moves downward more than a threshold value; base of spine and head close to the floor
- Interval to recover: head and spine not within the recovery threshold distance
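The head-based variant can be sketched as a simple check over a window of head heights: did the head drop by more than a threshold, and does it end up close to the floor? The `detect_fall` helper and its thresholds are illustrative assumptions, not the presented implementation:

```python
def detect_fall(head_heights, drop_threshold=0.7, floor_threshold=0.3):
    """Flag a fall when the head drops by more than `drop_threshold`
    metres within the window and ends up below `floor_threshold`
    metres (i.e. close to the floor). Heights are in metres."""
    return (max(head_heights) - head_heights[-1] > drop_threshold
            and head_heights[-1] < floor_threshold)

# Head trajectory sampled over a fall: standing ~1.7 m down to floor.
print(detect_fall([1.7, 1.5, 0.9, 0.2]))   # True
# Normal standing motion does not trigger:
print(detect_fall([1.7, 1.6, 1.65, 1.7]))  # False
```

Recovery would be the mirrored check (head and spine rising back above the recovery threshold), gated over an interval so brief bounces do not flip the state.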