Introduction
Kinect for Xbox 360, referred to simply as Kinect, is a motion-sensing peripheral developed by Microsoft for the Xbox 360 video game console and for Windows PCs.

Hardware
The Kinect sensor is a motion-sensing device capable of calculating the depth of a scene. It combines an IR projector, an IR camera, an RGB camera and a multi-array microphone. The IR projector casts a speckle pattern of infrared dots onto the scene; the IR camera reads the pattern back, and the depth of the objects in the scene is calculated from the deformation of that pattern.

Software
- Microsoft Visual Studio 2012, C#
- Kinect SDK v1.7: the SDK exposes the full resources and potential of the Kinect sensor, and provides human tracking and motion gesture recognition tools.

Human Tracking
The Kinect can locate a person in the scene using its IR depth camera. It captures depth information about the environment, looks for the largest moving object in the scene, and then infers body parts from a decision tree trained on a large number of examples. A set of 20 points representing 20 joint positions is produced; each joint value describes the 3D position of a specific joint with respect to the Kinect sensor (a minimal code sketch of reading this data is given after the control list below).

Validation
The system allows access to an authorised person only. It records the skeletal features of that person, namely height, arm length and body width, and uses them to distinguish the authorised person from others when there are many people in the scene.

Gesture and Voice Controls
- The person can move the Tank with either hand; the Tank moves in accordance with the motion of that hand.
- The Tank can be fixed at a position by raising the opposite hand to the chest. Once fixed, it does not move, irrespective of the motion of the controlling hand.
- The Tank can be made to move again by raising both hands, and control can be passed to the opposite hand by touching both hands together.
- The simulation scenario can be changed by raising both hands up.
- Any target in any scenario can be fired at by voice commands (see the sketches that follow this list).
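
The poster does not include source code, but the skeletal data described above is exposed through the Kinect for Windows SDK v1.x managed API. The following C# sketch shows one way to read the 20 tracked joint positions; the class name and the console output are illustrative only, not the project's actual implementation.

    using System;
    using System.Linq;
    using Microsoft.Kinect;   // Kinect for Windows SDK v1.7 managed API

    class SkeletonReader
    {
        private KinectSensor sensor;

        public void Start()
        {
            // Use the first connected Kinect sensor.
            sensor = KinectSensor.KinectSensors
                                 .FirstOrDefault(s => s.Status == KinectStatus.Connected);
            if (sensor == null) return;

            sensor.SkeletonStream.Enable();                 // enable 20-joint skeletal tracking
            sensor.SkeletonFrameReady += OnSkeletonFrameReady;
            sensor.Start();
        }

        private void OnSkeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
        {
            using (SkeletonFrame frame = e.OpenSkeletonFrame())
            {
                if (frame == null) return;

                var skeletons = new Skeleton[frame.SkeletonArrayLength];
                frame.CopySkeletonDataTo(skeletons);

                foreach (Skeleton skeleton in skeletons)
                {
                    if (skeleton.TrackingState != SkeletonTrackingState.Tracked) continue;

                    // Each of the 20 joints carries a 3D position (in metres)
                    // relative to the Kinect sensor.
                    SkeletonPoint rightHand = skeleton.Joints[JointType.HandRight].Position;
                    Console.WriteLine("Right hand at ({0:F2}, {1:F2}, {2:F2})",
                                      rightHand.X, rightHand.Y, rightHand.Z);
                }
            }
        }
    }

In the simulation, a hand position read in this way is what drives the Tank, as described in the control list above.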
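
The exact gesture thresholds and validation features are not given on the poster. The sketch below assumes one plausible interpretation: the "raise the opposite hand to the chest" lock gesture is detected from the distance between the opposite hand and the ShoulderCenter joint (an assumed 0.25 m tolerance), and height and arm length are estimated from joint-to-joint distances. All helper names and constants are assumptions, not the project's actual code.

    using System;
    using Microsoft.Kinect;

    static class GestureHelpers
    {
        // Euclidean distance between two tracked joint positions (metres).
        private static double Distance(SkeletonPoint a, SkeletonPoint b)
        {
            double dx = a.X - b.X, dy = a.Y - b.Y, dz = a.Z - b.Z;
            return Math.Sqrt(dx * dx + dy * dy + dz * dz);
        }

        // "Lock" gesture: the hand opposite the controlling hand is raised to the chest.
        // ShoulderCenter stands in for the chest; 0.25 m is an assumed tolerance.
        public static bool IsTankLocked(Skeleton s, bool controllingWithRightHand)
        {
            JointType oppositeHand = controllingWithRightHand ? JointType.HandLeft
                                                              : JointType.HandRight;
            SkeletonPoint hand  = s.Joints[oppositeHand].Position;
            SkeletonPoint chest = s.Joints[JointType.ShoulderCenter].Position;
            return Distance(hand, chest) < 0.25;
        }

        // Rough skeletal features for validating the authorised user:
        // standing height (head to foot) and right-arm length (shoulder-elbow-wrist).
        public static double EstimateHeight(Skeleton s)
        {
            return Distance(s.Joints[JointType.Head].Position,
                            s.Joints[JointType.FootRight].Position);
        }

        public static double EstimateArmLength(Skeleton s)
        {
            return Distance(s.Joints[JointType.ShoulderRight].Position,
                            s.Joints[JointType.ElbowRight].Position)
                 + Distance(s.Joints[JointType.ElbowRight].Position,
                            s.Joints[JointType.WristRight].Position);
        }
    }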
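
Voice commands with the Kinect SDK v1.x are typically handled by the Microsoft.Speech recognition engine fed from the Kinect microphone array, as in the SDK speech samples. The sketch below follows that pattern; the command phrases ("fire", "change scenario") and the confidence threshold are illustrative assumptions rather than the project's actual grammar.

    using System;
    using System.Linq;
    using Microsoft.Kinect;
    using Microsoft.Speech.AudioFormat;
    using Microsoft.Speech.Recognition;

    class VoiceCommands
    {
        public void Start(KinectSensor sensor)
        {
            // Pick an installed en-US speech recognizer.
            RecognizerInfo recognizer = SpeechRecognitionEngine.InstalledRecognizers()
                .FirstOrDefault(r => r.Culture.Name.Equals("en-US", StringComparison.OrdinalIgnoreCase));
            if (recognizer == null) return;

            var engine = new SpeechRecognitionEngine(recognizer.Id);

            // Small command grammar; the phrases are illustrative, not from the project.
            var commands = new Choices();
            commands.Add(new SemanticResultValue("fire", "FIRE"));
            commands.Add(new SemanticResultValue("change scenario", "CHANGE_SCENARIO"));

            var builder = new GrammarBuilder { Culture = recognizer.Culture };
            builder.Append(commands);
            engine.LoadGrammar(new Grammar(builder));

            engine.SpeechRecognized += (s, e) =>
            {
                if (e.Result.Confidence < 0.5) return;   // assumed confidence threshold
                string command = (string)e.Result.Semantics.Value;
                Console.WriteLine("Recognised command: " + command);
            };

            // Feed the recognizer from the Kinect microphone array (16 kHz, 16-bit mono PCM).
            var audioStream = sensor.AudioSource.Start();
            engine.SetInputToAudioStream(audioStream,
                new SpeechAudioFormatInfo(EncodingFormat.Pcm, 16000, 16, 1, 32000, 2, null));
            engine.RecognizeAsync(RecognizeMode.Multiple);
        }
    }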
Simulation Software & Results
The simulation software was designed to exercise the motion gesture capabilities of the Kinect sensor: the user moves a Tank with either hand and triggers further actions with gestures and voice commands. The motivation of this project is to give new soldiers an overview of an actual training field scenario through a demonstration in the simulation software. In-field training exercises are costly and time-consuming to set up; by developing a simulation it is expected that costs can be reduced and immediate feedback can be supplied to the trainees. The software provides new soldiers with an emulation of the battlefield, allowing them to understand issues surrounding command, manoeuvres, fields of view and firing lines. The simulation represents a strategic planning exercise and is not intended as a 3D game.

The Kinect sensor provides a solid foundation for the simulation system. Its RGB camera provides a view of the scene, while the 3D depth sensor lets the program understand what the scene looks like so that actions can be taken based on its content. In this system, the user is placed in an environment where all interaction is controlled via hand motion and voice commands.

Test results showing the accuracy of the software were recorded for six participants (A-F) in two test environments, measuring motion gesture accuracy, voice recognition accuracy and validation accuracy (all in %).

Conclusion & Future Work
The system works reasonably well. A large number of simulation tests were performed with several different candidates, and the system produced reliable results most of the time. Voice commands work with short sentences, and the results are adequate. The validation process has great potential to be improved. By employing the second generation of Kinect, the reliability and performance of the system could be improved substantially, and voice commands could be taken to the next level by incorporating longer sentences.

Supervised by Dr. Brett Wilkinson, Flinders University, Adelaide, Australia