1
INTERACTING WITH SIMULATION ENVIRONMENTS THROUGH THE KINECT
Fayez Alazmi
Supervisor: Dr. Brett Wilkinson
Flinders University
2
SPECIFIC USE
Update the conventional systems used to train new soldiers with Kinect-based systems.
Develop a thorough understanding of the training field via simulation before entering the field itself.
3
MULTIPURPOSE RANGE COMPLEX (MPRC)
133 targets.
The tank is set at different positions with hand motions and can be controlled by either hand.
Different gestures cause different actions.
The tank can fire at a set of targets when the person speaks, using the Kinect microphone array (a mapping sketch follows).
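A minimal sketch of the gesture/speech-to-action mapping described above, assuming hypothetical gesture labels and spoken commands (the actual gesture set and speech grammar are not listed on this slide):

```python
# Illustrative mapping from recognized gestures / spoken words to tank actions.
# The labels below are assumptions for the sketch, not the project's actual set.
TANK_ACTIONS = {
    ("gesture", "swipe_left"): "move tank to previous battle position",
    ("gesture", "swipe_right"): "move tank to next battle position",
    ("gesture", "raise_hand"): "aim at highlighted target",
    ("speech", "fire"): "fire at the selected target set",
}

def handle_input(kind, label):
    """Look up the simulation action for a recognized gesture or speech command."""
    return TANK_ACTIONS.get((kind, label), "ignored")

print(handle_input("speech", "fire"))        # -> fire at the selected target set
print(handle_input("gesture", "swipe_left")) # -> move tank to previous battle position
```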
4
20 BATTLE POSITIONS IN THE MPRC
5
80 INVENTORY TARGETS
6
PROJECT AIM
Develop hardware and software that control different objects in a simulation via motion gestures and speech recognition.
Implement a simulation environment in which to deploy the Kinect-based system.
Deploy the application to a real-world scenario such as battlefield training (MPRC).
7
WHAT IS THE KINECT
Developed by Microsoft.
Infrared projector and RGB camera.
Microphone array.
3D depth sensor.
Completely hands-free interaction.
8
BACKGROUND
Extensive research has been done on depth-based sensors, which have the potential to be used in mapping applications.
Hand-gesture recognition operates robustly in uncontrolled environments and is insensitive to hand variations and distortions.
See references [1, 2, 3, 4, 5].
9
TECHNOLOGY
Hardware: Microsoft Xbox 360 Kinect, monitor or projector display, standard PC.
Software: Microsoft Visual Studio 2012, Kinect SDK v1.7 for Windows, Windows 7.
10
DEPTH INFORMATION
An IR projector casts a hardwired (fixed) IR pattern onto the scene.
An IR camera observes the projected pattern.
Depth is inferred by comparing the observed pattern with the known hardwired pattern (see the sketch below).
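A minimal sketch of the triangulation behind this comparison: the shift (disparity) of a projected dot relative to the reference pattern is inversely proportional to depth. The focal length and baseline below are illustrative values, not the Kinect's calibration constants:

```python
def depth_from_disparity(disparity_px, focal_px=580.0, baseline_m=0.075):
    """Structured-light triangulation: depth falls off as the pixel shift
    between the observed and the reference IR pattern grows.
    focal_px and baseline_m are assumed example values, not calibrated ones."""
    if disparity_px <= 0:
        return float("inf")  # no measurable shift -> effectively at infinity
    return focal_px * baseline_m / disparity_px

# A dot shifted by 10 pixels relative to the reference pattern:
print(round(depth_from_disparity(10.0), 3), "metres")
```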
11
IDENTIFY A PERSON IN THE ENVIRONMENT
Based on the fact that a human is usually the largest moving object in the scene.
Using the Kinect depth data, the person's depth pixels are separated from the surrounding data.
20 joint positions are then assigned to the tracked skeleton (a simplified segmentation sketch follows).
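A minimal sketch of the "largest moving object" idea on raw depth frames, assuming a static background depth frame is available; the Kinect SDK performs this segmentation (and the 20-joint skeleton fit) internally, so this is purely illustrative:

```python
import numpy as np
from scipy import ndimage  # used only for connected-component labelling

def segment_person(depth_mm, background_mm, min_change_mm=100):
    """Very simplified person segmentation on a depth frame (2D array, millimetres):
    pixels noticeably closer than the static background are foreground, and the
    largest connected foreground blob is taken to be the person."""
    moving = (background_mm - depth_mm) > min_change_mm        # closer than background
    labels, count = ndimage.label(moving)                      # connected components
    if count == 0:
        return np.zeros_like(moving)                           # nobody in the scene
    sizes = ndimage.sum(moving, labels, range(1, count + 1))   # blob areas
    person_label = int(np.argmax(sizes)) + 1
    return labels == person_label                              # boolean person mask
```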
12
DIFFERENTIATE BETWEEN PERSONS
The system can differentiate between two or more persons and track up to six people.
It identifies and memorises skeletal features and uses them for tracking.
It responds to one specific person, selected by that person's skeletal features (see the sketch below).
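A minimal sketch of responding to one memorised person, assuming each tracked skeleton is a dictionary of joint positions in metres; the joint names, the two-length "signature", and the tolerance are assumptions for illustration, not the project's actual matching rule:

```python
import math

def bone_length(joints, a, b):
    """Euclidean distance between two named joints (x, y, z in metres)."""
    return math.dist(joints[a], joints[b])

def signature(joints):
    # Two bone lengths that stay roughly constant for one person across frames.
    return (bone_length(joints, "shoulder_left", "shoulder_right"),
            bone_length(joints, "hip_center", "head"))

def select_operator(tracked_people, stored_signature, tolerance=0.05):
    """Among up to six tracked skeletons, return the one whose signature is
    closest to the memorised operator, or None if nobody matches."""
    best, best_err = None, tolerance
    for person in tracked_people:            # each person: dict of joint -> (x, y, z)
        err = sum(abs(a - b) for a, b in zip(signature(person), stored_signature))
        if err < best_err:
            best, best_err = person, err
    return best
```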
13
SIMULATION EXAMPLE
The system identifies different hand gestures and takes different actions according to the gesture.
It also acts on a target when the user says the target's name (a simple gesture test is sketched below).
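A minimal rule-based sketch of telling gestures apart from skeleton joints; the joint names, thresholds, and gesture labels are assumptions matching the earlier mapping sketch, not the project's actual gesture definitions:

```python
# Illustrative gesture test on skeleton joints (x, y, z in metres).
def classify_gesture(joints):
    head_y = joints["head"][1]
    right_hand_x, right_hand_y, _ = joints["hand_right"]
    shoulder_x = joints["shoulder_right"][0]
    if right_hand_y > head_y:
        return "raise_hand"                 # hand raised above the head
    if right_hand_x - shoulder_x > 0.40:
        return "swipe_right"                # hand extended well to the side
    return "none"
```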
14
CONCLUSION
Implement a functional simulation environment and interact with it through motion gestures.
Develop an understanding of the Kinect SDK and middleware applications for natural interaction.
Conduct user evaluations.
15
FUTURE WORK
Investigate further possibilities for Kinect-based applications.
Apply this system to my field; the ultimate goal is to assist access to training.
Determine the stability of low-cost technology for such systems.
16
REFERENCES
[1] Thomas B. Moeslund, Adrian Hilton, and Volker Krüger. A Survey of Advances in Vision-based Human Motion Capture and Analysis. Computer Vision and Image Understanding, 104:90–126, 2006.
[2] B. Freedman, A. Shpunt, M. Machline, and Y. Arieli. Depth Mapping Using Projected Patterns. Patent Application WO 2008/120217 A2, October 2008.
[3] Mark Schneider and Charles Stevens. Development and Testing of a New Magnetic-tracking Device for Image Guidance. SPIE Medical Imaging, pages 65090I-1–65090I-11, 2007.
[4] Ali Erol, George Bebis, Mircea Nicolescu, Richard D. Boyle, and Xander Twombly. Vision-based Hand Pose Estimation: A Review. Computer Vision and Image Understanding, 108(1-2):52–73, 2007.
[5] Iason Oikonomidis, Nikolaos Kyriazis, and Antonis A. Argyros. Markerless and Efficient 26-DOF Hand Pose Recovery. ACCV, pages 744–757, 2010.