Project Guide: Mr. B. Ravinder, Assistant Professor (CSE)
Batch No.: CSB33
Team Members:
- D. Sai Goud - 08211A0569
- Satya Swarup Sahoo - 08211A0574
- G. Shivadeep - 08211A0576
- Rohit Reddy - 07211A0561
GESTURE BASED COMPUTING
- Abstract
- Kinect
- Working with Kinect
- iRobot Create
- Working with Create
- Integrating Kinect and Create
- Use Case Diagram
- Class Diagram
- Implementation Progress
- Sample Code
- Conclusion
Abstract
Technology has advanced rapidly in recent years and has made everyday life easier and more comfortable. In the field of electronics and computing, users increasingly favour devices that require minimal interaction. Gesture-based computing is one such technology: it reduces the user's physical involvement to a minimum by using sensors that detect the user and recognise the gestures they make. The Kinect, a 3D sensor, is well suited to this purpose: it captures the user's movements and makes them computable through its software development kit. Gesture-based computing also has an impact in robotics, where a user can control and operate a robot with gestures.
Kinect
Kinect is the product of Microsoft's Project Natal and was launched in North America on November 4, 2010. It is a well-equipped 3D sensing device, with an RGB camera, a depth sensor, and a multi-array microphone.
Working with Kinect
We use Microsoft Visual Studio 2010 and the Kinect SDK. There is no simulation environment for the Kinect, so development is done against the physical sensor. The SDK supports several languages, including VB.NET, C++, and C#.NET.
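The Kinect SDK exposes a stream of skeleton joint positions, which is what gesture recognition is built on. As a minimal sketch (the `Joint` struct, function name, and thresholds below are illustrative assumptions, not SDK types or our project's actual code), a coarse gesture classifier can compare the hand's position to the shoulder's:

```cpp
#include <string>

// Illustrative stand-in for a skeleton joint; the real Kinect SDK
// exposes joints with x/y/z coordinates in metres.
struct Joint { double x, y, z; };

// Classify a coarse gesture from the right hand's position relative
// to the right shoulder. The 0.25 m threshold is an assumed tuning value.
std::string classifyGesture(const Joint& hand, const Joint& shoulder) {
    const double t = 0.25;  // metres
    if (hand.y - shoulder.y > t) return "FORWARD";  // hand raised above shoulder
    if (hand.x - shoulder.x > t) return "RIGHT";    // hand out to the right
    if (shoulder.x - hand.x > t) return "LEFT";     // hand across the body
    return "NONE";
}
```

In practice the classifier would run on every skeleton frame delivered by the SDK and debounce its output over several frames.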
Working with Create
We use Microsoft Robotics Developer Studio (MRDS), which provides both a real-time and a simulation environment for working with the iRobot Create. It supports several languages, including VB.NET, C++, and C#.NET.
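The Create is a differential-drive robot, so each recognised gesture ultimately becomes a pair of wheel powers. As a sketch (the function name and the specific power values are illustrative assumptions, not MRDS constants), the mapping might look like:

```cpp
#include <string>
#include <utility>

// Map a recognised gesture to (left, right) wheel power for a
// differential-drive robot such as the iRobot Create, using
// normalised powers in [-1.0, 1.0].
std::pair<double, double> wheelPowerFor(const std::string& gesture) {
    if (gesture == "FORWARD") return {0.5, 0.5};   // both wheels forward
    if (gesture == "LEFT")    return {-0.3, 0.3};  // spin left in place
    if (gesture == "RIGHT")   return {0.3, -0.3};  // spin right in place
    return {0.0, 0.0};                             // unknown gesture: stop
}
```

In MRDS these values would be sent to the generic differential-drive service that fronts the Create.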
Integrating Kinect and Create
The Kinect scans the user's gestures, and the corresponding action to be performed by the iRobot is sent to it via Bluetooth.
Use Case Diagram
Actors: User, Kinect, Robot.
- User makes gestures.
- Kinect captures the gestures from the user.
- Kinect converts each gesture to a functionality.
- Kinect and Robot get connected through services.
- Robot performs the operation.
Class Diagram
- User: makeGestures()
- Kinect: identifyGestures(), compute(); attributes: skeleton, joints
- Robot: getConnection(), performAction(); attribute: _mainPort
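The class diagram can be rendered as a minimal code sketch. The method bodies below are stubs of our own invention (the real methods call into the Kinect SDK and MRDS services), shown only to make the responsibilities concrete:

```cpp
#include <string>
#include <vector>

struct Joint { double x, y, z; };  // illustrative stand-in for SDK joint data

// Kinect: holds the latest skeleton data and turns gestures into commands.
class Kinect {
public:
    // In the real system this would inspect the skeleton stream; stubbed here.
    std::string identifyGestures() const { return lastGesture; }
    // compute() maps a recognised gesture to a robot command string.
    std::string compute(const std::string& gesture) const {
        return gesture == "NONE" ? "STOP" : "MOVE_" + gesture;
    }
    std::vector<Joint> joints;        // attribute from the diagram
    std::string lastGesture = "NONE";
};

// Robot: connects (e.g. over Bluetooth) and performs commands.
class Robot {
public:
    bool getConnection() { connected = true; return connected; }
    void performAction(const std::string& cmd) { lastCommand = cmd; }
    std::string lastCommand;          // stands in for _mainPort messaging in MRDS
private:
    bool connected = false;
};
```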
Implementation
- Connect to the robot
- User makes gestures
- Kinect captures the gestures
- A functionality is provided for each gesture
- Actions are implemented as per the gesture
Implementation Progress
We have successfully assigned functionalities to several gestures, through which we can emulate various keyboard keys (such as BACKSPACE, the ARROW KEYS, and TAB). We have also succeeded in controlling a robot through our system, and we are now integrating gesture-based computing into it so that the robot can be controlled through gestures.
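The gesture-to-key mapping described above can be sketched as a simple lookup table. The gesture names and key assignments below are our illustrative assumptions; a real implementation would inject the chosen key through a platform API such as SendInput on Windows:

```cpp
#include <map>
#include <string>

// Look up the keyboard key assigned to a gesture; gestures without a
// mapping fall through to "NONE". Assignments here are illustrative.
std::string keyForGesture(const std::string& gesture) {
    static const std::map<std::string, std::string> keymap = {
        {"SWIPE_LEFT",  "LEFT_ARROW"},
        {"SWIPE_RIGHT", "RIGHT_ARROW"},
        {"SWIPE_UP",    "UP_ARROW"},
        {"SWIPE_DOWN",  "DOWN_ARROW"},
        {"PUSH",        "TAB"},
        {"WAVE",        "BACKSPACE"},
    };
    auto it = keymap.find(gesture);
    return it == keymap.end() ? "NONE" : it->second;
}
```

A table like this keeps the gesture recogniser decoupled from the key-injection code, so new gestures can be added without touching either side.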
Conclusion
A strength of our project is that it can be integrated with many user applications, reducing the user's physical involvement to a large extent. We are currently integrating it with the iRobot Create using Microsoft Robotics Developer Studio. The outcome of the project will give the user the power to control the robot with his or her gestures.
Thank You !!