Project Guide: Mr. B. Ravinder, Assistant Professor (CSE). Batch No.: CSB33. Team Members: D. Sai Goud (08211A0569), Satya Swarup Sahoo (08211A0574), G. Shivadeep, Rohit Reddy.


Project Guide: Mr. B. Ravinder, Assistant Professor (CSE)
Batch No.: CSB33
Team Members:
- D. Sai Goud (08211A0569)
- Satya Swarup Sahoo (08211A0574)
- G. Shivadeep (A0576)
- Rohit Reddy (A0561)

GESTURE BASED COMPUTING

Agenda:
- Abstract
- Kinect
- Working with Kinect
- iRobot Create
- Working with Create
- Integrating Kinect and Create
- Use Case Diagram
- Class Diagram
- Implementation
- Progress
- Sample Code
- Conclusion

Abstract

Technology has advanced rapidly in recent years and has made everyday life easier and more comfortable. In electronics and computing, users increasingly favour devices that demand as little direct interaction as possible. Gesture Based Computing is one such technology: it minimises the user's physical involvement by using sensors that can sense the user and detect the gestures they make. For this purpose we use Kinect, a 3D sensor that captures the user's movements and, through its software development kit, makes them available for computation. It also has applications in robotics, where a user can control and operate a robot with gestures.

Kinect

Kinect is the product of Microsoft's Project Natal and was launched in North America on November 4, 2010. It is a well-equipped 3D sensing device with an RGB camera, a depth sensor, and a multi-array microphone.

Working with Kinect

- Developed using Microsoft Visual Studio 2010 and the Kinect SDK.
- There is no simulation environment for the Kinect; development requires the physical sensor.
- The SDK supports several languages, including VB.NET, C++, and C#.NET.
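Although the Kinect SDK itself is used from .NET, the gesture-recognition logic it enables can be sketched in a language-neutral way. The following is a minimal, hypothetical example of classifying a horizontal swipe from a short window of tracked hand x-coordinates; the 0.4 m threshold and gesture names are illustrative assumptions, not values from the SDK.

```python
def detect_swipe(hand_x_positions, threshold=0.4):
    """Classify a horizontal swipe from a short window of hand
    x-coordinates (in metres, as skeleton joints are reported)."""
    if len(hand_x_positions) < 2:
        return None
    # Net displacement over the window decides the swipe direction.
    displacement = hand_x_positions[-1] - hand_x_positions[0]
    if displacement > threshold:
        return "SWIPE_RIGHT"
    if displacement < -threshold:
        return "SWIPE_LEFT"
    return None
```

In practice the window would be fed by successive skeleton frames from the sensor, with the threshold tuned to filter out incidental hand movement.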

Working with iRobot Create

- Developed using Microsoft Robotics Developer Studio.
- Provides both a real-time and a simulation environment for working with the iRobot Create.
- Supports several languages, including VB.NET, C++, and C#.NET.

Integrating Kinect and Create

The user's gestures are captured by the Kinect, and the corresponding action to be performed by the iRobot is sent to it via Bluetooth.
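The integration step can be pictured as a small dispatch table from recognised gestures to robot commands. The table below is a hypothetical sketch; the gesture and command names are illustrative assumptions, not identifiers from the Kinect SDK or the Create interface.

```python
# Hypothetical gesture-to-command table for the robot.
GESTURE_TO_COMMAND = {
    "SWIPE_LEFT": "TURN_LEFT",
    "SWIPE_RIGHT": "TURN_RIGHT",
    "RAISE_HAND": "STOP",
}

def dispatch(gesture, send):
    """Look up the robot command for a recognised gesture and hand it
    to `send` (e.g. a function that writes to the Bluetooth link)."""
    command = GESTURE_TO_COMMAND.get(gesture)
    if command is not None:
        send(command)
    return command
```

Keeping the mapping in a table means new gestures can be wired to robot behaviour without touching the recognition code.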

Use Case Diagram

[Use case diagram with actors User, Kinect, and Robot]
- User: make gestures
- Kinect: capture gestures from the user; convert the gesture to a functionality
- Robot: get connected through services; perform the operation

Class Diagram

- User: makeGestures()
- Kinect: skeleton; joints; identifyGestures(); compute(); performAction()
- Robot: _mainPort; getConnection(); performAction()
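The classes in the diagram can be sketched in code. This is a minimal, hypothetical rendering of the Kinect and Robot classes; the hand-above-head rule and the command names are illustrative assumptions, not part of the actual design.

```python
class Kinect:
    """Tracks the skeleton and turns joint data into a gesture name."""
    def __init__(self):
        self.skeleton = None   # last tracked skeleton frame
        self.joints = {}       # joint name -> (x, y, z) in metres

    def identify_gestures(self):
        # Illustrative rule: right hand above the head means RAISE_HAND.
        hand, head = self.joints.get("HandRight"), self.joints.get("Head")
        if hand and head and hand[1] > head[1]:
            return "RAISE_HAND"
        return None

    def compute(self, gesture):
        # Map a gesture name onto a robot command.
        return {"RAISE_HAND": "STOP"}.get(gesture)


class Robot:
    """Stand-in for the iRobot Create, reached through a service port."""
    def __init__(self):
        self._mainPort = None

    def getConnection(self, port):
        self._mainPort = port

    def performAction(self, command):
        return f"{self._mainPort}: {command}"
```

The camelCase names mirror the identifiers in the diagram rather than Python convention, to keep the correspondence visible.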

Implementation

1. A connection is established to the robot.
2. The user makes gestures.
3. The Kinect captures the gestures.
4. A functionality is assigned to each gesture.
5. The corresponding action is carried out for the gesture.
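The steps above can be sketched as a single control loop. This is a hedged, hardware-free illustration: `connect`, `classify`, and `act` stand in for the real connection, gesture-recognition, and robot-command code.

```python
def run_gesture_loop(frames, connect, classify, act):
    """End-to-end sketch of the steps above: connect once, then for
    each captured frame classify the gesture and carry out its action."""
    connect()                               # step 1: connect to the robot
    performed = []
    for joints in frames:                   # steps 2-3: gestures made and captured
        gesture = classify(joints)          # step 4: functionality for the gesture
        if gesture is not None:
            performed.append(act(gesture))  # step 5: action for the gesture
    return performed
```

In the real system the frame source would be the Kinect's skeleton stream and `act` would forward commands over the Bluetooth link.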

Progress

- We have successfully mapped functionalities to several gestures, through which various keyboard keys (such as BACKSPACE, the arrow keys, and TAB) can be triggered.
- We have successfully controlled a robot through our system, and we are now integrating gesture based computing into it so that the robot can be controlled through gestures.

Conclusion

A key strength of our project is that it can be integrated with many user applications, greatly reducing the physical involvement required of the user. We are currently integrating it with the iRobot Create using Microsoft Robotics Developer Studio. The outcome of the project will give the user the power to control the robot with his or her gestures.

Thank You !!