TELEKINESYS Group Members: Mir Murtaza SM Rasikh Mukarram Shiraz Sohail.


INTRODUCTION: REAL-WORLD PROBLEM. Projected screens are widely used in presentations and teaching at IBA. To control the computer, the presenter must stay close to the machine, which robs the interaction of its naturalness. Our proposed solution: control the mouse functions with hand gestures. IBA CED

SOLUTION: Use a camera to allow the same kind of user input that touch screens do, but at a lower price and on a larger screen.

OUR APPROACH: A camera-projector configuration controls the mouse. The camera performs hand and gesture detection using LED lights, and the detected gestures are mapped to mouse functions.

IMPLEMENTATION: The system design has three parts: input, interface, and output.

METHODOLOGY: Input includes: 1. Valid Region Detection, 2. Hand Detection Using LED Lights, 3. Gesture Detection. The interface communicates between the recognition part and the OS: input data is passed to the interface, and the interface performs the corresponding action.

Valid Region Detection: The valid region is the projected screen, located by exploiting its rectangular shape with the OpenCV function cvFindContours. Points are classified as falling in the valid or invalid region; the webcam must be aligned, and the user confirms the detected screen.

Hand Detection Using LED Lights: The LED lights locate the position of the hand; three LEDs are sufficient. Noise filtering can be done in two ways: 1. Set an intensity threshold to filter out all dimmer light. 2. Cover the camera lens with a black film negative.

Gesture Detection: A maximum of three light sources can be detected: 1. The light closest to the track point is the first light, F. 2. The second light, on the left-hand side of F, is L. 3. The third light, on the right-hand side of F, is R.

Gesture Detection by LED Positions: The forefinger's LED light is the track point F. When light L appears on the left-hand side of F, a left click is executed. When light R appears on the right-hand side of F, a right click is executed. If all three lights appear for a second, scrolling is executed. Configurations of lights are thus classified into valid and invalid gestures.

Software Requirements: OpenCV – computer vision library. C++ – used with OpenCV to analyze the hand shape and gestures. Bloodshed Dev-C++ – the compiler and IDE. Matlab – for machine learning.

Hardware Requirements: 1. Projector – installed in every classroom at IBA. 2. Webcam – easily available and affordable; a laptop's camera can also be used. 3. LED lights – three 1 V LED lights + 9 V battery + glove.

End Deliverables: A webcam of suitable resolution with its lens covered by a black film negative, filtering out noise and ambient classroom light. A hand glove with the LED system integrated: three 1 V LED lights connected in parallel with a 9 V battery. The battery lasts around 20 days if left switched on continuously. The system also includes a single on-off switch that controls all three LED lights simultaneously.

Related Work: Sit Chu Wah, Sin Kwok San, and Luk Tsan Kwong; Prof. Brian Kan-Wing Mak; Attila Licsár and Tamás Szirányi.

Future Scope: The completed project is a base for a gesture recognition system that works with the bare, uncovered hand and with more complex hand gestures; this forms our research and work plan for the future.

References: Sit Chu Wah, Sin Kwok San, and Luk Tsan Kwong; Prof. Brian Kan-Wing Mak; Attila Licsár and Tamás Szirányi; Antonis A. Argyros and Manolis I. A. Lourakis.