1
Gesture-Based Wheelchair Control for Physically-Challenged
2
ARJUN ASHOK V ARUNKUMAR M DANIEL C J NIVED KRISHNAN
THE TEAM ARJUN ASHOK V ARUNKUMAR M DANIEL C J NIVED KRISHNAN
3
CONTENTS Abstract Problem Definition Project Overview Implementation
Verification Future Work References
4
ABSTRACT We have developed a gesture-controlled wheelchair system for use by quadriplegics and other physically-disabled persons. Salient features: Effortless to use Customizable Economical Power-efficient Non-intrusive
5
PROBLEM DEFINITION Existing powered wheelchairs demand the exertion of physical force to control them, making them unusable for quadriplegics, a class of the physically-challenged. Movements such as pressing buttons or operating a joystick are impossible for quadriplegics, who lack fine motor control.
6
PROJECT OVERVIEW Quadriplegics often retain some imprecise motion of their fingers. A gesture-based interface to their environment, and in particular to their wheelchairs, is therefore the best option. We have developed a robust, real-time, vision-based hand gesture recognition engine reliable enough to steer a wheelchair.
7
PROJECT OVERVIEW
Hardware specification:
IR-sensitive USB webcam
Diffusion masks + mounting/enclosure
x64/x86 PC platform
Stripped USB keyboard circuit
PIC 16F877A microcontroller-based interface
DC motor steering mechanism with ULN2804 and relay-based H-bridge
Software specification:
MATLAB® from The MathWorks™
Win32/64-based operating system
PIC-based H-bridge control
9
IMPLEMENTATION Gesture Capture Module developed.
A regular web camera was modified into an IR-sensitive version. An IR-illuminated backlit surface for gesture capture was created: we used IR LEDs in an 8×8 matrix layout, an assembly of tracing-paper sheets as a diffuser for the IR radiation, and an aluminium-foil mask to define the active area.
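In a frame from this capture setup, fingertips pressed against the backlit surface appear as bright blobs. As an illustrative sketch (the original implementation used MATLAB; the threshold value, frame size, and function name here are assumptions), such a frame could be thresholded and the blob centroids extracted like this:

```python
import numpy as np

def fingertip_centroids(frame, threshold=200):
    """Segment bright fingertip blobs from an IR-backlit frame.

    frame: 2-D uint8 array (grayscale IR image).
    threshold: illustrative brightness cutoff, tuned per setup.
    Returns a list of (row, col) centroids, one per connected blob.
    """
    mask = frame >= threshold
    labels = np.zeros(frame.shape, dtype=int)
    current = 0
    # Label 4-connected blobs with a simple stack-based flood fill
    # (avoids a SciPy dependency for this sketch).
    for r in range(frame.shape[0]):
        for c in range(frame.shape[1]):
            if mask[r, c] and labels[r, c] == 0:
                current += 1
                stack = [(r, c)]
                while stack:
                    y, x = stack.pop()
                    if (0 <= y < frame.shape[0] and 0 <= x < frame.shape[1]
                            and mask[y, x] and labels[y, x] == 0):
                        labels[y, x] = current
                        stack += [(y + 1, x), (y - 1, x),
                                  (y, x + 1), (y, x - 1)]
    centroids = []
    for lbl in range(1, current + 1):
        ys, xs = np.nonzero(labels == lbl)
        centroids.append((ys.mean(), xs.mean()))
    return centroids
```

The centroids (or the thresholded mask itself) then feed the recognition stage described next.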
12
IMPLEMENTATION Gesture Recognition Module developed.
Successfully captured and processed images from the IR camera in real time.
Defined a protocol of five gestures, kept extremely simple so that they are easy for the physically-challenged to perform:
Move forward
Move backward
Turn left
Turn right
Stop/brake
13
IMPLEMENTATION Gesture Recognition Module developed.
Created a set of templates for these commands, which are correlated with the hand positions detected by the gesture-capture module. An initial software training phase lets the user customize every movement gesture to their specific needs.
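The matching step above can be sketched as a zero-mean normalized correlation between a captured frame and each stored (user-trained) template, picking the best-scoring gesture. This is an illustrative sketch, not the original MATLAB code; the gesture names and function signatures are assumptions:

```python
import numpy as np

GESTURES = ["forward", "backward", "left", "right", "stop"]

def normalized_correlation(a, b):
    """Zero-mean normalized cross-correlation of two equal-size images."""
    a = a.astype(float) - a.mean()
    b = b.astype(float) - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom else 0.0

def recognize(frame, templates):
    """Return the gesture whose template best matches the frame.

    templates: dict mapping gesture name -> 2-D array, produced
    during the per-user training phase described above.
    """
    scores = {g: normalized_correlation(frame, t)
              for g, t in templates.items()}
    return max(scores, key=scores.get)
```

A production version would also reject low-confidence matches (e.g. require the best score to exceed a threshold) before issuing a motion command.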
16
17
IMPLEMENTATION Interfacing Module developed.
A USB keyboard circuit was stripped to give access to the three indicator LEDs: Scroll Lock, Caps Lock, and Num Lock. The LEDs were controlled from keyboard-control libraries linked to MATLAB®, according to the gesture recognized. This gives a 3-bit code for each gesture. These codes are fed to a PIC16F877A microcontroller, which produces outputs to control an H-bridge circuit.
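The gesture-to-LED encoding can be sketched as a pure mapping; each gesture sets the three lock LEDs to a distinct 3-bit pattern. The codes follow the table on the next slide, but the bit-to-LED assignment and function names here are assumptions, and actually driving the LEDs would go through an OS keyboard API rather than this mapping alone:

```python
# 3-bit codes from the gesture table:
# 000 Brake, 001 Forward, 010 Reverse, 011 Turn Left, 100 Turn Right
GESTURE_CODES = {
    "brake": 0b000,
    "forward": 0b001,
    "reverse": 0b010,
    "turn_left": 0b011,
    "turn_right": 0b100,
}

def led_states(gesture):
    """Map a gesture to (scroll_lock, caps_lock, num_lock) LED states.

    The ordering scroll/caps/num = MSB/middle/LSB is an illustrative
    assumption; the original hardware simply read the three indicator
    lines off the stripped USB keyboard circuit.
    """
    code = GESTURE_CODES[gesture]
    return bool(code & 0b100), bool(code & 0b010), bool(code & 0b001)
```

The microcontroller then sees the same three lines as a parallel 3-bit input and decodes them back into a motion command.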
18
IMPLEMENTATION
Gesture encoding (3-bit signal → command to H-bridge):
000  Brake
001  Forward
010  Reverse
011  Turn Left
100  Turn Right
101, 110, 111  (unused)
19
IMPLEMENTATION Motor Control Module developed.
Two DC motors are controlled based on inputs from the Interfacing Module. These inputs drive an H-bridge circuit through a ULN2804 driver. The H-bridge, built from 8 relays, drives the two motors attached to the rear wheels of the wheelchair.
20
IMPLEMENTATION Both rear wheels turn in the same direction for forward/reverse motion of the wheelchair. To turn left, the left wheel turns backward while the right wheel turns forward; the reverse holds for right turns. To brake, the wheels are driven opposite to their current spin for a quarter of a second, after which the motors are disconnected from the supply.
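The wheel-direction logic above can be sketched as a function from command to per-wheel directions, with braking modeled as a brief reverse pulse followed by disconnection. The quarter-second pulse comes from the slide; replacing relay/driver calls with returned values is an illustrative simplification:

```python
# +1 = wheel turns forward, -1 = backward, 0 = disconnected.
WHEEL_DIRECTIONS = {
    "forward": (+1, +1),     # both wheels forward
    "reverse": (-1, -1),     # both wheels backward
    "turn_left": (-1, +1),   # left wheel back, right wheel forward
    "turn_right": (+1, -1),  # mirror of turn_left
}

def wheel_command(command, current=(0, 0)):
    """Return (left, right) wheel directions for a steering command.

    Braking reverses the current spin for ~0.25 s and then disconnects
    the motors; this sketch returns the reverse pulse and leaves the
    timing and the final (0, 0) disconnect to the caller.
    """
    if command == "brake":
        left, right = current
        return (-left, -right)  # quarter-second reverse pulse
    return WHEEL_DIRECTIONS[command]
```

In the real system this decision is made inside the PIC firmware, which switches the eight H-bridge relays accordingly.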
23
VERIFICATION
Gesture detection verification:
Captured gestures were correlated in real time with the stored templates. Satisfactory results were achieved.
DC motor motion test:
All eight 3-bit combinations were fed externally to the microcontroller to test the motion of the DC motors. The wheels connected to the DC motors turned as expected.
24
VERIFICATION System Test
Wheel motion was tested against the input generated by the gesture-capture module. The 3-bit input, fed by the PC via the USB cable, is decoded by the microcontroller. Each gesture generates a distinct 3-bit signal; no spurious signals were observed.
25
FUTURE PROSPECTS Porting the whole system to an embedded system based on the Intel® Atom™ processor. Fine-tuning the system for power-efficiency and compactness.
26
REFERENCES
Qing Chen, Real-Time Vision-Based Hand Tracking and Gesture Recognition.
William K. Pratt, Digital Image Processing.
Anil K. Jain, Fundamentals of Digital Image Processing.
The MathWorks, Image Processing Toolbox User's Guide (for use with MATLAB).
27
THANK YOU