1
Giri K.R [4jn08ec016], Harish Kenchangowdar [4jn10ec401], Sandesh S [4jn08ec043], Mahabusaheb P [4jn09ec040]
2
Introduction “Speech” and “gestures” are the expressions most commonly used in communication between human beings, and learning to use them begins in the first years of life. In human communication, speech and gestures are fully coordinated. Machine gesture and sign-language recognition is the task of recognizing gestures and sign language, in this case using instrumented gloves.
3
A number of hardware techniques are used for gathering information about body positioning; these are typically either image-based (using cameras, moving lights, etc.) or device-based (using instrumented gloves, position trackers, etc.), although hybrid approaches are beginning to appear. We implement this project using an instrumented glove and an Arduino.
4
However, acquiring the data is only the first step. The next step, recognizing the sign or gesture once it has been captured, is much more challenging, especially in a continuous stream of readings. This recognition step is the current focus of our project. We are implementing it to help people who cannot speak communicate with others.
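One simple way to handle the continuous-stream problem is sketched below. It is a minimal illustration, not the project's actual method, and it assumes each raw glove reading has already been reduced to an integer hand-shape code: a sign is reported only after the same code has been observed for several consecutive samples, so transient shapes made while moving between signs are ignored.

```cpp
#include <iostream>

// Hypothetical segmenter for a continuous stream of hand-shape codes:
// a sign is confirmed only after the same code is seen holdCount times
// in a row.
class GestureSegmenter {
public:
  explicit GestureSegmenter(int holdCount) : holdCount_(holdCount) {}

  // Feed one sample; returns true when a new sign has just been confirmed.
  bool feed(int code, int &confirmed) {
    if (code == lastCode_) {
      ++run_;
    } else {
      lastCode_ = code;
      run_ = 1;
    }
    if (run_ == holdCount_ && code != lastConfirmed_) {
      lastConfirmed_ = code;
      confirmed = code;
      return true;
    }
    return false;
  }

private:
  int holdCount_;
  int lastCode_ = -1;       // last code seen in the stream
  int run_ = 0;             // how many times in a row it has been seen
  int lastConfirmed_ = -1;  // last sign actually reported
};

int main() {
  GestureSegmenter seg(3);  // require 3 identical samples in a row
  int stream[] = {5, 5, 7, 7, 7, 7, 2, 2, 2};
  for (int code : stream) {
    int sign;
    if (seg.feed(code, sign)) {
      std::cout << "recognized sign code " << sign << "\n";  // prints 7, then 2
    }
  }
}
```

In this example the code 5 is never reported because it is not held long enough, which is exactly the filtering a continuous stream needs.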
5
Functional Block Diagram
6
The system aims at bridging the communication gap between the deaf community and other people. When fully operational, it will help minimize communication gaps, ease collaboration, and enable the sharing of ideas and experiences.
7
Implementation using the glove
8
Data from an instrumented data glove is analyzed to recognize a set of signs and gestures, and a system is developed to convert the recognized signs into text. The outputs of the flex sensors are connected to the analog inputs of the Arduino board. The ATmega8 converts these analog inputs into digital values, processes them, and the result is displayed on the monitor / LCD display.
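The sketch below is a minimal illustration of this flow. The pin assignments (five flex sensors on A0–A4), the single bend threshold, and the gesture table are assumptions made for the example; the project's actual calibration values and sign mapping will differ.

```cpp
// Minimal Arduino sketch: read five flex sensors and map a simple
// bend pattern to a text label printed over serial.

const int FLEX_PINS[5] = {A0, A1, A2, A3, A4};  // one sensor per finger (assumed wiring)
const int BEND_THRESHOLD = 512;                 // ADC count treated as "bent" (illustrative)

void setup() {
  Serial.begin(9600);                           // output to the serial monitor
}

void loop() {
  // Pack the bent/straight state of each finger into a 5-bit pattern.
  int pattern = 0;
  for (int i = 0; i < 5; i++) {
    int reading = analogRead(FLEX_PINS[i]);     // 10-bit ADC value, 0-1023
    if (reading > BEND_THRESHOLD) {
      pattern |= (1 << i);
    }
  }

  // Look up the pattern in a small gesture table (hypothetical entries).
  switch (pattern) {
    case 0b11111: Serial.println("HELLO"); break;
    case 0b00001: Serial.println("YES");   break;
    case 0b00011: Serial.println("NO");    break;
    default: break;                             // unrecognized pattern: print nothing
  }

  delay(300);                                   // simple pause between readings
}
```

Packing the finger states into a 5-bit value keeps the lookup table small; replacing Serial.println with calls to an LCD library would send the same text to the LCD display instead of the monitor.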
9
The following gestures will be displayed in our project.
10
If time permits… gestures for the letters of the alphabet will also be implemented.
11
Future applications Using this concept, people who cannot speak could communicate efficiently through Android cell phones. The glove could also be used to control home appliances.
12
Bibliography Arduino blog. “Recognition of arm gestures using multiple orientation sensors: gesture classification,” IEEE Conference on Intelligent Transportation Systems, 2004. Arduino basics blog.
13
Thank You