1
Smart Wheelchairs
Professor Wyatt Newman
Friday, 4/8/2011
2
Outline
What/Why Smart Wheelchairs?
Incremental modules:
– Reflexive collision avoidance
– Localization, trajectory generation, steering, and smart buildings
– Speech-driven wheelchair control
Natural language interfaces
3
Architecture (block diagram)
A layered pipeline: natural language/speech processing, localization/motion control (or a joystick), and reflexes/local mapping, connected to the wheelchair by command and sensor signals.
4
“Otto” instrumented wheelchair
Sensors:
– Kinect
– Hokuyo
– “Neato”
– Ultrasound
5
Sensing the World
All mobile vehicles should avoid collisions.
– “Ranger” sensors actively emit energy to detect obstacles.
– Cameras passively absorb light; machine-vision techniques can then estimate obstacle positions.
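The reflexive collision-avoidance idea can be sketched as a filter between the commanded speed and the drive motors. This is a minimal illustration, not the deck's actual implementation; the function name, thresholds, and input format (a flat list of range readings in meters) are all assumptions.

```python
def reflexive_filter(cmd_speed, ranges, stop_dist=0.4, slow_dist=1.0):
    """Scale a forward speed command based on the nearest obstacle.

    ranges: distances (m) reported by a ranger sensor.
    stop_dist / slow_dist: illustrative safety thresholds (m).
    """
    nearest = min(ranges)          # closest return from the ranger
    if nearest < stop_dist:        # obstacle too close: refuse to move forward
        return 0.0
    if nearest < slow_dist:        # obstacle nearby: slow down proportionally
        return cmd_speed * (nearest - stop_dist) / (slow_dist - stop_dist)
    return cmd_speed               # clear path: pass the command through

# Joystick asks for 0.8 m/s with an obstacle 0.7 m ahead
safe = reflexive_filter(0.8, [2.5, 0.7, 3.1])
```

Because the filter sits between the user (joystick or speech) and the wheel controllers, any higher-level interface inherits the same safety behavior.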
6
Rangers
Simple rangers:
– Can be sonar or infrared.
– Provide limited information because of the wide “cone” the sensor emits.
7
Laser Scanners
Lidars (LIght Detection And Ranging):
– Much better information.
– Many radial points of data.
Velodyne:
– Three-dimensional lidar.
– Very expensive.
8
Laser Scanners
Neato sensor:
– Low-cost sensor
– Range values at 1° resolution
– Not yet available as a separate unit
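A lidar like the Neato's reports one range per angular step. To use those "radial points of data" for mapping or obstacle avoidance, they are typically converted to Cartesian points in the sensor frame. A minimal sketch, assuming a 1° angular step and `inf`/non-positive values marking invalid returns:

```python
import math

def scan_to_points(ranges, angle_min=0.0, angle_step=math.radians(1.0)):
    """Convert a radial lidar scan to (x, y) points in the sensor frame."""
    points = []
    for i, r in enumerate(ranges):
        if r <= 0.0 or math.isinf(r):      # skip invalid returns
            continue
        theta = angle_min + i * angle_step  # bearing of this beam
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

pts = scan_to_points([1.0, 1.0, float('inf'), 2.0])
```

The resulting point list is what the local-mapping and reflex layers in the architecture slide would consume.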
9
Cameras
Monocular cameras cannot return depth information.
Stereo cameras do return depth information:
– This requires two sensors and adds computational and calibration overhead.
Hybrid sensor, the Swiss Ranger:
– Combines infrared time-of-flight measurements with a monocular camera to produce a 3-D map.
Kinect sensor:
– Low-cost, mass-produced camera for computer gaming
– Uses structured light to infer 3-D structure
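Once a depth camera like the Kinect has inferred a depth value for a pixel, recovering the 3-D point is a pinhole-model back-projection. A sketch, with illustrative (roughly Kinect-like, not calibrated) intrinsics:

```python
def depth_to_point(u, v, z, fx=580.0, fy=580.0, cx=320.0, cy=240.0):
    """Back-project pixel (u, v) with depth z (meters) to a 3-D point
    in the camera frame.

    fx, fy: focal lengths in pixels; cx, cy: principal point.
    These intrinsics are assumptions for illustration only.
    """
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return (x, y, z)

# The image center at 2 m depth lands on the optical axis
p = depth_to_point(320, 240, 2.0)
```

Applying this to every pixel of a depth image yields the 3-D point cloud that makes the Kinect useful for obstacle detection.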
10
Autonomous Mode
Localization:
– Relative frame
– Global frame
Navigation:
– Goal planning
– Path planning
– Path following/steering
11
Localization
Local-frame sensors:
– Odometry
– Gyros
– Accelerometers
Fused with a Kalman filter.
Drifts, and is unreliable for long-term position estimation.
12
Localization
Global frame:
– SLAM (Simultaneous Localization and Mapping)
– AMCL (Adaptive Monte Carlo Localization)
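Monte Carlo localization (the "MCL" in AMCL) maintains a cloud of candidate poses and repeatedly moves, weights, and resamples them. A deliberately tiny 1-D sketch of one cycle, assuming a map that predicts the range reading `range_fn(x)` from any position `x` (all names and noise levels are illustrative, and this omits AMCL's adaptive particle-count machinery):

```python
import math
import random

def mcl_step(particles, move, measured_range, range_fn, noise=0.1):
    """One Monte Carlo localization cycle over 1-D particle positions."""
    # Motion update: shift each particle by the odometry, plus motion noise
    moved = [p + move + random.gauss(0.0, noise) for p in particles]
    # Measurement update: weight by agreement with the sensed range
    weights = [math.exp(-((range_fn(p) - measured_range) ** 2)
                        / (2 * noise ** 2))
               for p in moved]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # Resample: draw a new cloud in proportion to the weights
    return random.choices(moved, weights=weights, k=len(moved))

# Wall at x = 10: a sensed range of 8 m implies the robot is near x = 2
particles = [random.uniform(0.0, 10.0) for _ in range(500)]
particles = mcl_step(particles, 0.0, 8.0, lambda x: 10.0 - x)
```

After a few cycles the cloud collapses onto poses consistent with the map, which is how a globally lost robot recovers its position.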
13
Navigation
Rviz (the robot’s perception) [video]
14
Smart Building Coordination & Cooperation
– Smart devices work together to improve quality of life for users
– Multi-robot path planning and congestion control
– Robots invoke services within buildings
[video]
15
Vocal Joystick
A hands-free control system for a wheelchair restores independence:
– Quadriplegia, ALS, MS, cognitive disorders, stroke
Assistive technology suffers a high level of abandonment:
– Comfort
– Difficult interfaces
– Doesn’t properly fit the problem
– Hard to make small adjustments
16
Alternative Wheelchair Control
Voiced:
– Path selection vs. goal selection (“go to”)
– “Natural” language commands (left, right)
Non-voiced:
– Humming controller
Mouth-controlled:
– “Sip and puff”
– Tongue
17
Alternative Wheelchair Control
– Head joystick
– Eye movement (“gaze”)
– Chin control
– EMG
18
Why Not Voice?
Voice is the most natural way to interface with a wheelchair. Why have voice-activated wheelchairs not reached the market?
– Recognition problems
– Oversimplified commands
– Precision control is difficult without collision avoidance
– Difficult HMI
– Hard to make small adjustments
19
Speech-Driven Wheelchair Control
A naturalistic “vocal joystick” for a wheelchair (or any other mobile vehicle).
Prosodic features are extracted from the user’s spoken command:
– Pitch, stress, and intensity
– Modeled and learned through training simulations
Uses a small corpus:
– Users won’t have to manage many commands.
– Added prosodic features could provide a more natural interface and solve the small-velocity-adjustment problem described earlier.
[video]
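Two of the prosodic features named above, intensity and pitch, can be estimated per audio frame with very little machinery: intensity as RMS energy and pitch as the strongest autocorrelation lag in the speech range. A toy sketch only; real prosody extraction (and any treatment of stress) would be considerably more careful.

```python
import math

def frame_features(samples, rate=16000, fmin=80.0, fmax=400.0):
    """Estimate (intensity, pitch) for one audio frame.

    samples: raw audio samples; rate: sample rate in Hz.
    fmin/fmax: assumed bounds of the human pitch range to search.
    """
    n = len(samples)
    rms = math.sqrt(sum(s * s for s in samples) / n)   # intensity
    best_lag, best_corr = 0, 0.0
    # Search lags corresponding to pitches between fmin and fmax
    for lag in range(int(rate / fmax), int(rate / fmin) + 1):
        corr = sum(samples[i] * samples[i - lag] for i in range(lag, n))
        if corr > best_corr:
            best_lag, best_corr = lag, corr
    pitch = rate / best_lag if best_lag else 0.0
    return rms, pitch

# A pure 200 Hz tone should come back with pitch near 200 Hz
tone = [math.sin(2 * math.pi * 200 * t / 16000) for t in range(1600)]
rms, pitch = frame_features(tone)
```

Mapping the frame-by-frame trajectory of such features onto velocity commands is the "vocal joystick" idea: a louder or higher utterance can modulate speed continuously instead of through discrete commands.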
20
A Linguistic Interface
Longer-term research in natural human interfaces.
There are three ways to think and speak about space in order to travel through it.
21
(1) MOTION driving, (2) voyage DRIVING, and (3) goal-driven speech control of motion: (1) → (2) → (3).
We control each other’s movements, when it is relevant, by (1) motor commands, (2) indications of paths, and (3) volitive expressions of goals. So:
– Speaking to a taxi driver, (3) mentioning a goal is normally enough to achieve proper transportation.
– Speaking to a private driver as his navigator, we would instead give (2) indications of the trajectory by referring to perceived landmarks.
– Speaking to a blindfolded person pushing your wheelchair, we would finally use only (1) commands, much as when driving with a joystick in a video game.
22
Interface Architecture (block diagram)
– Local ontology, including sites and known objects
– Speech recognition & production
– Visual display
– Sensor signal
– Parsing & interpretation
– Motor action
– Obstacle avoidance
23
Future Work
Wheelchair as personal assistant:
– Safety monitoring
– Health monitoring
– Assistive functions
Wheelchair users’ focus-group input
User trials
Add-on modules:
– Automated seat-pressure redistribution
– Medication reminders/monitoring
– Blood-pressure and weight monitoring
– Distress sensing/response
24
Summary/Q&A
– Reflexive collision avoidance: a near-term product?
– Localization, trajectory generation, and steering
– Verbal joystick with prosody
– A priori maps vs. teaching/map-making; smart buildings/smart products
– Natural language processing and human interfaces: longer term