
1 Shared User-Computer Control of a Robotic Wheelchair System
Holly Yanco, MIT AI Lab
Thesis Supervisor: Rod Brooks
Committee Members: Eric Grimson, Rosalind Picard

2 Problem Statement
Some people are unable to drive standard powered wheelchairs and must rely upon caregivers
This population is estimated to be at least 15,000 people in the United States
A “conservative estimate indicates that over 2 million people with severe special needs within the EC could benefit from an individually configurable intelligent wheelchair” [Borgolte 98]

3 Potential User Groups
Initial onset of Guillain-Barré syndrome
Multiple sclerosis
Cerebral palsy
Spinal cord injury
Brain injury

4 Research Goals
Assist user with navigation in indoor and outdoor environments
Immediately navigate novel environments safely
Ensure the usability of the system by including an interface that can be controlled by many different access devices

5 Wheelesley

6 Research Contributions
First indoor/outdoor robotic wheelchair system
Indoor navigation: 71% less effort
Outdoor navigation: 74% less effort
Indoor/outdoor mode detector: 1.7% error rate
Customizable user interface demonstrated with eye tracking and single switch scanning

7 Standard Powered Wheelchair vs. Wheelesley

8 Related Work: Travel Restrictions
Magnetic lane [Wakaumi et al 92]
Maps [Radhakrishnan and Nourbakhsh 99] [Wang 97] [Madarasz 91]
Trained paths [Yoder et al 96] [Stanton et al 91]

9 Related Work: Outdoor Navigation
TAO Project [Gomi and Griffith 98]: tested outdoors with 3 ft high snow walls on either side of the sidewalk
Intelligent Wheelchair Project [Gribble et al 98]: plans to include outdoor navigation
[Radhakrishnan and Nourbakhsh 99]: plan to develop outdoor navigation

10 Related Work: Interfaces
OMNI [Buhler et al 97]: user interface for joystick, customized for row/column scanning with switch
VAHM [Bourhis and Pino 96]: interface for single switch scanning
Joystick only: [Yoder et al 96] [Tahboub and Asada 99] [Simpson et al 99]
Joystick with additional buttons or switches: [Connell and Viola 90] [Miller and Slack 95]
Voice control: [Stanton et al 91] [Amori 92] [Simpson and Levine 97]
Ultrasonic head control: [Jaffe 81] [Ford and Sheredos 95]
Face tracking: [Adachi et al 98] [Bergasa et al 99]

11 Navigation
Typical planning-reaction architecture vs. the Wheelesley architecture

12 General Navigation
User provides high level control
–Straight/left/right at path choices
Wheelchair provides low level control
–Path following
–Obstacle avoidance
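A minimal sketch of this shared-control split, for readers who want a concrete picture: the user issues a high-level direction and the chair's low-level layer overrides it only when safety requires. The sensor regions, the safe-distance value, and all names here are illustrative assumptions, not the Wheelesley implementation.

```python
# Shared-control arbitration sketch: the user picks the direction, the chair
# handles low-level obstacle avoidance.  Names and thresholds are placeholders.

FORWARD, LEFT, RIGHT, STOP = "forward", "left", "right", "stop"

def shared_control_step(user_command, sonar_ranges_m, safe_distance_m=0.5):
    """Combine the user's high-level command with low-level safety.

    user_command  : one of FORWARD, LEFT, RIGHT, STOP (from the access device)
    sonar_ranges_m: dict of region name -> nearest range in meters
    """
    # Low-level obstacle avoidance takes precedence over the user's command.
    if user_command == FORWARD and sonar_ranges_m.get("front", float("inf")) < safe_distance_m:
        # Veer toward the side with more free space instead of stopping dead.
        if sonar_ranges_m.get("left", 0.0) > sonar_ranges_m.get("right", 0.0):
            return LEFT
        return RIGHT
    return user_command
```

In use, this function would run once per control cycle, so the user only chooses turns at path choices and never has to steer around individual obstacles.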

13 Indoor Navigation
Sonar and infrared sensors
Sensor clustering
Robotic assistance
–Hallway following
–Obstacle avoidance
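A hedged sketch of hallway following from clustered range readings. The grouping of the physical sonar and infrared sensors into left and right clusters is assumed rather than reproduced from the thesis, as is the proportional gain.

```python
# Hallway centering sketch: compare the nearest readings from the left and
# right sensor clusters and steer toward the side with more room.

def hallway_following(left_cluster_m, right_cluster_m, gain=0.5):
    """Return a turn rate that keeps the chair centered between two walls.

    left_cluster_m / right_cluster_m : range readings (meters) from the
                                       left and right sensor clusters.
    Positive output steers right, negative steers left.
    """
    left = min(left_cluster_m)    # nearest obstacle or wall on the left
    right = min(right_cluster_m)  # nearest obstacle or wall on the right
    # If the left wall is closer, the result is positive (steer right,
    # away from it), and vice versa.
    return gain * (right - left)
```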

14 Indoor User Tests
14 able-bodied subjects
–7 men, 7 women
–age 18 to 43
2 robotic trials, 2 manual trials
71% improvement in user effort
25% improvement in time to traverse course

15 Indoor User Tests: Results

16 Outdoor Navigation
Vision system
–STH-V1 Stereo Head from Videre Designs
Robotic assistance
–Sidewalk following
–Obstacle avoidance

17 Local Path Detection
Image → median filtering → edges from intensity gradient → neighbor elimination → remove far points → lines calculated using absolute deviation
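A sketch of this pipeline under stated assumptions: the thresholds, the horizon cutoff, and the use of numpy/scipy are mine, the neighbor-elimination step is only noted in a comment, and the robust line fit minimizes absolute deviation as the slide states but via a generic optimizer rather than the thesis's method.

```python
# Local path detection sketch: median filter -> intensity-gradient edges ->
# remove far points -> least-absolute-deviation line fit.

import numpy as np
from scipy.ndimage import median_filter
from scipy.optimize import minimize

def detect_path_edges(gray_image, edge_threshold=30.0, horizon_row=120):
    """Return (row, col) coordinates of candidate sidewalk-edge pixels."""
    smoothed = median_filter(gray_image.astype(float), size=3)   # median filtering
    gy, gx = np.gradient(smoothed)                               # intensity gradient
    magnitude = np.hypot(gx, gy)
    rows, cols = np.nonzero(magnitude > edge_threshold)          # edge pixels
    # Remove far points: keep only pixels below an assumed horizon row.
    # (The neighbor-elimination step from the slide is omitted here.)
    keep = rows > horizon_row
    return rows[keep], cols[keep]

def fit_line_lad(rows, cols):
    """Fit col = a*row + b by minimizing the sum of absolute deviations."""
    def cost(params):
        a, b = params
        return np.sum(np.abs(cols - (a * rows + b)))
    # Start from the ordinary least-squares fit, then refine robustly.
    a0, b0 = np.polyfit(rows, cols, 1)
    result = minimize(cost, x0=[a0, b0], method="Nelder-Mead")
    return result.x  # (slope, intercept)
```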

18 Sidewalk Following
Select the left or right line based upon which has more edge points on the line
Use the edge to steer
If neither edge is a good candidate, move forward slowly to regain lines present in an earlier frame
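The rule above can be sketched on top of the line-fit code in the previous block. The point threshold, creep speed, steering gain, target image offsets, and assumed 320x240 image size are all placeholders, not values from the thesis.

```python
# Sidewalk-following sketch: steer from whichever edge has more support,
# otherwise creep forward until a good edge reappears.

def sidewalk_following(left_line, right_line, left_points, right_points,
                       min_points=50, image_width=320, gain=0.005):
    """Return (speed, turn_rate) from the two candidate sidewalk edges.

    left_line / right_line     : (slope, intercept) fits from fit_line_lad()
    left_points / right_points : number of edge points supporting each fit
    """
    if max(left_points, right_points) < min_points:
        # Neither edge is a good candidate: move forward slowly and try to
        # regain the lines that were present in an earlier frame.
        return 0.1, 0.0
    use_left = left_points >= right_points
    slope, intercept = left_line if use_left else right_line
    # Hold the chosen edge at a fixed column near its side of the image.
    target_col = image_width * (0.25 if use_left else 0.75)
    bottom_col = slope * 239 + intercept   # edge position at the image bottom
    # If the edge drifts toward the image center, the chair has moved toward
    # it, so steer away (positive turn_rate = steer right).
    return 0.5, gain * (bottom_col - target_col)
```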

19 Obstacle Detection

20 Obstacle Avoidance
Takes precedence over sidewalk following
Slows if obstacle detected in far center region (~5 to 10 feet away)
Stops if obstacle detected in close center region (~2 to 5 feet away)
(Figure: camera image divided into close center and far center regions)
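A minimal sketch of this zone-based rule, assuming the obstacle distances come from the stereo system's center regions; the exact depth source, the speed fractions, and the function names are assumptions.

```python
# Obstacle avoidance sketch: stop for obstacles in the close center region,
# slow for obstacles in the far center region, otherwise defer to sidewalk
# following.  Distances are in feet to match the slide.

def obstacle_speed_limit(center_depths_ft):
    """Map the nearest obstacle in the center regions to a max speed fraction."""
    nearest = min(center_depths_ft) if center_depths_ft else float("inf")
    if nearest < 5.0:     # close center region (~2 to 5 ft): stop
        return 0.0
    if nearest < 10.0:    # far center region (~5 to 10 ft): slow down
        return 0.3
    return 1.0            # nothing ahead: no limit

def combine_with_sidewalk_following(sidewalk_speed, sidewalk_turn, center_depths_ft):
    """Obstacle avoidance takes precedence over sidewalk following."""
    limit = obstacle_speed_limit(center_depths_ft)
    speed = min(sidewalk_speed, limit)
    turn = sidewalk_turn if limit > 0.0 else 0.0   # hold heading while stopped
    return speed, turn
```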

21 Outdoor User Tests
7 able-bodied subjects
–4 women, 3 men
–age 24 to 31
2 robotic trials, 2 manual trials
74% improvement in user effort
20% improvement in time to traverse course
(Course figure labels: grass, road)

22 Outdoor User Tests: Results

23 Indoor/Outdoor Detector
Uses multiple sensors to determine if chair is indoors or outdoors
–Temperature
–Sonar
–Light: UV filter
–Light: IR filter
–Light: no filter

24 Mode Detection
C4.5 used to learn a decision tree
Data set consists of 547 indoor data vectors and 647 outdoor data vectors
The learned decision tree has a 1.7% error rate
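A hedged sketch of how such a detector could be trained. The thesis used C4.5; this sketch substitutes scikit-learn's CART-style DecisionTreeClassifier, and the feature order, depth limit, and cross-validation setup are assumptions.

```python
# Indoor/outdoor mode detector sketch: learn a decision tree over the five
# sensor features and classify live readings as indoors (0) or outdoors (1).

import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

# Assumed feature order per vector:
# [temperature, sonar, light_uv_filter, light_ir_filter, light_no_filter]

def train_mode_detector(X, y):
    """X: (n_samples, 5) sensor vectors; y: 0 = indoor, 1 = outdoor."""
    tree = DecisionTreeClassifier(max_depth=5)      # depth limit is an assumption
    scores = cross_val_score(tree, X, y, cv=10)     # rough error-rate estimate
    tree.fit(X, y)
    print(f"estimated error rate: {1.0 - scores.mean():.3f}")
    return tree

def current_mode(tree, sensor_vector):
    """Classify one live sensor reading."""
    return int(tree.predict(np.asarray(sensor_vector).reshape(1, -1))[0])
```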

25 User Interface

26 Access Methods
Means for controlling a powered wheelchair, usually selected by the wheelchair provider to meet the user’s needs and abilities
–Joystick
–Joystick with plate
–Single switch
–Multiple switch arrays
–Sip and puff
–Chin joystick
–Mouth plate
–Eye tracking

27 Access Method: EagleEyes
Eye tracking system developed by Jim Gips at Boston College
Measures EOG (electro-oculographic potential) using electrodes
Uses the measurements to control the mouse

28 Access Method: Single Switch Scanning
Interface scans through 4 arrows:
–Forward
–Right
–Left
–Back
User hits switch when the desired command is highlighted
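A minimal sketch of the scanning loop described above. The dwell time, polling rate, and the switch/highlight callbacks are placeholders, not the Wheelesley interface code.

```python
# Single-switch scanning sketch: highlight each arrow in turn and return the
# command that is highlighted when the user hits the switch.

import time

COMMANDS = ["forward", "right", "left", "back"]

def scan_for_command(switch_pressed, highlight, dwell_seconds=1.0):
    """Cycle through the four arrows until the switch is pressed.

    switch_pressed : callable returning True while the switch is down
    highlight      : callable that visually highlights one command
    """
    while True:
        for command in COMMANDS:
            highlight(command)
            deadline = time.monotonic() + dwell_seconds
            while time.monotonic() < deadline:
                if switch_pressed():
                    return command        # chosen while highlighted
                time.sleep(0.02)          # poll the switch at ~50 Hz
```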

29 Physical Therapist Evaluation
12 physical therapists at Spaulding Rehabilitation in Boston
System demo
Seen as a tool for training as well as for everyday use
Offered patients as future test subjects

30 Physical Therapist Evaluation
Suggested changes
–Sensor-guided driving for reverse
–Appearance of wheelchair
–Powered wheelchair brand

31 Summary
Research resulted in the first indoor/outdoor robotic wheelchair system
Navigation assistance reduces user effort and travel time
Easily customized user interface can be used with many different access methods

