Simplifying Wheelchair Mounted Robotic Arm Control with a Visual Interface
Katherine M. Tsui and Holly A. Yanco
University of Massachusetts Lowell, Computer Science Department, Robotics Laboratory
http://www.cs.uml.edu/robots
Collaborators
University of Central Florida: Aman Behal
Crotched Mountain Rehabilitation Center: David Kontak
Exact Dynamics: Gert Willem Römer
NSF IIS-0534364
Research Question
What is the most effective user interface to manipulate a robot arm?
Our target audience is power wheelchair users, specifically:
–Physically disabled, cognitively aware people.
–Cognitively impaired people who do not have fine motor control.
Hardware
Manus ARM by Exact Dynamics:
–6 DoF
–Joint encoders, slip couplings
–Cameras
Manual and computer control modes:
–Both are capable of individual joint movement and Cartesian movement of the wrist.
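A minimal sketch of the two command styles named above (individual joint movement versus Cartesian movement of the wrist). The class and field names are illustrative assumptions, not the Exact Dynamics API.

```python
from dataclasses import dataclass

# Hypothetical command types for the two control modes the ARM supports.
# Names and units are assumptions for illustration only.

@dataclass
class JointCommand:
    """Velocity for each of the 6 joints (deg/s)."""
    joint_velocities: tuple  # e.g. (5.0, 0.0, 0.0, 0.0, 0.0, 0.0)

@dataclass
class CartesianCommand:
    """Linear velocity of the wrist in the base frame (mm/s)."""
    vx: float
    vy: float
    vz: float

def describe(cmd) -> str:
    """Human-readable summary, e.g. for logging user-issued commands."""
    if isinstance(cmd, JointCommand):
        return f"joint move: {cmd.joint_velocities}"
    return f"cartesian wrist move: ({cmd.vx}, {cmd.vy}, {cmd.vz}) mm/s"
```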
Interface Design
The interface is compatible with single switch scanning.
Left:
–Original image is quartered.
–Quadrant containing the desired object is selected.
Middle:
–Selection is repeated a second time.
Right:
–Desired object is in 1/16th close-up view.
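A minimal sketch of the two-level quadrant selection described above: quartering the image, cropping to the chosen quadrant, and repeating once yields a 1/16th close-up. Function names and the quadrant numbering are illustrative assumptions.

```python
def quadrant_bounds(width, height, quadrant):
    """Return (x0, y0, x1, y1) pixel bounds for quadrant 0..3,
    numbered left-to-right, top-to-bottom (assumed numbering)."""
    half_w, half_h = width // 2, height // 2
    col, row = quadrant % 2, quadrant // 2
    return (col * half_w, row * half_h, (col + 1) * half_w, (row + 1) * half_h)

def close_up_bounds(width, height, first_pick, second_pick):
    """Two successive quadrant picks give a 1/16th-area region of the original image."""
    x0, y0, _, _ = quadrant_bounds(width, height, first_pick)
    qx0, qy0, qx1, qy1 = quadrant_bounds(width // 2, height // 2, second_pick)
    return (x0 + qx0, y0 + qy0, x0 + qx1, y0 + qy1)

# Example: a 640x480 camera image, object in the upper-right quadrant,
# then the lower-left quadrant of that close-up.
print(close_up_bounds(640, 480, first_pick=1, second_pick=2))  # (320, 120, 480, 240)
```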
User Testing: Hypotheses
H1: Users will prefer a visual interface to a menu-based system.
H2: With greater levels of autonomy, less user input is necessary for control.
H3: It should be faster to move to the target in computer control than in manual control.
User Testing: Experiment Participants
–12 able-bodied participants (10 male, 2 female)
–Ages 18 to 52
–67% technologically capable
–Computer usage per week (including job-related): 67% 20+ hours; 25% 10 to 20 hours; 8% 3 to 10 hours
–1/3 had prior robot experience: 1 in industry; 2 in a university course; 1 with “toy” robots
User Testing: Experiment Methodology
Two conditions were tested: manual control and computer control.
The input device was a single switch for both conditions.
Each user performed 6 runs (3 manual, 3 computer).
The starting condition was randomized and then alternated.
6 targets were randomly chosen.
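A sketch of how the counterbalanced run order described above could be generated (randomized starting condition, then alternation over six randomly ordered targets). The helper and target names are placeholders, not the study's actual procedure code.

```python
import random

def run_schedule(targets, seed=None):
    """Alternate manual and computer control across len(targets) runs,
    starting with a randomly chosen condition (hypothetical helper)."""
    rng = random.Random(seed)
    first = rng.choice(["manual", "computer"])
    second = "computer" if first == "manual" else "manual"
    conditions = [first if i % 2 == 0 else second for i in range(len(targets))]
    rng.shuffle(targets)
    return list(zip(conditions, targets))

# Example: 6 runs (3 manual, 3 computer) over 6 randomly ordered placeholder targets.
print(run_schedule(["target%d" % i for i in range(1, 7)], seed=42))
```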
User Testing: Experiment Methodology
The computer control implementation used during user testing did not yet include fine control or depth estimation.
In manual control, users were instructed to move the opened gripper “sufficiently close” to the target.
User Testing: Experiment Methodology
Manual control procedure, using the single switch and the single switch menu:
–Unfold the ARM.
–Using Cartesian movement, maneuver the opened gripper “sufficiently close” to the target.
User Testing: Experiment Methodology
Computer control procedure:
–Turn on the ARM.
–Select the image using the single switch.
–Select the major quadrant using the single switch.
–Select the minor quadrant using the single switch.
–Color calibrate using the single switch.
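A minimal sketch of single switch scanning, the selection mechanism used for each step above: options are highlighted in turn, and a press of the single switch selects whichever option is currently highlighted. The polling helper, dwell time, and option labels are assumptions for illustration.

```python
import itertools
import time

def scan_select(options, switch_pressed, dwell_seconds=1.0):
    """Cycle through options, highlighting each for dwell_seconds;
    return the option highlighted when the switch is pressed.
    switch_pressed is a callable polled while an option is highlighted."""
    for option in itertools.cycle(options):
        print(f"highlighting: {option}")
        deadline = time.monotonic() + dwell_seconds
        while time.monotonic() < deadline:
            if switch_pressed():
                return option
            time.sleep(0.01)

# Example with the four top-level quadrants; a real system would poll a physical switch.
# choice = scan_select(["upper-left", "upper-right", "lower-left", "lower-right"],
#                      switch_pressed=read_physical_switch)  # hypothetical reader
```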
User Testing: Results
H1: Users will prefer a visual interface to a menu-based system.
83% of participants stated a preference for manual control in exit interviews.
Likert-scale ratings of manual and computer control (1 to 5) showed no significant difference in user preference.
H1 was not supported. Why? The color calibration step.
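The slide does not name the statistical test behind "no significant difference." As one illustration only, a paired, non-parametric comparison of per-participant Likert ratings could look like the following; the ratings below are made-up placeholders, not the study's data.

```python
from scipy.stats import wilcoxon

# Placeholder 1-to-5 Likert ratings, one pair per participant (NOT the study's data).
manual_ratings   = [4, 5, 3, 4, 4, 5, 3, 4, 5, 4, 3, 4]
computer_ratings = [3, 4, 3, 4, 5, 4, 3, 3, 4, 4, 4, 3]

# Wilcoxon signed-rank test: paired comparison of the two conditions.
statistic, p_value = wilcoxon(manual_ratings, computer_ratings)
print(f"W = {statistic}, p = {p_value:.3f}")
```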
User Testing: Results
H2: With greater levels of autonomy, less user input is necessary for control.
In manual control, we counted the number of clicks each user executed during a run and divided by the run time, yielding average clicks per second.
In computer control, the number of clicks is fixed by the interface.
H2 was confirmed.
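The click-rate metric above, written out. The click counts and run times below are illustrative placeholders, not the study's measurements; in computer control the count is fixed by the selection steps listed earlier.

```python
def clicks_per_second(click_count, run_time_seconds):
    """Average user input rate for a run: total switch clicks divided by run duration."""
    return click_count / run_time_seconds

# Illustrative values only: a manual-control run with many switch clicks versus
# computer control, where only a small fixed number of selections is needed.
print(clicks_per_second(120, 300))  # manual example: 0.4 clicks/s
print(clicks_per_second(4, 150))    # computer example: ~0.03 clicks/s
```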
User Testing: Results
H3: It should be faster to move to the target in computer control than in manual control.
Distance-to-time ratio: moving distance X takes time Y.
Under computer control, the ARM moved farther in less time.
H3 was confirmed.
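The distance-to-time comparison, written out. The distances and times below are illustrative placeholders, not the study's measurements; a higher ratio means the ARM covered more distance per unit time under that control mode.

```python
def distance_time_ratio(distance_mm, time_seconds):
    """Average speed toward the target: distance covered divided by time taken."""
    return distance_mm / time_seconds

# Illustrative numbers only.
manual_ratio   = distance_time_ratio(400.0, 200.0)  # 2.0 mm/s
computer_ratio = distance_time_ratio(600.0, 120.0)  # 5.0 mm/s
print(manual_ratio, computer_ratio)
```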
Current/Future Work
Identify specific volunteers
User interface
User testing:
–H1
–Baseline evaluation
–Initial testing at Crotched Mountain
Integration with power wheelchair
Depth extraction
Object occlusion