Simplifying Wheelchair Mounted Robotic Arm Control with a Visual Interface
Katherine M. Tsui and Holly A. Yanco
University of Massachusetts, Lowell
Computer Science Department, Robotics Laboratory

Collaborators
– University of Central Florida: Aman Behal
– Crotched Mountain Rehabilitation Center: David Kontak
– Exact Dynamics: Gert Willem Romer
NSF IIS

Research Question
What is the most effective user interface to manipulate a robot arm?
Our target audience is power wheelchair users, specifically:
– Physically disabled, cognitively aware people.
– Cognitively impaired people who do not have fine motor control.

Hardware
Manus ARM by Exact Dynamics:
– 6 DoF
– Joint encoders, slip couplings
– Cameras
Manual and computer control modes:
– Both are capable of individual joint movement and Cartesian movement of the wrist.
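
The difference between the two movement types can be sketched with simple command structures. This is an illustrative assumption, not Exact Dynamics' actual API; the type and field names are invented:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class JointCommand:
    """Drive each of the ARM's 6 joints individually."""
    joint_velocities: Tuple[float, float, float, float, float, float]

@dataclass
class CartesianCommand:
    """Translate the wrist along x, y, z; the controller
    resolves the motion into joint movements."""
    dx: float
    dy: float
    dz: float

bend_elbow = JointCommand((0.0, 0.0, 0.1, 0.0, 0.0, 0.0))
lower_wrist = CartesianCommand(dx=0.0, dy=0.0, dz=-0.05)
```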

Interface Design
The interface is compatible with single switch scanning.
– Left: The original image is quartered, and the quadrant containing the desired object is selected.
– Middle: The selection is repeated a second time.
– Right: The desired object appears in a 1/16th close-up view.
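
The quartering logic is simple to state in code. A minimal sketch, assuming an (x, y, width, height) region representation and a 640x480 image; the interface's actual implementation may differ:

```python
def quadrant(region, q):
    """Return quadrant q of region (x, y, w, h):
    0 = top-left, 1 = top-right, 2 = bottom-left, 3 = bottom-right."""
    x, y, w, h = region
    half_w, half_h = w // 2, h // 2
    return (x + half_w * (q % 2), y + half_h * (q // 2), half_w, half_h)

# Two successive quadrant selections narrow a 640x480 image
# to a 1/16th close-up view of the desired object.
full_image = (0, 0, 640, 480)
major = quadrant(full_image, 1)   # top-right quadrant
minor = quadrant(major, 2)        # its bottom-left sub-quadrant
print(minor)                      # (320, 120, 160, 120)
```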

User Testing: Hypotheses
– H1: Users will prefer a visual interface to a menu-based system.
– H2: With greater levels of autonomy, less user input is necessary for control.
– H3: It should be faster to move to the target in computer control than in manual control.

User Testing: Experiment Participants
– 12 able-bodied participants (10 male, 2 female)
– Ages 18 to 52
– 67% technologically capable
– Computer usage per week (including job-related): 67% 20+ hours; 25% 10 to 20 hours; 8% 3 to 10 hours
– 1/3 had prior robot experience: 1 in industry, 2 in a university course, 1 with "toy" robots

User Testing: Experiment Methodology
– Two conditions were tested: manual and computer control.
– The input device was a single switch for both controls.
– Each user performed 6 runs (3 manual, 3 computer); the starting control was randomized and then alternated, as sketched below.
– 6 targets were randomly chosen.
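
A minimal sketch of that counterbalancing; the exact randomization scheme is an assumption based on the description above:

```python
import random

# Randomize which control the user starts with, then alternate,
# giving 6 runs: 3 manual and 3 computer.
start = random.choice(["manual", "computer"])
other = "computer" if start == "manual" else "manual"
run_order = [start, other] * 3
print(run_order)  # e.g. ['computer', 'manual', 'computer', 'manual', 'computer', 'manual']
```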

User Testing: Experiment Methodology
– Neither fine control nor depth was implemented in the computer control used during user testing.
– In manual control, users were instructed to move the opened gripper "sufficiently close" to the target.

User Testing: Experiment Methodology
Manual control procedure, using the single switch and single switch menu:
– Unfold the ARM.
– Using Cartesian movement, maneuver the opened gripper "sufficiently close" to the target.

User Testing: Experiment Methodology
Computer control procedure:
– Turn on the ARM.
– Select the image using the single switch.
– Select the major quadrant using the single switch.
– Select the minor quadrant using the single switch.
– Color calibrate using the single switch.
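
Each step above is one press of the switch during a scanning cycle. A minimal sketch of single switch scanning, where a highlight cycles through the options and a press selects the highlighted one; the dwell time and the stand-in switch are assumptions for illustration:

```python
import time

def single_switch_select(options, switch_pressed, highlight, dwell=1.0):
    """Cycle a highlight through the options; the first switch press
    selects whichever option is currently highlighted."""
    while True:
        for option in options:
            highlight(option)
            deadline = time.monotonic() + dwell
            while time.monotonic() < deadline:
                if switch_pressed():
                    return option
                time.sleep(0.01)

# Example: scanning the four quadrant choices. The lambda stands in
# for polling the user's physical switch hardware.
choice = single_switch_select(
    ["top-left", "top-right", "bottom-left", "bottom-right"],
    switch_pressed=lambda: time.monotonic() % 5 > 4,
    highlight=print,
)
print("selected:", choice)
```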

User Testing: Results
H1: Users will prefer a visual interface to a menu-based system.
– 83% stated a preference for manual control in exit interviews.
– Likert-scale ratings (1 to 5) of manual and computer control showed no significant difference in preference.
– H1 was not supported. Why? Color calibration.

User Testing: Results
H2: With greater levels of autonomy, less user input is necessary for control.
– In manual control, we counted the clicks each user executed during a run and divided by the run time, yielding average clicks per second.
– In computer control, the number of clicks is fixed.
– H2 was confirmed.
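
A hypothetical illustration of the comparison; the click counts and run time below are invented, not the study's data:

```python
# Manual control: clicks accumulate over the whole run.
manual_clicks = 84
manual_run_seconds = 120.0
clicks_per_second = manual_clicks / manual_run_seconds  # 0.7

# Computer control: a fixed number of selections per run
# (image, major quadrant, minor quadrant, color calibration).
computer_clicks = 4

print(f"manual: {clicks_per_second:.2f} clicks/s; "
      f"computer: {computer_clicks} clicks per run")
```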

User Testing: Results
H3: It should be faster to move to the target in computer control than in manual control.
– We compared distance-to-time ratios: moving a distance X takes time Y.
– Under computer control, the ARM moved farther in less time.
– H3 was confirmed.
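
A hypothetical illustration of the distance-to-time ratio; the distances and times are invented for the example:

```python
def distance_time_ratio(distance_m, time_s):
    """Average speed: distance the gripper moved over the run time."""
    return distance_m / time_s

manual_speed = distance_time_ratio(0.60, 95.0)    # about 0.006 m/s
computer_speed = distance_time_ratio(0.75, 40.0)  # farther, in less time
print(f"manual {manual_speed:.4f} m/s vs computer {computer_speed:.4f} m/s")
```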

Current/Future Work
– Identify specific volunteers
– User interface
– User testing:
  – H1
  – Baseline evaluation
  – Initial testing at Crotched Mountain
– Integration with power wheelchair
– Depth extraction
– Object occlusion