Summary of ARM Research: Results, Issues, Future Work. Kate Tsui, UMass Lowell, January 8, 2007.


Summary of ARM Research: Results, Issues, Future Work
Kate Tsui, UMass Lowell
January 8, 2007

Summary of Work
• Programming the ARM
• Accomplishments
• Challenges
• Future Work

Programming the ARM
• Control Modes
• Receiving and Sending Packets
• Structure of Communication Packet

Control Modes
• Manual and Transparent Control Modes
• Both are capable of Joint Movement and Cartesian Movement.
• Joint Movement: one of the six joints moves at a given time.
• Cartesian Movement: the wrist moves linearly in 3D space; joints may move simultaneously.

Control Modes: Manual Control
• The maximum velocity is 9 cm/s.
• Using Cartesian Movement, the ARM can move linearly only along X, Y, or Z (one axis at a time).
• The math processor handles safety checking, Cartesian coordinate transform checking, and calculation of the motor torques needed for the velocity inputs.

Control Modes: Transparent Mode
• The maximum velocity is 25 cm/s.
• Using Cartesian Movement, the ARM can move in X, Y, and Z simultaneously.
• The math processor is bypassed; safety checks are not done.
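To make the mode differences concrete, here is a minimal sketch; only the 9 cm/s and 25 cm/s limits come from the slides, while the enum, constant names, and clamping helper are illustrative assumptions rather than the actual ARM driver code.

```cpp
// Illustrative constants and velocity clamping for the two control modes.
// The numeric limits are from the slides; everything else is an assumed sketch.
#include <algorithm>
#include <cmath>

enum class ControlMode { Manual, Transparent };

constexpr double kMaxManualVelCmPerS      = 9.0;   // math processor active
constexpr double kMaxTransparentVelCmPerS = 25.0;  // math processor bypassed

// Clamp a requested Cartesian speed (cm/s) to the current mode's maximum.
double clampSpeed(ControlMode mode, double requestedCmPerS) {
    const double maxV = (mode == ControlMode::Manual)
                            ? kMaxManualVelCmPerS
                            : kMaxTransparentVelCmPerS;
    return std::min(std::fabs(requestedCmPerS), maxV);
}
```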

Communication: Receiving and Sending Packets
• A communication thread is spawned during program initialization.
• The design is based on the single-producer, single-consumer ("donut factory") problem.
• Incoming packets from the ARM are stored in a 10,000-slot reader buffer guarded by semaphores.
• Outgoing packets to the ARM are stored in a 10,000-slot writer buffer guarded by semaphores.
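A minimal single-producer/single-consumer bounded buffer along these lines can be sketched as follows; the Packet layout, names, and use of C++20 counting semaphores are assumptions for illustration, not the original implementation.

```cpp
// Minimal single-producer/single-consumer bounded buffer sketched with
// C++20 counting semaphores. Names (Packet, RX_SLOTS) are illustrative.
#include <array>
#include <cstdint>
#include <semaphore>

struct Packet { uint16_t id; std::array<uint8_t, 8> data; };

constexpr std::size_t RX_SLOTS = 10000;   // 10,000-slot reader buffer

std::array<Packet, RX_SLOTS> rxBuffer;
std::size_t rxHead = 0;                   // written only by the producer
std::size_t rxTail = 0;                   // written only by the consumer

std::counting_semaphore<RX_SLOTS> emptySlots{RX_SLOTS};  // free slots
std::counting_semaphore<RX_SLOTS> fullSlots{0};          // filled slots

// Called by the communication thread for each packet received from the ARM.
void producePacket(const Packet& p) {
    emptySlots.acquire();                 // wait for a free slot
    rxBuffer[rxHead] = p;
    rxHead = (rxHead + 1) % RX_SLOTS;
    fullSlots.release();                  // signal one more filled slot
}

// Called by the control loop to consume the next packet.
Packet consumePacket() {
    fullSlots.acquire();                  // wait for a filled slot
    Packet p = rxBuffer[rxTail];
    rxTail = (rxTail + 1) % RX_SLOTS;
    emptySlots.release();                 // free the slot
    return p;
}
```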

Communication: Receiving and Sending Packets
• The ARM sends a packet to the PC every 20 ms (hardware limitation).
• There are 3 types of incoming packets, with IDs 0x350, 0x360, and 0x37F.
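For illustration, incoming packets could be dispatched on their ID as sketched below; the handler names and Packet layout are placeholders, since the slides do not say what each ID carries.

```cpp
// Hedged sketch: dispatch on the three incoming packet IDs named on the
// slide (0x350, 0x360, 0x37F). What each ID carries is defined by the ARM's
// protocol documentation; the handler bodies here are empty placeholders.
#include <array>
#include <cstdint>

struct Packet { uint16_t id; std::array<uint8_t, 8> data; };  // as in the sketch above

void handlePacket350(const Packet&) { /* interpret per protocol spec */ }
void handlePacket360(const Packet&) { /* interpret per protocol spec */ }
void handlePacket37F(const Packet&) { /* interpret per protocol spec */ }

void dispatchIncoming(const Packet& p) {
    switch (p.id) {
        case 0x350: handlePacket350(p); break;
        case 0x360: handlePacket360(p); break;
        case 0x37F: handlePacket37F(p); break;
        default:    break;  // unknown ID: ignore or log
    }
}
```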

General Communication Packet

Cartesian Mode Packet Interpretation

Cartesian Mode Packet Transmission
• Velocity: p / (20 × 10^-3) mm/s, i.e., the position increment p (in mm) sent with each 20 ms packet.
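A worked check of the formula, assuming p is the position increment in millimetres carried by each 20 ms packet:

```latex
% Velocity implied by a per-packet position increment p (mm), given the
% 20 ms packet period stated on the earlier slide.
\[
  v = \frac{p}{20\,\mathrm{ms}}
    = \frac{p}{20 \times 10^{-3}\,\mathrm{s}}
    = 50\,p\ \mathrm{mm/s}
\]
% Example: p = 0.5 mm per packet gives v = 25 mm/s = 2.5 cm/s.
```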

Accomplishments
• Design of initial interface
• Demo at the Robotics: Science and Systems 2006 Conference Workshop on Manipulation for Human Environments
• User testing
• Paper presentation (forthcoming) at the AAAI-07 Spring Symposium Series workshop on Multidisciplinary Collaboration for Socially Assistive Robotics

Interface Design
• The interface is compatible with single switch scanning.
• Left: the original image is quartered; the quadrant containing the desired object is selected.
• Middle: the selection is repeated a second time.
• Right: the desired object is shown in a 1/16th close-up view.
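A minimal sketch of the two-stage quadrant selection follows; the Rect type, quadrant numbering, and the 640x480 example frame are illustrative assumptions, not the original interface code.

```cpp
// Sketch of the two-stage quadrant selection: each selection halves the view
// in x and y, so two selections leave 1/16th of the original image area.
struct Rect { int x, y, w, h; };

// Returns the sub-rectangle for quadrant q of r:
// 0 = top-left, 1 = top-right, 2 = bottom-left, 3 = bottom-right.
Rect selectQuadrant(const Rect& r, int q) {
    const int halfW = r.w / 2, halfH = r.h / 2;
    const int col = q % 2, row = q / 2;
    return { r.x + col * halfW, r.y + row * halfH, halfW, halfH };
}

// Example: a 640x480 frame, picking quadrant 1 then quadrant 2,
// yields a 160x120 (1/16th area) close-up region.
// Rect view{0, 0, 640, 480};
// view = selectQuadrant(view, 1);  // 320x240
// view = selectQuadrant(view, 2);  // 160x120
```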

Demo

User Testing: Hypotheses
• H1: Users will prefer a visual interface to a menu-based system.
• H2: With greater levels of autonomy, less user input is necessary for control.
• H3: It should be faster to move to the target in computer control than in manual control.

User Testing: Experiment
• Participants: 12 (10 male, 2 female)
• Ages: 18 to 52
• 67% technologically capable
• Computer usage per week (including job-related): 67% 20+ hours; 25% 10 to 20 hours; 8% 3 to 10 hours
• 1/3 had prior robot experience: 1 from industry, 2 from a university course, 1 with "toy" robots

User Testing: Experiment Methodology
• Two conditions were tested: manual control and computer control.
• The input device was a single switch for both conditions.
• Each user performed 6 runs (3 manual, 3 computer).
• The starting control condition was randomized and then alternated.
• 6 targets were randomly chosen.

User Testing: Experiment Methodology
• Neither fine control nor depth estimation was implemented in computer control during user testing.
• In manual control, users were instructed to move the opened gripper "sufficiently close" to the target.

User Testing: Experiment Methodology
• Manual Control Procedure, using the single switch and single-switch menu:
  • Unfold the ARM.
  • Using Cartesian movement, maneuver the opened gripper "sufficiently close" to the target.

User Testing: Experiment Methodology
• Computer Control Procedure:
  • Turn on the ARM.
  • Select the image using the single switch.
  • Select the major quadrant using the single switch.
  • Select the minor quadrant using the single switch.
  • Color calibrate using the single switch.
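For illustration, this procedure can be viewed as a fixed single-switch state sequence; the state names below are placeholders, not the original interface code.

```cpp
// Illustrative sketch of the computer-control procedure as a single-switch
// state sequence; each switch activation confirms the current step and advances.
enum class Step { TurnOnArm, SelectImage, SelectMajorQuadrant,
                  SelectMinorQuadrant, ColorCalibrate, Done };

Step nextStep(Step s) {
    switch (s) {
        case Step::TurnOnArm:           return Step::SelectImage;
        case Step::SelectImage:         return Step::SelectMajorQuadrant;
        case Step::SelectMajorQuadrant: return Step::SelectMinorQuadrant;
        case Step::SelectMinorQuadrant: return Step::ColorCalibrate;
        case Step::ColorCalibrate:      return Step::Done;
        default:                        return Step::Done;
    }
}
```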

User Testing: Results
H1: Users will prefer a visual interface to a menu-based system.
• 83% of participants stated a preference for manual control in exit interviews.
• Likert-scale ratings (1 to 5) of manual and computer control showed no significant difference in preference.
• H1 was not supported.
• Why? Color calibration.

User Testing: Results
H2: With greater levels of autonomy, less user input is necessary for control.
• In manual control, the number of clicks each user executed during a run was counted and divided by the run time, yielding average clicks per second.
• In computer control, the number of clicks is fixed.
• H2 was confirmed.

User Testing: Results
H3: It should be faster to move to the target in computer control than in manual control.
• Distance-to-time ratio: moving distance X takes time Y.
• Under computer control, the ARM moved farther in less time.
• H3 was confirmed.

Challenges
• Vision system
  • Shoulder camera
  • Gripper camera

Evolution of UML Vision System: Shoulder Camera

Evolution of UML Vision System: Gripper Camera

Current UML Vision System
• The shoulder (occupant's view) camera is a Canon VC-C50i pan-tilt-zoom camera.
• Specifications (NTSC):
  • 340,000 pixels
  • 460 horizontal lines, 350 vertical lines
  • 2:1 interlaced
  • 26x optical zoom
  • Focal length: 3.5 to 91.0 mm

Current UML Vision System
• The gripper camera is a CCD snake camera.
• Specifications (NTSC):
  • 1/4" color CCD
  • 510 x 492 pixels
  • 350 vertical lines
  • 2:1 interlaced
  • Focal length: 3.1 mm
• The processor board is located 30 cm from the CCD.

Gripper Camera Placement
• Our choice was to place the camera within the gripper.
• The camera is in line with the gripper's axis of movement.

Gripper Camera Concerns
• Wired: wires impede movement of the ARM.
• Wireless: image quality.
• Placement not within the gripper: not within the axis of movement; accidental knocking off of the camera; the folded position.

Current/Future Work
• Integration of the ARM with a power wheelchair
• Depth extraction (image registration, motion filter, optical flow)
• Occlusion
• User interface
• Initial testing at Crotched Mountain