Human-in-the-Loop Control of an Assistive Robot Arm Katherine Tsui and Holly Yanco University of Massachusetts, Lowell.

Challenge
Using a standard controller, put the ball in the cup.
1. Turn the cup over.
2. Pick up the ball.
3. Put the ball in the cup.
Average time of execution: ~3-5 minutes, even for middle school children with video game experience! Although this is a simplified example, similar tasks may occur repeatedly throughout a handicapped person's daily life.

Motivation
Why?
 Unintuitive controllers
 Operator sensory overload
For severely handicapped people, activities of daily living are difficult enough to perform. While an assistive robotic device like ours allows for limited independence, it can be frustrating and tiresome to operate. Let's abstract this away…

Hardware
Manus Assistive Robotic Manipulator (ARM) by Exact Dynamics
 6 Degrees of Freedom (DoF), plus a 2 DoF gripper end-effector
 Joint encoders
 Cameras: shoulder view and gripper view

Standard Control
Movement in the "out of the box" configuration is done through menus accessed via single-switch, keypad, or joystick input.

Standard Control: Using the Joint Menu
Direct Joint Mode
+ Direct control of individual joints
- Cannot move multiple joints simultaneously
- Not how humans do it… we don't think in terms like "move shoulder up, rotate wrist, extend forearm," etc.

Standard Control: Using the Cartesian Menu
Direct Cartesian Mode
+ Gripper moves linearly in 3D
+ Joints can move together in space and time
- Still not how humans do it… we don't think in terms of moving left, right, up, down, etc.

Alternative Control
Transparent Mode
 ARM has Controller Area Network (CAN) communication with the PC
 ARM transmits status packages at 20 ms intervals:
t=20 ms: message 0x350 gives ARM status and position
t=40 ms: message 0x360 gives gripper position
t=60 ms: message 0x37F asks for a return package
t=80 ms: message 0x350 …
 Every 60 ms, when message 0x37F is sent, movement information can be returned as ARM input.
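The 20 ms packet cycle above can be sketched in code. This is an illustrative model, not the authors' implementation: the message IDs (0x350, 0x360, 0x37F) come from the slide, while the function names and the idea of indexing the schedule by time slot are assumptions for the sketch.

```python
# Sketch of the Transparent Mode packet cycle: the ARM emits one status
# package every 20 ms, repeating a three-message schedule; every third
# slot (message 0x37F) is the PC's chance to send movement input back.

ARM_STATUS = 0x350   # ARM status and joint position
GRIPPER    = 0x360   # gripper position
REQUEST    = 0x37F   # ARM asks the PC for a return (command) package

SCHEDULE = [ARM_STATUS, GRIPPER, REQUEST]

def message_at(t_ms):
    """CAN message ID transmitted at time t_ms (a positive multiple
    of the 20 ms interval)."""
    slot = (t_ms // 20 - 1) % 3
    return SCHEDULE[slot]

def command_window(t_ms):
    """True when the PC may return movement information to the ARM."""
    return message_at(t_ms) == REQUEST
```

Under this model, `command_window` is true at t = 60 ms, 120 ms, 180 ms, …, matching the "every 60 ms" rule on the slide.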

How should the ARM move? Like humans do! Think: I want the cell phone. Actions: see, reach, grasp. However, the intended users may not be capable of all of these actions, so we simplify.

Selection Process
Given what the user sees directly ahead of them, and assuming the desired object is unobstructed and within reach… zoom in on the cell phone!

Movement
From the user's selection, we know the (x, y) position where the ARM should be. How do we move there? By using Phission and joint-encoder feedback to determine movement length, speed, and direction:
 Phission: we have trained on the color we want to track. While the center of the blob is not near the desired (x, y), move toward it.
 Feedback: monitor ARM status and position.
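The blob-tracking loop above amounts to a simple visual servo: step the gripper toward the selected point until the blob center is close enough. A minimal sketch, assuming a proportional controller; the function name, gain, and tolerance are illustrative choices, not details from the slides.

```python
# One step of a proportional visual-servo loop: given the tracked blob
# center (from the color tracker) and the user-selected target, return
# an (vx, vy) velocity toward the target, or None once close enough.

def servo_step(blob, target, gain=0.5, tol=5):
    """blob, target: (x, y) pixel coordinates; tol: pixels."""
    ex, ey = target[0] - blob[0], target[1] - blob[1]
    if abs(ex) <= tol and abs(ey) <= tol:
        return None                      # near the desired (x, y): stop
    return (gain * ex, gain * ey)        # move toward the target
```

Calling `servo_step` each time new blob and encoder data arrive reproduces the slide's rule: keep moving while the blob center is not near the desired (x, y).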

Drop for Z
Z = B·f / (x_L − x_R)
Depth information is deduced through simulated stereo vision: two images are taken sequentially as the gripper moves along the y-axis, so the baseline B is known. The disparity between the two images yields the depth Z, and the ARM moves "close" to the desired object.

Future Work  Distance sensing: laser  Non-rigid stereo vision