Development of Vision-Based Navigation for a Robotic Wheelchair
Matt Bailey, Andrew Chanler, Mark Micire, Katherine Tsui, and Holly Yanco (University of Massachusetts Lowell); Bruce Maxwell (Swarthmore College)

Outline
– Goal
– Redesign of Wheeley
– SLAM using stereo vision
– Human cue detection
– Manipulation
– Future work

Goal: How do I get to…?

Wheeley: Hardware
– Wheelesley v2
– Vector Mobility prototype chassis
– Differential drive
– RobotEQ AX2850 motor controller
– Custom PC
– Sensor platform
– Vision system

Wheeley: Robot Arm Exact Dynamic’s Manus Assistive Robotic Manipulator (ARM) –6+2 DoF –Joint encoders, slip couplings –14.3 kg –80 cm reach –20 N clamping force –1.5 kg payload capacity –Keypad, joystick, single switch input devices –Programmable Image by Exact Dynamics

Wheeley: Vision System
Manipulation
– Shoulder camera: Canon VC-C50i pan-tilt-zoom
– Gripper camera: PC229XP Snake Camera, 0.25 in x 0.25 in x 0.75 in

Wheeley: Vision System
Navigation
– Videre Design's STH-V1 stereo head
– 19 cm x 3.2 cm
– 69 mm baseline
– 6.5 mm focal length
– 60-degree FoV

SLAM using Stereo Vision
Why use vision instead of traditional ranging devices?
– Accuracy
– Cost
– Detail
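A stereo head can serve as a ranging device because depth follows directly from disparity via triangulation, Z = f·B/d. The sketch below is a minimal illustration (not code from this project) using the STH-V1 figures given on the previous slide: 69 mm baseline and 6.5 mm focal length, with disparity assumed to be measured on the sensor in the same units as the focal length.

```python
def stereo_depth_mm(disparity_mm: float,
                    baseline_mm: float = 69.0,
                    focal_mm: float = 6.5) -> float:
    """Depth from stereo disparity: Z = f * B / d.

    Defaults are the STH-V1 figures from the slide (69 mm baseline,
    6.5 mm focal length). Disparity must be expressed in the same
    units as the focal length (here: mm on the image sensor).
    """
    if disparity_mm <= 0:
        raise ValueError("disparity must be positive (point at infinity)")
    return focal_mm * baseline_mm / disparity_mm

# A 0.3 mm on-sensor disparity corresponds to about 1.5 m of range:
print(stereo_depth_mm(0.3))  # 1495.0 (mm)
```

This also shows why accuracy degrades with distance: depth is inversely proportional to disparity, so at long range a one-pixel disparity error spans a large depth interval.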

Vision and Mapping Libraries
– Phission
– Videre Design's Small Vision System (SVS)
– Simple Mapping Utility (pmap)
  – Laser-stabilized odometry
  – Particle-based mapping
  – Relaxation over local constraints
  – Occupancy grid mapping
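The occupancy-grid stage named above can be sketched with the standard log-odds update: each cell accumulates evidence of being occupied or free as range observations arrive. This is a generic illustration of the technique, not pmap's actual code, and the increment values are assumed tuning constants.

```python
import numpy as np

# Log-odds increments for an occupied / free observation (assumed values).
L_OCC, L_FREE = 0.85, -0.4

def update_cell(grid: np.ndarray, i: int, j: int, hit: bool) -> None:
    """Fuse one range observation into cell (i, j) in log-odds form."""
    grid[i, j] += L_OCC if hit else L_FREE
    grid[i, j] = np.clip(grid[i, j], -5.0, 5.0)  # avoid saturation

def occupancy_prob(grid: np.ndarray) -> np.ndarray:
    """Convert log-odds back to occupancy probability via the sigmoid."""
    return 1.0 / (1.0 + np.exp(-grid))

grid = np.zeros((100, 100))            # 0 log-odds == p = 0.5 (unknown)
for _ in range(3):
    update_cell(grid, 10, 20, hit=True)  # repeated hits -> likely occupied
print(occupancy_prob(grid)[10, 20])      # ~0.93
```

Working in log-odds keeps the per-observation update a cheap addition and makes repeated evidence combine multiplicatively in probability space.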

SLAM Data Flow

Results

Human Cue Detection
Swarthmore Vision Module (SVM)
– Basic text detector and optical character recognition
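As a toy illustration of the "basic text detector" idea (not the Swarthmore Vision Module's actual algorithm): printed text such as a door sign produces many strong horizontal intensity transitions, so image rows whose fraction of high-gradient pixels exceeds a threshold can be kept as candidates for the OCR stage. All names and thresholds below are hypothetical.

```python
import numpy as np

def text_candidate_rows(gray: np.ndarray, thresh: float = 0.15) -> list:
    """Flag image rows likely to contain text by horizontal-edge density.

    `gray` is a 2D grayscale image; a row is a candidate when more than
    `thresh` of its horizontal gradients exceed a fixed edge strength.
    """
    grad = np.abs(np.diff(gray.astype(float), axis=1))  # horizontal edges
    density = (grad > 50).mean(axis=1)                  # strong-edge fraction
    return [r for r, d in enumerate(density) if d > thresh]

# Synthetic example: a blank image with a striped band standing in for text.
img = np.zeros((8, 40), dtype=np.uint8)
img[3:5, ::2] = 255          # rows 3-4 alternate 0/255 like character strokes
print(text_candidate_rows(img))   # [3, 4]
```

In a full pipeline the candidate regions would then be cropped and handed to an OCR engine to read the sign text.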

Manipulation: Motivation
– Direct inputs from a 4x4 keypad, joystick, or single switch may not correlate well with the user's physical capabilities
– Layered menus micromanage the task and its progress
Image by Exact Dynamics

Manipulation: Visual Control

Manipulation: Experiments
Able-bodied participants, August 2006
– Confirmed: with greater levels of autonomy, less user input is necessary for control
– Confirmed: faster to move to the target under computer control
– Unconfirmed: preference for the visual interface
Target audience, Summer 2007
– Access methods
– Cognitive ability
– Recreation of previous experiment

Future Work
Additional Wheeley modifications:
– PC for mapping
– Mount touch-screen LCD
– New Videre stereo head
– Mount robotic arm
Integrate Wheelesley navigation

Acknowledgements
Research supported by NSF grants IIS , IIS , and IIS .
Collaborators:
– David Kontak, Crotched Mountain Rehabilitation Center
– GertWillem Romer, Exact Dynamics
– Aman Behal, University of Central Florida

Questions?