A Point-and-Click Interface for the Real World: Laser Designation of Objects for Mobile Manipulation. Presented by Gal Peleg, CSCI2950-Z, Brown University.

Presentation transcript:

A Point-and-Click Interface for the Real World: Laser Designation of Objects for Mobile Manipulation. Presented by Gal Peleg, CSCI2950-Z, Brown University, April 5, 2010. Paper by C. Kemp, C. Anderson, H. Nguyen, A. Trevor, and Z. Xu, Human-Robot Interaction (HRI), 2008.

Presentation Outline
- Motivation & Problem Statement
- Demonstration (Video)
- Prior & Related Work
- Implementation
- Experiments
- Results
- Discussion & Future Work

Motivation & Problem Statement
Robots need to be able to:
- manipulate common hand-held objects
- do so by receiving direction from people
The authors' approach:
- requires no instrumentation of the environment
- makes use of everyday pointing devices
- has low spatial error
- is fully mobile

Demonstration (Videos)
EL-E, an assistant robot, performs object retrieval, interacting with users through the clickable-world interface.

Prior & Related Work
Problems with 2D point-and-click GUIs:
- controlling from a remote location
- display size, device usability, and comfort
- requires that the user switch perspectives
Problems with natural pointing:
- communicating a precise location is difficult
- produces high uncertainty

Prior & Related Work (Continued)
Problems with intelligent devices:
- need specialized, networked, computer-operated, intelligent devices
- can't interact with the physical environment
Problems with "traditional" laser designation:
- tasks are constrained
- few well-modeled objects

Implementation
Three main steps, plus a follow-on fetching step (see the sketch below):
1. detect the laser spot in the field of view
2. move the stereo camera to look at the detected spot
3. estimate the 3D location of the spot relative to the robot's body
4. finally, move to and grasp the object
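
The paper does not include code; the following is a minimal sketch of how such a pipeline might be organized. The robot object and its omnicam, stereo_head, base, and arm interfaces are illustrative assumptions, not the authors' API.

# Hypothetical outline of the point-and-click pipeline described above.
# All method names are illustrative assumptions for this sketch.

def fetch_designated_object(robot):
    # 1. Detect the laser spot in the omnidirectional camera's field of view.
    spot_pixel = robot.omnicam.detect_laser_spot()
    if spot_pixel is None:
        return False

    # 2. Point the stereo camera at the detected spot.
    robot.stereo_head.look_at(spot_pixel)

    # 3. Estimate the spot's 3D location relative to the robot's body.
    target_xyz = robot.stereo_head.estimate_3d_location()
    if target_xyz is None:
        return False

    # 4. Drive to the target and attempt a grasp.
    robot.base.move_to(target_xyz)
    return robot.arm.grasp_at(target_xyz)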

Implementation (Continued)
Observing the environment:
- the authors use a catadioptric, omnidirectional camera placed at 1.6 m
- this height is well matched to human environments

Implementation (Continued)
Detecting the laser spot:
- the spot is highly visible to humans, but much harder for robots to pick out with ordinary video cameras
- so the robot uses a monochrome camera with a narrow-band green filter matched to the wavelength of green laser pointers
Estimating the 3D location (see the sketch below):
- detect the laser spot and estimate its 3D location eight times
- if successful, return the average; otherwise, keep looking
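
A rough illustrative sketch of this detect-and-average loop, not the authors' implementation: it assumes the filtered monochrome image arrives as a NumPy array, takes the brightest pixel as the spot candidate (the paper's actual detector may differ), and relies on hypothetical capture_pair and triangulate helpers for the stereo camera. The threshold and attempt budget are made-up values; the eight-sample average follows the slide above.

import numpy as np

N_SAMPLES = 8  # the slide above: average eight successful estimates

def detect_spot(filtered_image, threshold=250):
    # The narrow-band green filter leaves the laser spot as the brightest
    # region of the monochrome image; accept the peak pixel if bright enough.
    idx = np.unravel_index(np.argmax(filtered_image), filtered_image.shape)
    if filtered_image[idx] < threshold:
        return None
    return idx  # (row, col) of the candidate laser spot

def estimate_location(stereo_camera, max_attempts=40):
    # Collect eight 3D estimates of the spot and return their mean;
    # otherwise keep looking until the attempt budget runs out.
    samples = []
    for _ in range(max_attempts):
        left, right = stereo_camera.capture_pair()     # hypothetical helper
        pixel = detect_spot(left)
        if pixel is None:
            continue
        xyz = stereo_camera.triangulate(pixel, right)  # hypothetical helper
        if xyz is not None:
            samples.append(xyz)
        if len(samples) == N_SAMPLES:
            return np.mean(samples, axis=0)
    return None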

Experiment 1: Setup
Designating the positions of objects:
- 9 different household objects
- "office" environment
- 180 trials of laser detection and position estimation

Experiment 1: Results
- 178 successful "clicks" out of 179 attempts (99.4%)
- average error of 9.75 cm with respect to hand-measured locations

Experiment 2
Mobile manipulation: grasping selected objects
- here, the robot also approaches, grasps, and lifts the object
- it successfully grasped nine out of the ten objects

Discussion & Future Work
- One idea: combining the laser pointer interface with speech
- One problem: no studies were conducted with non-expert users
- Claim: "The current laser pointer interface is robust enough for realistic applications"
Your thoughts?