Vision-Based Interactive Systems
Martin Jagersand, CMPUT 610

Applications for vision in user interfaces
Interaction with machines and robots:
– Service robotics
– Surgical robots
– Emergency response
Interaction with software:
– A store or museum information kiosk

Service robots: mobile manipulators, semi-autonomous
[Images: example platforms from DIST, TU Berlin, and KAIST]

TORSO with two WAMs (Barrett Whole Arm Manipulator arms)

Service tasks
This demonstration is completely hardwired! We found no real (non-scripted) task demonstration on the WWW.

But maybe the first applications will be in tasks humans can't do?

Why is humanlike robotics so hard to achieve?
See the human task:
– Tracking motion, seeing gestures
Understand:
– Motion understanding: translate to the correct reference frame (see the sketch below)
– High-level task understanding?
Do:
– Vision-based control
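
As a generic illustration of the reference-frame step (my sketch, not from the original slides): a motion observed in the camera frame can be mapped into the robot's base frame with a rigid-body transform. The rotation, translation, and point values below are placeholders.

    import numpy as np

    # Homogeneous transform from camera frame to robot base frame.
    # In practice R and t would come from calibration or online estimation;
    # the values here are placeholders for the sketch.
    R = np.eye(3)                      # camera-to-base rotation
    t = np.array([0.5, 0.0, 0.2])      # camera position in the base frame (m)
    T_base_cam = np.eye(4)
    T_base_cam[:3, :3] = R
    T_base_cam[:3, 3] = t

    def to_base_frame(p_cam):
        """Map a 3-D point observed in the camera frame into the base frame."""
        p = np.append(np.asarray(p_cam, dtype=float), 1.0)  # homogeneous coordinates
        return (T_base_cam @ p)[:3]

    print(to_base_frame([0.1, -0.2, 0.8]))  # e.g. a tracked hand position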

Types of robotic systems
[Diagram: systems arranged along axes of autonomy and generality]
– Supervisory control
– Tele-assistance
– Programming by demonstration
– Preprogrammed systems

Interaction styles: Conventional
[Example: textual programming, "if A then ... end"]
– Low-bandwidth interaction
– Partial or indirect system state displayed
– User works from an internal mental model

Interaction styles: Direct manipulation
– High-bandwidth interaction
– Interact directly and intuitively with objects (affordance)
– See system state (visibility)
– (Reversible actions)

Examples of direct manipulation
– Drawing programs, e.g. MacPaint
– Video games, flight simulators
– Robot/machine teaching by showing
– Tele-assistance
– Spreadsheet programs
– Some window-system desktops
But can you always see the effects (visibility)?

xfig drawing program
– Icons afford use
– Results visible
– Direct spatial action-result mapping

The same drawing in MATLAB, by contrast, is symbolic:

    line([10, 20], [30, 85]);     % line segment from (10,30) to (20,85)
    patch([35, 22], [15, 35], C); % filled patch; C is a colour spec (a complex structure)
    text(70, 30, 'Kalle');        % potentially add font, size, etc.

Why direct manipulation?
– Recognition is quicker than recall
– Humans use "the world" as memory/model
– Humans are skilled at interacting spatially
How quick is "direct"? Subsecond! Experiments show human performance degrades at a 0.4 s delay.

Vision- and touch-based UI
– Typical UI today: symbolic, 1D (sliders), 2D
– But humans are skilled at 3D, 6D, and n-D spatial interaction with the world
– Vision and touch support direct manipulation!

Seeing a task
Tracking movement:
– See directions and movements in tasks
Recognizing gestures:
– Static hand and body postures
Combination: spatio-temporal gestures

Tracking movement
Tracking the human is hard:
– Appearance varies
– Large search space (some 60 parameters)
– Unobservable state: joint angles have to be inferred from limb positions, clothing, etc.
– Motion is non-linear
– Difficult to track 3D from 2D image-plane information
– Self-occlusion of limbs

Trick 1: Physical model
Reduce the number of DOFs with a coupled model of articulated motion (Hedvig, Mike). See the sketch below.
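
A minimal sketch of the idea (my illustration, not the authors' tracker): modelling the arm as a kinematic chain means the tracker searches over a few joint angles instead of an independent pose for every limb. The link lengths and angles below are made up.

    import numpy as np

    def arm_points(shoulder, joint_angles, link_lengths):
        """Planar forward kinematics: joint angles -> limb endpoints.
        Tracking 2 coupled angles replaces tracking 2 independent limb poses."""
        pts = [np.asarray(shoulder, dtype=float)]
        theta = 0.0
        for ang, length in zip(joint_angles, link_lengths):
            theta += ang  # each angle is relative to the previous link (the coupling)
            pts.append(pts[-1] + length * np.array([np.cos(theta), np.sin(theta)]))
        return pts  # e.g. shoulder, elbow, wrist positions

    # Usage: a 2-link arm with the shoulder at the origin
    print(arm_points((0.0, 0.0), joint_angles=[np.pi / 4, -np.pi / 6],
                     link_lengths=[0.30, 0.25]))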

Trick 2: Exploit the uniqueness of skin color
Skin-colored regions can be tracked in real time. See the sketch below.
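
A minimal real-time sketch of the general technique using OpenCV (not the specific tracker from the talk), assuming a webcam at index 0; the HSV thresholds are illustrative and need tuning per camera and lighting.

    import cv2
    import numpy as np

    LOWER = np.array([0, 40, 60], dtype=np.uint8)     # illustrative HSV skin range
    UPPER = np.array([25, 180, 255], dtype=np.uint8)

    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, LOWER, UPPER)         # binary skin mask
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
        m = cv2.moments(mask)
        if m["m00"] > 0:                              # centroid of skin blob = hand position
            cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
            cv2.circle(frame, (cx, cy), 8, (0, 255, 0), -1)
        cv2.imshow("skin tracking", frame)
        if cv2.waitKey(1) & 0xFF == 27:               # Esc quits
            break
    cap.release()
    cv2.destroyAllWindows()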

Gestures
Identifying gestures is hard:
– Hard to segment hand parts
– Self-occlusion
– Variability in viewpoints

Trick 3: Scale space
Define the hand gesture in coarse-to-fine terms. See the sketch below.
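
A sketch of the coarse-to-fine idea under my own assumptions (not the authors' method): search for a hand template only at the coarsest level of a Gaussian pyramid, then refine the match location in a small window at each finer level. A grayscale hand template captured beforehand would be passed as `template`.

    import cv2

    def coarse_to_fine_match(image, template, levels=3):
        """Locate template in image: full search only at the coarsest
        pyramid level, then refinement in a small window per finer level."""
        imgs, tpls = [image], [template]
        for _ in range(levels - 1):
            imgs.append(cv2.pyrDown(imgs[-1]))
            tpls.append(cv2.pyrDown(tpls[-1]))
        res = cv2.matchTemplate(imgs[-1], tpls[-1], cv2.TM_CCOEFF_NORMED)
        _, _, _, (x, y) = cv2.minMaxLoc(res)          # best match at coarsest level
        for lvl in range(levels - 2, -1, -1):
            x, y = 2 * x, 2 * y                       # scale the location up one level
            th, tw = tpls[lvl].shape[:2]
            y0, x0 = max(y - 4, 0), max(x - 4, 0)     # small refinement window
            win = imgs[lvl][y0:y0 + th + 8, x0:x0 + tw + 8]
            res = cv2.matchTemplate(win, tpls[lvl], cv2.TM_CCOEFF_NORMED)
            _, _, _, (dx, dy) = cv2.minMaxLoc(res)
            x, y = x0 + dx, y0 + dy
        return x, y                                   # top-left corner of the match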

Trick 4: Variability filters

Programming by demonstration
– From assembly relations
– From a temporal assembly sequence (segmenting the manipulation sequence into parts, i.e. subtasks, is hard; one heuristic is sketched below)
– Using a gesture language
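
One common segmentation heuristic (my illustration; the slides do not specify a method) splits a demonstrated trajectory at pauses, i.e. frames where hand speed drops near zero. The thresholds below are arbitrary.

    import numpy as np

    def segment_by_pauses(positions, dt=1 / 30, speed_thresh=0.02, min_len=5):
        """Split a demonstrated trajectory (N x 3 hand positions) into subtasks
        wherever the hand speed drops below a threshold (a pause)."""
        pos = np.asarray(positions, dtype=float)
        speed = np.linalg.norm(np.diff(pos, axis=0), axis=1) / dt
        moving = speed > speed_thresh
        segments, start = [], None
        for i, m in enumerate(moving):
            if m and start is None:
                start = i                             # motion begins
            elif not m and start is not None:
                if i - start >= min_len:              # ignore tiny jitters
                    segments.append((start, i))
                start = None
        if start is not None and len(moving) - start >= min_len:
            segments.append((start, len(moving)))
        return segments  # list of (start_frame, end_frame) index pairs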

Tele-assistance: Gestures + context

Robust manipulations

Conclusions
– Most aspects of "robot see, robot do" are hard
– Conventional methods are:
  – incapable of seeing the task
  – incapable of understanding what's going on
  – incapable of performing human manipulation tasks
– Uncalibrated methods are more promising