Robotic Perception and Action - Project

Presentation transcript:

Robotic Perception and Action - Project: Augmented Reality for the "Man in the Loop"

NOTE: Introduce the animation and AR in Industry 4.0. Ask Kato and De Paolis whether they want to share their ppt material.

Project – prerequisites
- Introduction to AR
- AR interaction technologies
- ToF cameras (HW)
- UNITY basics
- C#
- State machine with UNITY
- How to display an image on the projector with UNITY
- A software based on a state-machine structure, able to interact with the user, with UNITY
- Identification and localization module: superquadrics, optimization for fitting, RANSAC; exercises with MATLAB and PCL (C++)
- Calibration of ToF camera, traditional cameras and projectors on a plane
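As a pointer for the identification prerequisite, the superquadric inside-outside function below is the standard formulation used as a fitting residual. This is a generic C# sketch; the names and structure are illustrative, not the course's reference code.

using System;

// Superquadric inside-outside function: F == 1 on the surface,
// F < 1 inside, F > 1 outside. a1, a2, a3 are the semi-axes;
// e1, e2 control the squareness/roundness of the shape.
static class Superquadric
{
    public static double InsideOutside(double x, double y, double z,
                                       double a1, double a2, double a3,
                                       double e1, double e2)
    {
        double u = Math.Pow(Math.Abs(x / a1), 2.0 / e2)
                 + Math.Pow(Math.Abs(y / a2), 2.0 / e2);
        return Math.Pow(u, e2 / e1) + Math.Pow(Math.Abs(z / a3), 2.0 / e1);
    }

    // One residual per measured 3D point; a least-squares optimizer
    // minimizes the sum of squared residuals over the shape parameters.
    public static double Residual(double x, double y, double z,
                                  double a1, double a2, double a3,
                                  double e1, double e2)
    {
        return InsideOutside(x, y, z, a1, a2, a3, e1, e2) - 1.0;
    }
}

RANSAC can make the fit robust by repeating the optimization on random subsets of the ToF points and keeping the model with the most inliers.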

Project – components
- A projector to display indications to the user on a table (HW)
- A table with retro-reflective sheets? (HW)
- A Kinect v2 ToF camera (HW)
- A software based on a state-machine structure, able to interact with the user (SW)
- An identification and localization module (SW)
- Calibration of the ToF camera with the projector and with the table (table and projector could be aligned simply by positioning the projector appropriately)

An example of setup from literature
Jaakko Hyry, Max Krichenbauer, Goshiro Yamamoto, Takafumi Taketomi, Christian Sandor, Hirokazu Kato, Petri Pulli, "Design of Assistive Tabletop Projector-Camera System for the Elderly with Cognitive and Motor Skill Impairments".

Project – interaction sequence
The user requests help (voice, gesture or virtual button on the table). The system then:
1. Item := 1
2. Detect the item to manipulate
3. Highlight it to the user
4. Animate the action to be performed
5. Verify that the item was moved
6. Ask whether the step was performed correctly
7. Item := Item + 1
8. If the procedure is not finished, go to point 2

Project – modules to develop
Each module below lists its description and the software to use; each module will be assigned to a group.

State Machine
- Main software that manages the HMI interaction; main projected interface.
- HW: virtual buttons with retro-reflective material.
- Software: C# - UNITY

Objects Localization with AR Toolkit
- Extraction of the markers placed on the objects (position and location in projector space).
- Calibration of camera/projector/furniture.
- HW: simulated kitchen furniture.

Skeleton acquisition
- Skeleton acquisition with the ToF camera (Kinect).
- Control of task execution.
- Software: UNITY

Animations
- Development of the needed animations (for example with Blender).
- Software: UNITY - Blender

Objects Localization with ToF camera
- Point-cloud extraction and its use for object recognition and localization.
- Software: C++ (PCL)

Sensor Fusion
- Fusion of the object locations estimated with AR Toolkit and with the ToF camera.

1. State Machine
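A minimal sketch of how this module might be structured, assuming a Unity MonoBehaviour that drives the interaction sequence from the earlier slide; all the method names below are hypothetical placeholders for the other groups' modules.

using UnityEngine;

// State machine for the guided-assembly interaction loop.
public class AssemblyGuide : MonoBehaviour
{
    enum State { Idle, Detect, Highlight, Animate, Verify, Confirm, Done }

    State state = State.Idle;
    int item = 1;

    void Update()
    {
        switch (state)
        {
            case State.Idle:      // wait for a help request (voice, gesture, virtual button)
                if (UserAskedForHelp()) { item = 1; state = State.Detect; }
                break;
            case State.Detect:    /* locate the item to manipulate */ state = State.Highlight; break;
            case State.Highlight: /* project a highlight on it     */ state = State.Animate;   break;
            case State.Animate:   /* play the action animation     */ state = State.Verify;    break;
            case State.Verify:    // check (e.g., via the ToF camera) that the item moved
                if (ItemWasMoved()) state = State.Confirm;
                break;
            case State.Confirm:   // ask the user whether the step was done correctly (not shown)
                item++;
                state = ProcedureFinished() ? State.Done : State.Detect;
                break;
        }
    }

    // Placeholder hooks, to be wired to the real modules.
    bool UserAskedForHelp()  { return Input.GetKeyDown(KeyCode.Space); }
    bool ItemWasMoved()      { return true; }
    bool ProcedureFinished() { return item > 5; }
}

Each placeholder would call into the localization, projection, animation and verification modules as they become available.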

2. Objects Localization with AR Toolkit
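Marker positions detected by AR Toolkit live in camera image coordinates. Assuming the camera/projector calibration on the table plane is summarized by a 3x3 homography H (estimated offline from point correspondences, e.g. four projected and detected points), a sketch of the mapping to projector pixels could look like this; the helper below is hypothetical, not AR Toolkit API.

using UnityEngine;

// Maps a point on the table plane from camera image coordinates
// to projector pixel coordinates via a precomputed homography.
public static class TablePlaneMapping
{
    // H is row-major: [h0 h1 h2; h3 h4 h5; h6 h7 h8].
    public static Vector2 CameraToProjector(Vector2 p, float[] H)
    {
        float w = H[6] * p.x + H[7] * p.y + H[8];   // projective scale
        return new Vector2(
            (H[0] * p.x + H[1] * p.y + H[2]) / w,
            (H[3] * p.x + H[4] * p.y + H[5]) / w);
    }
}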

3. Skeleton acquisition
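A minimal sketch of body tracking with the Kinect v2, assuming the Kinect for Windows SDK 2.0 Unity add-in (Windows.Kinect namespace) is installed; here the right-hand joint is logged so that the state machine can later verify task execution.

using UnityEngine;
using Windows.Kinect;   // Kinect SDK 2.0 Unity add-in (assumed available)

public class SkeletonReader : MonoBehaviour
{
    KinectSensor sensor;
    BodyFrameReader reader;
    Body[] bodies;

    void Start()
    {
        sensor = KinectSensor.GetDefault();
        reader = sensor.BodyFrameSource.OpenReader();
        bodies = new Body[sensor.BodyFrameSource.BodyCount];
        sensor.Open();
    }

    void Update()
    {
        using (BodyFrame frame = reader.AcquireLatestFrame())
        {
            if (frame == null) return;          // no new frame this tick
            frame.GetAndRefreshBodyData(bodies);
        }
        foreach (Body body in bodies)
        {
            if (body == null || !body.IsTracked) continue;
            // Right-hand position in camera space, in meters.
            CameraSpacePoint hand = body.Joints[JointType.HandRight].Position;
            Debug.Log("Right hand at (" + hand.X + ", " + hand.Y + ", " + hand.Z + ")");
        }
    }

    void OnDestroy()
    {
        if (reader != null) reader.Dispose();
        if (sensor != null && sensor.IsOpen) sensor.Close();
    }
}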

4. Animations
Link to some examples: https://www.youtube.com/watch
2D: https://www.youtube.com/watch?v=RKEuxDXpcFA
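Once a step animation authored in Blender is imported as a clip and wired into a Unity Animator Controller, the state machine only needs to fire a trigger. A sketch, assuming trigger names like "Step1" are defined in the controller (the names are hypothetical):

using UnityEngine;

// Plays the animation for the current assembly step.
public class StepAnimator : MonoBehaviour
{
    Animator animator;

    void Start()
    {
        animator = GetComponent<Animator>();
    }

    // Called by the state machine when entering the Animate state.
    public void PlayStep(int item)
    {
        animator.SetTrigger("Step" + item);   // e.g., "Step1", "Step2", ...
    }
}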

5. Objects Localization with ToF camera
Link to our project: http://www.robosense.it/index.php/component/allvideoshare/video/tof-pallet-identification-mocup?Itemid=172
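This module is planned in C++ with PCL; to keep all examples in one language, here is a C# sketch of the underlying idea only (not the PCL API): RANSAC-fitting the dominant plane so the table can be removed before clustering the remaining points into objects.

using System.Collections.Generic;
using UnityEngine;

public static class PlaneRansac
{
    // Finds (n, d) with n·p + d ≈ 0 for the plane supported by most points.
    public static void Fit(List<Vector3> pts, out Vector3 bestN, out float bestD,
                           int iterations = 200, float threshold = 0.01f)
    {
        var rng = new System.Random();
        bestN = Vector3.up; bestD = 0f;
        int bestCount = -1;

        for (int it = 0; it < iterations; it++)
        {
            // Hypothesize a plane from three random points.
            Vector3 a = pts[rng.Next(pts.Count)];
            Vector3 b = pts[rng.Next(pts.Count)];
            Vector3 c = pts[rng.Next(pts.Count)];
            Vector3 n = Vector3.Cross(b - a, c - a);
            if (n.sqrMagnitude < 1e-10f) continue;   // degenerate sample
            n.Normalize();
            float d = -Vector3.Dot(n, a);

            // Score: number of points within the distance threshold (meters).
            int count = 0;
            foreach (Vector3 p in pts)
                if (Mathf.Abs(Vector3.Dot(n, p) + d) < threshold) count++;

            if (count > bestCount) { bestCount = count; bestN = n; bestD = d; }
        }
    }
}

Points whose distance to the fitted plane exceeds the threshold are the off-table candidates to be clustered and matched against the object models (e.g., the superquadrics above).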

6. Sensor Fusion
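A minimal sketch of one possible fusion rule, inverse-variance weighting of the two position estimates; the module might instead use a full Kalman filter, and the variance inputs below are assumptions, not measured values.

using UnityEngine;

// Combines the AR Toolkit and ToF position estimates of an object,
// weighting each by the inverse of its (scalar) variance.
public static class PositionFusion
{
    public static Vector3 Fuse(Vector3 pAr, float varAr,
                               Vector3 pTof, float varTof)
    {
        float wAr = 1f / varAr, wTof = 1f / varTof;
        return (wAr * pAr + wTof * pTof) / (wAr + wTof);
    }
}

With equal variances this reduces to the plain average; as one sensor's variance grows, its estimate is progressively ignored.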