Modular components to be re-used in future years.
Tasks and team:
1. Robot graphical user interface (Zwivhuya Tshitovha)
2. Robot control interface (Jaco Colyn)
3. Robot visual system (Jonathan Dowling)
8 Masters students working in the Robotics and Agents lab.
Reduce risk for rescue workers.
- Map generation
- Rescue tasks
- Detection and localisation of victims
- Detection of hazardous materials
Disasters in which robots have been used:
- Japan nuclear disaster
- Three Mile Island nuclear disaster
- World Trade Centre terrorist attack
Usage in the Japan nuclear disaster and at the World Trade Centre.
World Trade Centre (continued)
How can an effective system be built to automatically detect simulated disaster-zone objects?
- Human victims
- Hazmat signs
- Rolling E's
Using the normal IR cameras available on the robot, and applying existing computer vision techniques to extract the features (see the sketch below).
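A minimal sketch of one such "existing computer vision technique" applied to one of the targets: locating a hazmat sign by template matching with OpenCV. The file names are hypothetical placeholders, not project assets, and the confidence threshold is an arbitrary illustration value.

```python
# Hypothetical hazmat-sign localisation by template matching (OpenCV).
import cv2

scene = cv2.imread("camera_frame.png", cv2.IMREAD_GRAYSCALE)        # placeholder frame
template = cv2.imread("hazmat_flammable.png", cv2.IMREAD_GRAYSCALE)  # placeholder sign template

# Slide the template over the scene and score each position.
result = cv2.matchTemplate(scene, template, cv2.TM_CCOEFF_NORMED)
_, max_val, _, max_loc = cv2.minMaxLoc(result)

if max_val > 0.8:  # arbitrary confidence threshold for the sketch
    h, w = template.shape
    top_left = max_loc
    bottom_right = (top_left[0] + w, top_left[1] + h)
    print("Hazmat sign candidate at", top_left, bottom_right, "score", max_val)
```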
How can a usable and efficient GUI be built by hiding non-essential information and preventing sensory overload?
How can an intuitive robot control system be created by mapping the robot's functions onto a human interface device?
RoboCup: an international organisation based in Sweden.
Urban search and rescue tests.
Simulate human subjects and disaster environments.
Robot vision tests:
- Rolling E's
- Hazmat signs
- Victim detection
Remote operation of the robot through the competition arena to carry out tasks such as retrieving an object.
Client for the software engineering project: Stephen Marais from the Robotics and Agents Lab.
The deliverables required for this project are:
- A visual system for the robot that can pass the "rolling E's" test
- Hazardous material sign detection
- Human victim detection
- An easy-to-use interface for controlling the robot via a human interface device
- An intelligent graphical user interface that highlights only important information to the operator
Goals and challenges:
- Driving the robot
- Controlling the robotic arm
- Adjusting the robot's tracks
No direct line of sight to the robot: the robot is "on its own" and has to return by itself.
Prevent:
- Falling over
- Driving out of range
Efficient control by a single teleoperator:
- Reduce fatigue
- Interpret and perform commands in real time
- Reduce input delay ("lag")
- Efficient mapping of commands to the control interface (see the sketch after this list)
Micire, M., McCann, E., Desai, M., Tsui, K.M., Norton, A. and Yanco, H.A. (2011), Hand and Finger Registration for Multi-Touch Joysticks on Software-Based Operator Control Units, 88-93.
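A minimal sketch of the "efficient mapping of commands" idea: converting two gamepad axes into left/right track speeds for a differential-drive robot, with a deadzone to suppress stick noise. The axis ranges, deadzone value, and function name are assumptions for illustration, not the project's actual controller mapping.

```python
# Hypothetical mapping of joystick axes to track speeds for a tracked robot.
def axes_to_track_speeds(forward_axis: float, turn_axis: float,
                         deadzone: float = 0.1) -> tuple[float, float]:
    """Map joystick axes in [-1, 1] to (left, right) track speeds in [-1, 1]."""
    def squash(v: float) -> float:
        # Ignore small stick deflections so the robot holds still at rest.
        return 0.0 if abs(v) < deadzone else v

    forward = squash(forward_axis)
    turn = squash(turn_axis)
    left = max(-1.0, min(1.0, forward + turn))
    right = max(-1.0, min(1.0, forward - turn))
    return left, right

# Example: full forward with a slight right turn.
print(axes_to_track_speeds(1.0, 0.2))  # -> (1.0, 0.8)
```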
Designing a "clean" Graphical User Interface (GUI):
- Reducing sensory overload, which leads to fatigue
- Relaying the locations of important objects that cannot be seen directly, only through the control interface and GUI
- The interface must provide situational awareness
Iterative and incremental software engineering.
Human Computer Interaction (HCI) and Human Robot Interaction (HRI).
Use case scenarios taken from Urban Rescue Scenarios.
Goals and challenges:
- Detect human body parts (primarily limbs)
- Detect and identify hazmat signs
- Detect the "rolling E"
Input: a stream of video.
The general procedure:
1. Transform the image data into a simpler representation.
2. Apply a machine learning algorithm to the transformed data to detect features.
(A sketch of this pipeline follows below.)
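A sketch of one possible realisation of this pipeline using OpenCV, which is listed later under software tools: the HOG descriptor supplies the "simpler representation" and OpenCV's pretrained SVM people detector is the "machine learning algorithm". The video source index and frame size are placeholders, not the project's actual settings.

```python
# Hypothetical victim-detection pipeline: HOG features + pretrained SVM (OpenCV).
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

cap = cv2.VideoCapture(0)  # input stream of video (placeholder device index)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.resize(frame, (640, 480))
    # Transform windows of the image into HOG features and classify them with the SVM.
    boxes, weights = hog.detectMultiScale(frame, winStride=(8, 8))
    for (x, y, w, h) in boxes:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("victim-detection sketch", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```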
Robot control interface and GUI:
- User-centred design approach
- Qualitative and quantitative results
- Feedback can be generated from test subjects asked to complete specific tasks using the interface
Robot vision:
- Performance measurements and quantitative results from simulation testing (a metrics sketch follows below) of:
  ▪ Hazmat signs
  ▪ Human body parts
  ▪ Rolling "E"
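A minimal sketch of the quantitative side of this evaluation: computing precision and recall for one detector (for example, hazmat signs) from labelled test frames. The counts in the example call are made-up placeholders, not measured project results.

```python
# Hypothetical detection metrics from simulation testing.
def precision_recall(true_pos: int, false_pos: int, false_neg: int) -> tuple[float, float]:
    """Return (precision, recall) for one detector's results."""
    precision = true_pos / (true_pos + false_pos) if (true_pos + false_pos) else 0.0
    recall = true_pos / (true_pos + false_neg) if (true_pos + false_neg) else 0.0
    return precision, recall

# Example with placeholder counts.
print(precision_recall(true_pos=42, false_pos=8, false_neg=6))  # -> (0.84, 0.875)
```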
Requirements: RoboCup Rescue Competition (RRC), Urban Search and Rescue.
Initial testing on each sub-system; final testing on the system as a whole, based on expected challenges in the RRC.
Demonstrate its capabilities on dummies.
No external users, because the robot is quite expensive.
Ethical clearance.
Robot vision: Mikolajczyk et al. [1]
Robot control and interface: Adams [2], Richer et al. [3]
Team Description Papers: Team RoBrno 2003 [4]
Hardware:
- 19 inch widescreen
- Robot
- Human interface device
Software:
- OpenCV
- Bosch SDK
- ROS
- Accessing the robot
- Robot hardware damage
- Missing project milestones
The system will be used for a rescue robot being developed by the Robotics and Agents Research Laboratory.
Key success factors:
- Robot vision: accurate detection in the shortest time
- Robot interface and control: usable and efficient, measured by time and effort
[1] Mikolajczyk, K., Schmid, C. & Zisserman, A. 2004, 'Human detection based on a probabilistic assembly of robust part detectors', European Conference on Computer Vision, Oxford, United Kingdom.
[2] Adams, J.A. 2002, 'Critical Considerations for Human-Robot Interface Development', Proceedings of the 2002 AAAI Fall Symposium, Rochester Institute of Technology, Rochester, New York.
[3] Richer, J. & Drury, J.L. 2006, 'A Video Game-Based Framework for Analyzing Human-Robot Interaction: Characterizing Interface Design in Real-Time Interactive Multimedia Applications', MITRE Corporation, United States of America.
[4] Uschin, K., Inteam, S., Sae-eaw, N., Trakultongchai, A., Klairith, P., Jitviriya, W. & Jenkittiyon, P., 'RoboCup Rescue 2009 - Robot League Team iRAP_Pro (Thailand)', King Mongkut's University of Technology North Bangkok (KMUTNB), Bangkok, Thailand.
?