Robotic Perception and Action


Robotic Perception and Action: ROS Homework

Rules
Create a ROS package called homework in the current workspace. All the directories, files, and generated code must be inside the homework directory.

1) Robot Modeling and Simulation
Create a new robot and simulate it in a virtual world in Gazebo. The robot structure must be different from the robot explained during the lessons; the differences can be in the robot size, the number of wheels, or the number and type of sensors, etc.
Gazebo plugin requirements:
- Differential drive
- Sensor: laser or camera (Kinect, RealSense, ...)

2) Data Acquisition
Create a subscriber node called acquire_data.cpp with the aim of acquiring data from the robot. The data to collect comes from the odometry (pose) and from the sensor (laser scan or point cloud), during a motion of the robot. Show the acquired data on the terminal while the robot is driven manually (with the teleop_twist_keyboard node) in the virtual world simulated in Gazebo. The subscribers are:
- Odometry
- Sensor data
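
For reference, a minimal sketch of what acquire_data.cpp could look like, assuming a ROS 1 (roscpp) setup in which the Gazebo plugins publish nav_msgs/Odometry on odom and sensor_msgs/LaserScan on scan; the topic names depend on the plugin configuration and are only assumptions here.

// acquire_data.cpp - minimal sketch of the data-acquisition subscriber node.
// Topic names (odom, scan) are assumptions and may need to be remapped.
#include <ros/ros.h>
#include <nav_msgs/Odometry.h>
#include <sensor_msgs/LaserScan.h>

// Print the robot pose taken from the odometry message.
void odomCallback(const nav_msgs::Odometry::ConstPtr& msg)
{
  ROS_INFO("Odometry pose: x=%.2f y=%.2f",
           msg->pose.pose.position.x, msg->pose.pose.position.y);
}

// Print a summary of the laser scan (number of beams and closest range).
void scanCallback(const sensor_msgs::LaserScan::ConstPtr& msg)
{
  float min_range = msg->range_max;
  for (float r : msg->ranges)
    if (r >= msg->range_min && r < min_range) min_range = r;
  ROS_INFO("Laser scan: %zu beams, closest obstacle at %.2f m",
           msg->ranges.size(), min_range);
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "acquire_data");
  ros::NodeHandle nh;
  ros::Subscriber odom_sub = nh.subscribe("odom", 10, odomCallback);
  ros::Subscriber scan_sub = nh.subscribe("scan", 10, scanCallback);
  ros::spin();  // process callbacks until the node is shut down
  return 0;
}

Assuming the executable is added to the package CMakeLists.txt under the same name, it can be started with rosrun homework acquire_data while the simulation and the teleoperation node run in separate terminals.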

3) Sensor Fusion
Given the virtual world and the created robot, create a node called fuse_data.cpp with the aim of fusing the odometry and the sensor data. The odometry must be corrupted with a noisy signal. Show the results on the terminal. The subscribers are:
- Odometry
- Sensor data
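
As an illustration only, the sketch below corrupts the odometry with additive Gaussian noise and fuses it with the central laser beam through a fixed-gain weighted average. The wall position WALL_X, the topic names, and the fusion rule itself are assumptions made for this example; a full solution could replace the weighted average with a proper filter (e.g. a Kalman filter).

// fuse_data.cpp - minimal sensor-fusion sketch (illustrative only).
// Assumptions: a wall in front of the robot at the known position WALL_X,
// topics odom and scan, and a fixed-gain weighted average as fusion rule.
#include <ros/ros.h>
#include <nav_msgs/Odometry.h>
#include <sensor_msgs/LaserScan.h>
#include <random>

static const double WALL_X = 5.0;  // hypothetical wall position used only in this sketch

class FuseData
{
public:
  FuseData() : gen_(std::random_device{}()), noise_(0.0, 0.05)
  {
    odom_sub_ = nh_.subscribe("odom", 10, &FuseData::odomCallback, this);
    scan_sub_ = nh_.subscribe("scan", 10, &FuseData::scanCallback, this);
  }

private:
  // Corrupt the odometry with additive Gaussian noise, then derive the
  // distance to the hypothetical wall from the noisy pose.
  void odomCallback(const nav_msgs::Odometry::ConstPtr& msg)
  {
    double noisy_x = msg->pose.pose.position.x + noise_(gen_);
    odom_dist_ = WALL_X - noisy_x;
    has_odom_ = true;
    fuse();
  }

  // Distance to the wall measured directly by the central laser beam.
  void scanCallback(const sensor_msgs::LaserScan::ConstPtr& msg)
  {
    if (msg->ranges.empty()) return;
    laser_dist_ = msg->ranges[msg->ranges.size() / 2];
    has_scan_ = true;
    fuse();
  }

  // Fixed-gain fusion: trust the laser more than the corrupted odometry.
  void fuse()
  {
    if (!has_odom_ || !has_scan_) return;
    const double k = 0.8;  // weight given to the laser measurement
    double fused = k * laser_dist_ + (1.0 - k) * odom_dist_;
    ROS_INFO("odom: %.2f m  laser: %.2f m  fused: %.2f m",
             odom_dist_, laser_dist_, fused);
  }

  ros::NodeHandle nh_;
  ros::Subscriber odom_sub_, scan_sub_;
  std::default_random_engine gen_;
  std::normal_distribution<double> noise_;
  double odom_dist_ = 0.0, laser_dist_ = 0.0;
  bool has_odom_ = false, has_scan_ = false;
};

int main(int argc, char** argv)
{
  ros::init(argc, argv, "fuse_data");
  FuseData node;
  ros::spin();
  return 0;
}

The fixed gain simply expresses that the laser is trusted more than the corrupted odometry; a recursive estimator would instead weight the two sources by their covariances.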

Other ideas
Create a node that publishes velocity commands so that the robot follows a specific trajectory without manual control ...
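
A minimal sketch of such a node, assuming the differential-drive plugin subscribes to geometry_msgs/Twist on cmd_vel and taking a constant-curvature (circular) path as the example trajectory:

// follow_trajectory.cpp - sketch of a node that drives the robot
// along a circular path by publishing constant velocity commands.
// The cmd_vel topic name and the chosen speeds are assumptions.
#include <ros/ros.h>
#include <geometry_msgs/Twist.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "follow_trajectory");
  ros::NodeHandle nh;
  ros::Publisher cmd_pub = nh.advertise<geometry_msgs::Twist>("cmd_vel", 10);

  ros::Rate rate(10);  // publish at 10 Hz
  while (ros::ok())
  {
    geometry_msgs::Twist cmd;
    cmd.linear.x  = 0.3;  // constant forward speed [m/s]
    cmd.angular.z = 0.2;  // constant turn rate [rad/s] -> circular path
    cmd_pub.publish(cmd);
    rate.sleep();
  }
  return 0;
}

Other trajectories can be obtained by making linear.x and angular.z functions of time instead of constants.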