11/12/2015 Team HARP (Abhishek, Alex, Feroze, Lekha, Rick), Carnegie Mellon University

Suction
- Acquired a shop vac for the final suction system.
- Iterated again on the prototype suction gripper design.
- Added the custom gripper to the simulation.
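As a sanity check when sizing a suction gripper, the required cup area follows from F = ΔP·A. A minimal sketch, using illustrative numbers (a 0.5 kg item and roughly 10 kPa of shop-vac pressure differential; neither figure is from the team's measurements):

```python
# Rough lifting-force check for a suction gripper.
# Numbers below are illustrative assumptions, not Team HARP's measurements.
def required_cup_area_m2(mass_kg, delta_p_pa, safety_factor=2.0):
    """Minimum effective cup area so that delta_p * A >= safety_factor * m * g."""
    g = 9.81  # m/s^2
    return safety_factor * mass_kg * g / delta_p_pa

# e.g. a 0.5 kg item with a ~10 kPa pressure differential
area = required_cup_area_m2(0.5, 10_000.0)
diameter_cm = 2.0 * (area / 3.141592653589793) ** 0.5 * 100.0
print(f"cup area: {area * 1e4:.2f} cm^2, diameter: {diameter_cm:.2f} cm")
```

With these assumed numbers the cup only needs a diameter of a few centimeters, which is why a shop vac is a plausible vacuum source for light shelf items.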

Suction
- Finalized the PCB design.
- Designed and ordered components for the electronics enclosure.

Perception
- Developed an algorithm to remove the shelf from the point cloud.
- Improved clustering by switching from color-based to Euclidean clustering.
- Estimated the object by comparing bounding-box dimensions against ground truth.
- Compiled all vision subsystems into one complete pipeline.
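The Euclidean clustering step amounts to connected-component grouping under a distance threshold. A naive O(n²) sketch of the idea (PCL's EuclideanClusterExtraction does this with a k-d tree; the tolerance value here is a hypothetical choice, not the pipeline's actual parameter):

```python
from collections import deque

def euclidean_clusters(points, tol, min_size=1):
    """Group 3-D points into clusters: two points share a cluster if a chain
    of neighbors closer than `tol` connects them (naive O(n^2) version of
    what PCL's EuclideanClusterExtraction computes)."""
    n = len(points)
    seen = [False] * n
    clusters = []

    def close(a, b):
        return sum((a[i] - b[i]) ** 2 for i in range(3)) <= tol * tol

    for i in range(n):
        if seen[i]:
            continue
        queue, cluster = deque([i]), []
        seen[i] = True
        while queue:  # breadth-first flood fill over nearby points
            j = queue.popleft()
            cluster.append(j)
            for k in range(n):
                if not seen[k] and close(points[j], points[k]):
                    seen[k] = True
                    queue.append(k)
        if len(cluster) >= min_size:
            clusters.append(cluster)
    return clusters
```

Unlike color-based clustering, this splits objects purely by spatial gaps, which is why it behaves better once the shelf points have been removed from the cloud.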

Perception
- Ran a 1000-image test on point clouds from the Rutgers database.
- Accurately identified the grasp position to within 3 cm on over 56% of attempts.
- Investigating the results to identify ways to improve the algorithm (70% is required by the FVE).
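The 3 cm metric above is a simple hit rate over attempts. A minimal sketch of such an evaluation (the function name and data layout are assumptions, not the team's actual test harness):

```python
def grasp_success_rate(predicted, ground_truth, tol_m=0.03):
    """Fraction of attempts whose predicted grasp point lies within `tol_m`
    (Euclidean distance) of the ground-truth grasp point."""
    hits = 0
    for p, g in zip(predicted, ground_truth):
        d2 = sum((pi - gi) ** 2 for pi, gi in zip(p, g))
        if d2 <= tol_m ** 2:  # compare squared distances, avoids sqrt
            hits += 1
    return hits / len(predicted)
```

On a 1000-image run, a return value of 0.56 would correspond to the reported "over 56% of attempts" against the 70% FVE target.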

Platform
- Moved the actual PR2 arm using the inverse-kinematics action.
- The inverse-kinematics arm motion from the simulator was tested on the actual robot.
- The program was loaded onto the robot's computer and executed.
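For intuition about what an inverse-kinematics solve produces, here is the closed-form solution for a toy planar 2-link arm. This is purely illustrative: the PR2's 7-DOF arm is solved numerically through the ROS inverse-kinematics action, not by this model.

```python
import math

def two_link_ik(x, y, l1, l2):
    """Closed-form IK for a planar 2-link arm (elbow-down solution).
    Toy model only; the PR2 arm has 7 DOF and uses a full IK solver."""
    d2 = x * x + y * y
    c2 = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)  # law of cosines
    if abs(c2) > 1.0:
        raise ValueError("target out of reach")
    theta2 = math.acos(c2)
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2

def fk(t1, t2, l1, l2):
    """Forward kinematics, used to verify an IK solution."""
    return (l1 * math.cos(t1) + l2 * math.cos(t1 + t2),
            l1 * math.sin(t1) + l2 * math.sin(t1 + t2))
```

Round-tripping a target pose through `two_link_ik` and `fk` is the same consistency check one does when validating simulator IK motions before running them on the real robot.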

Platform
- Used the SMACH state controller to call the MoveIt planner.
- Defined the target pose using the movable interactive marker inside RViz.
- Moved the arm to the target pose successfully.
- Eventually, the perception algorithm will return the target pose to SMACH, which will then be used to plan and execute the arm motion.
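The control flow described above (perception hands a target pose to the planner, which hands a trajectory to execution) can be sketched as a tiny state machine with stubbed states. The real system wires rospy/SMACH states to MoveIt; every callable and outcome name here is an illustrative stand-in.

```python
# Stand-in for the SMACH flow: PERCEIVE -> PLAN -> EXECUTE, with userdata
# passed between states. All callables below are stubs, not ROS code.
def run_state_machine(perceive, plan, execute):
    userdata = {}
    state = "PERCEIVE"
    while state not in ("DONE", "FAILED"):
        if state == "PERCEIVE":
            userdata["target_pose"] = perceive()
            state = "PLAN" if userdata["target_pose"] is not None else "FAILED"
        elif state == "PLAN":
            userdata["trajectory"] = plan(userdata["target_pose"])
            state = "EXECUTE" if userdata["trajectory"] else "FAILED"
        elif state == "EXECUTE":
            state = "DONE" if execute(userdata["trajectory"]) else "FAILED"
    return state

outcome = run_state_machine(
    perceive=lambda: (0.6, 0.1, 0.8),      # stub: pose from the vision pipeline
    plan=lambda pose: ["q0", "q1", "q2"],  # stub: MoveIt trajectory
    execute=lambda traj: True,             # stub: arm execution result
)
print(outcome)  # DONE
```

Swapping the `perceive` stub for the real perception output is exactly the integration step the slide anticipates.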

Goals for next week:

Suction
- Finish the final PCB and enclosure.
- Create a test circuit with the components and build the final hardware.

Perception
- Procure a computer for the Kinect2.
- Run the algorithms directly on real-time data grabbed from the Kinect2.

Platform
- Use moveit_pr2 to avoid collisions.
- Demonstrate a simulation that can move the arm to an item, move the base, and move the arm to the order bin.