June 2nd 2010. Line Tracking & Mission to Mars. Embedded Motion Control, Group 1: Rene Thijssen, Luke Lathouwers, Maarten van Stuijvenberg, Roel ten Have, Bastian Eenink.



Line Tracking – Hardware Design

Overview of functionality (I/O per RCX):
- Two light sensors are used to track the line
- Both light sensors are positioned above the line
- Encoders for steering and distance measurement

[Slide diagram: port layouts of RCX1 and RCX2 (inputs 1–3, outputs A–C) showing the left and right light sensors, the steering and distance encoders, the IR link, and the steering and distance motors]

Line Tracking – Software

Calibration
Measure the light intensity of the surface and of the line. The threshold is the average of the two measurements.

Performance
If one light sensor goes off the line, the Mars Rover steers back in the opposite direction. When both light sensors are back on the line, the Mars Rover steers straight ahead. This minimizes the number of corrections.

Steering angle
Controlled by a P-controller, so the rover can steer at different angles.

Distance measurement
The motor encoder counts the number of encoder increments. The measured increments are converted to millimeters once the end of the line is reached.
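The calibration, two-sensor steering and distance steps above can be sketched as follows. This is an illustrative sketch only, not the group's actual RCX program: the sensor values, the P gain and the millimeters-per-increment factor are made-up parameters.

```python
def calibrate(surface_value, line_value):
    """Threshold is the average of the surface and line intensity readings."""
    return (surface_value + line_value) / 2

def steering_correction(left_raw, right_raw, threshold, gain=0.5):
    """Return a steering command from the two light sensors.

    If both sensors still see the (dark) line, drive straight, which
    minimizes the number of corrections; otherwise a P-controller steers
    back in proportion to the signed deviation. The sign convention and
    gain are illustrative.
    """
    left_on_line = left_raw < threshold
    right_on_line = right_raw < threshold
    if left_on_line and right_on_line:
        return 0.0                     # both sensors on the line: straight ahead
    error = left_raw - right_raw       # signed deviation from the line
    return gain * error                # P-controller sets the steering angle

def increments_to_mm(increments, mm_per_increment=0.5):
    """Convert counted encoder increments to millimeters at the end of the run."""
    return increments * mm_per_increment
```

A run with a bright surface (700) and dark line (300) gives a threshold of 500; equal sensor readings below it produce no correction, while an imbalance produces a proportional steering command.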

Line Tracking – Software

Mission to Mars – Strategy

Scan the area for lakes
- Raw data from the light sensors is used to achieve higher accuracy.
- The light sensors are calibrated and measured every 250 ms.
- A driving pattern is designed to scan the area of Mars effectively.
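A minimal sketch of such a scanning strategy, assuming a lawnmower-style driving pattern over a rectangular area and a fixed 250 ms calibration period. The slide does not specify the actual pattern, so the waypoint scheme and all dimensions here are hypothetical.

```python
def scan_waypoints(width_mm, height_mm, lane_mm):
    """Generate (x, y) waypoints that sweep the area lane by lane.

    The rover drives the full height of the area, shifts sideways by one
    lane width, then drives back, so every lane is covered exactly once.
    """
    waypoints = []
    x, going_up = 0, True
    while x <= width_mm:
        if going_up:
            waypoints += [(x, 0), (x, height_mm)]
        else:
            waypoints += [(x, height_mm), (x, 0)]
        x += lane_mm
        going_up = not going_up
    return waypoints

def due_for_calibration(now_ms, last_ms, period_ms=250):
    """True when the 250 ms sensor calibration/measurement interval has elapsed."""
    return now_ms - last_ms >= period_ms
```

For a 100 mm wide, 200 mm high area with 50 mm lanes this yields the zigzag (0,0), (0,200), (50,200), (50,0), (100,0), (100,200).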

Mission to Mars – Strategy

Overall progress
- Edge detection and the driving pattern are working
- The earth computer can read coordinates from the camera
- Approaching lakes using the received coordinates (work in progress)
- Motion control for driving, steering and temperature

To do:
- Finish approaching lakes using the received coordinates
- Link the subprograms
- Testing and debugging
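Approaching a lake from a received camera coordinate is listed as work in progress, so the following is only a speculative sketch of one way it could be done: compute the bearing from the rover to the lake coordinate and the smallest turn needed to face it. All function names and conventions are hypothetical.

```python
import math

def heading_to_target(x, y, tx, ty):
    """Bearing in degrees from the rover position (x, y) to the lake (tx, ty)."""
    return math.degrees(math.atan2(ty - y, tx - x))

def turn_needed(current_heading_deg, target_heading_deg):
    """Smallest signed turn (degrees) to face the target, normalized to [-180, 180)."""
    return (target_heading_deg - current_heading_deg + 180) % 360 - 180
```

The rover would then feed the turn into its steering P-controller and drive until the remaining distance (from the camera coordinates) reaches zero.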