Viability of vanishing point navigation for mobile devices


Michael Waldron
Mentored by Adam Schofield and John Miranda

Introduction

Currently, the cost of autonomous navigation makes it inaccessible and unjustifiable. If made more accessible, autonomous navigation could range from a great new toy to a tool assisting relief personnel without risking human life. However, replacing multi-thousand-dollar equipment with readily available devices is a daunting task. The goal of this project was to perform autonomous navigation with inexpensive, off-the-shelf hardware by implementing OpenCV's image processing library on Android™ devices.

Results (cont.)

Following testing outside, AHS Control (Graph 5) was selected for its resemblance to APG Control; the chief difference was that it was much wider, further testing the robot's ability to correct its path completely. This environment also had no range of partial functionality. The final environment tested was inside the planetarium on the AHS campus (Graph 6). The hallway itself is standard, except for its extremely low lighting. During testing, the most common reasons for failure were the robot stalling from the inability to find further vanishing points, or being unable to correct its path in time and running into a wall.

Graph 1: APG Control. Graph 2: AHS Curved Hall. (Axes: angle offset in degrees.)

Materials & Methods

For testing, a Lego® NXT was paired with both a Samsung® Galaxy S5 and a Samsung® Galaxy Note 2. A theodolite was used to accurately calculate all the angle offsets tested, and Android™ Studio was used for application development. The Galaxy phones were paired via Bluetooth with the NXT robot in order to send it heading commands to navigate accurately down a clearly defined path.
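As a rough illustration of the heading commands above, a proportional-steering rule could map the vanishing point's horizontal pixel offset to differential wheel powers. This is a hypothetical sketch only, and in Python rather than the project's Android™ Java; the function name, base power, and gain are all assumptions, and the actual Bluetooth messaging came from the "NXT Remote Control" protocol.

```python
def motor_powers(offset_px, image_width, base_power=60, gain=0.5):
    """Proportional steering: a positive offset (vanishing point right of
    centre) speeds up the left wheel and slows the right wheel, turning
    the robot right. Powers are illustrative NXT motor power levels."""
    # Normalise the offset to [-1, 1] relative to half the image width,
    # then scale by the steering gain
    turn = max(-1.0, min(1.0, 2.0 * offset_px / image_width)) * gain
    left = base_power * (1.0 + turn)
    right = base_power * (1.0 - turn)
    return round(left), round(right)
```

With a centred vanishing point the wheels receive equal power; as the offset grows the power split widens until the gain-limited maximum is reached.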
At each location the robot was given a starting angle, precisely measured using a theodolite. For each trial, success was determined by whether or not the robot could travel a predetermined distance without failure. The tested angles were incremented by 5 degrees, starting at 0, and the range until failure was measured in both directions.

The application works by capturing a plain image (Picture 1) and running a Hough Line Transform (OpenCV dev team, 2015) on it, which finds and overlays all natural lines in the image (Picture 2). After the lines have been recorded, the average intersection of the lines is computed and recorded as the vanishing point. Finally, the difference between the middle of the image and the vanishing point is calculated, and the robot is sent a command to correct its path according to that difference.

Conclusions

The project was successful in that autonomous navigation was achieved using only an Android™ smartphone and a Lego® NXT kit. With this method, the cost of autonomous navigation was reduced from tens of thousands of dollars to under one thousand dollars. However, in its current state, this form of autonomous navigation can only satisfy the "great new toy" category, as it is too slow and not accurate enough for any practical use, military or otherwise. Even so, the results act as a proof of concept that, with a faster camera and processor, vanishing point recognition could serve as a versatile form of autonomous navigation, still at a fraction of the current cost.

Graph 3: AHS Cluttered. Graph 4: AHS Outside. Graph 5: AHS Control. Graph 6: AHS Planetarium. (Axes: angle offset in degrees.)

Graphs 1-6: Full functionality meant the robot was able to detect an initial vanishing point and complete the required distance without failure. Partial functionality meant the robot was able to detect an initial vanishing point but was unable to complete the required distance.
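The capture, line detection, average-intersection, and offset steps described in the methods can be sketched as follows. This is an illustrative Python/NumPy sketch under stated assumptions, not the project's Android™ code: the line segments would in practice come from OpenCV's Hough Line Transform (e.g. HoughLinesP endpoints), and the function names are hypothetical.

```python
import numpy as np

def intersect(seg1, seg2):
    """Intersection of the two infinite lines through the given segment
    endpoints, or None if the lines are (near-)parallel."""
    (x1, y1), (x2, y2) = seg1
    (x3, y3), (x4, y4) = seg2
    d = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if abs(d) < 1e-9:
        return None
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / d
    return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))

def estimate_vanishing_point(segments):
    """Average of all pairwise line intersections, as in the method above.
    `segments` stands in for the endpoints a Hough transform would return."""
    pts = []
    for i in range(len(segments)):
        for j in range(i + 1, len(segments)):
            p = intersect(segments[i], segments[j])
            if p is not None:
                pts.append(p)
    if not pts:
        return None  # the robot stalls when no vanishing point is found
    return tuple(np.mean(pts, axis=0))

def heading_correction(vp, image_width):
    """Signed horizontal offset of the vanishing point from the image
    centre; its sign indicates which way the robot should turn."""
    return vp[0] - image_width / 2.0
```

Averaging the pairwise intersections is a simple, noise-tolerant estimate: spurious lines pull the average slightly, but the dominant corridor lines converge near the true vanishing point.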
Results

The algorithm was tested in a variety of environments. APG Control (Graph 1) was the first testing environment, chosen because it was unobstructed and contained well-defined linear features. The next environment was the top floor of AHS (Graph 2), chosen to test the robot's proficiency in a curved hallway, where the robot cannot "see" the end, only the path. After the curved hallway, another AHS hallway was chosen to test the robot's proficiency in a cluttered environment (Graph 3). Normally this hallway would be similar to APG Control, but it was filled with various objects and supplies that could obstruct the algorithm's ability to detect lines; notably, in this environment, once full functionality peaked the algorithm stopped functioning altogether. Testing was then conducted on the concrete path connecting AHS to the CEO building to test the algorithm's ability to find vanishing points outside (Graph 4).

Picture 1: APG Control hallway. Picture 2: Natural lines found in APG Control.

Acknowledgements

Special thanks to Erik Beltran, a contractor at CERDEC, who assisted with Android™ Studio development. Recognition must also be given to Jacek Fedorynski, the author of the open source "NXT Remote Control" application, which was used in the project for its NXT Bluetooth communication protocols.

References

OpenCV dev team (2015, February 25). Hough Line Transform. Retrieved from http://docs.opencv.org/doc/tutorials/imgproc/imgtrans/hough_lines/hough_lines.html