Viability of vanishing point navigation for mobile devices
Michael Waldron, mentored by Adam Schofield and John Miranda

Introduction
Currently, the cost of autonomous navigation makes it inaccessible and hard to justify. If made more accessible, autonomous navigation could range from being a great new toy to assisting relief personnel without risking human life. However, replacing multi-thousand-dollar equipment with readily available devices is a daunting task. The goal of this project was to perform autonomous navigation with inexpensive, off-the-shelf hardware by implementing OpenCV's image processing library on Android™ devices.

Materials & Methods
For testing, a Lego® NXT was paired with both a Samsung® Galaxy S5 and a Samsung® Galaxy Note 2, and Android™ Studio was used for application development. A theodolite was used to accurately measure all of the angle offsets tested. The Galaxy phones were paired with the NXT robot via Bluetooth so that heading commands could be sent to the NXT to navigate accurately down a clearly defined path. At each location the robot was given a starting angle, precisely measured with the theodolite. For each trial, success was determined by whether or not the robot could cover a predetermined distance without failure. The angles tested were incremented by 5 degrees, starting at 0, and the range until failure was tested in both directions.

The application works by capturing a plain image (Picture 1) and running a Hough Line Transform (OpenCV dev team, 2015) on it, which finds and overlays the natural lines in the image (Picture 2). After the lines have been recorded, the average intersection of the lines is computed and recorded as the vanishing point. Finally, the difference between the middle of the image and the vanishing point is calculated, and the robot is sent a command to correct its path according to that difference; a sketch of both steps follows below.
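To make the vision step concrete, the following is a minimal Java sketch using OpenCV's Android bindings. It is a reconstruction, not the project's actual source: the class and method names (VanishingPoint, estimate, pixelOffset), the Canny and Hough parameter values, and the use of the probabilistic HoughLinesP variant (chosen here because it returns segment endpoints directly) are all assumptions.

```java
import org.opencv.core.Mat;
import org.opencv.core.Point;
import org.opencv.imgproc.Imgproc;

// Minimal sketch of the vanishing-point step, assuming OpenCV's Java bindings
// as shipped with OpenCV for Android. All thresholds and parameter values are
// illustrative, not the values used in the project.
public final class VanishingPoint {

    /**
     * Estimates a vanishing point as the average intersection of the
     * (extended) line segments returned by a probabilistic Hough transform.
     * Returns null if no usable intersection is found.
     */
    public static Point estimate(Mat rgbaFrame) {
        Mat gray = new Mat();
        Mat edges = new Mat();
        Mat lines = new Mat();

        Imgproc.cvtColor(rgbaFrame, gray, Imgproc.COLOR_RGBA2GRAY);
        Imgproc.Canny(gray, edges, 50, 150);                   // edge map for Hough
        Imgproc.HoughLinesP(edges, lines, 1, Math.PI / 180,    // rho, theta resolution
                50, 30, 10);                                   // threshold, min length, max gap

        double sumX = 0, sumY = 0;
        int count = 0;

        // Each Hough result is a segment (x1, y1, x2, y2). Intersect every pair
        // of segments as infinite lines and average the intersection points.
        // (Newer OpenCV builds store one segment per row; older OpenCV4Android
        // builds store them along a single row instead.)
        int n = lines.rows();
        for (int i = 0; i < n; i++) {
            double[] a = lines.get(i, 0);
            for (int j = i + 1; j < n; j++) {
                double[] b = lines.get(j, 0);
                double denom = (a[0] - a[2]) * (b[1] - b[3])
                             - (a[1] - a[3]) * (b[0] - b[2]);
                if (Math.abs(denom) < 1e-6) continue;          // near-parallel pair
                double c1 = a[0] * a[3] - a[1] * a[2];
                double c2 = b[0] * b[3] - b[1] * b[2];
                sumX += (c1 * (b[0] - b[2]) - (a[0] - a[2]) * c2) / denom;
                sumY += (c1 * (b[1] - b[3]) - (a[1] - a[3]) * c2) / denom;
                count++;
            }
        }
        return count == 0 ? null : new Point(sumX / count, sumY / count);
    }

    /** Horizontal offset (pixels) of the vanishing point from the image centre. */
    public static double pixelOffset(Point vp, int frameWidth) {
        return vp.x - frameWidth / 2.0;
    }
}
```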
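The pixel offset then has to reach the NXT as a motor command. The sketch below shows one plausible way to do this, assuming a differential-drive correction and the published LEGO® NXT SETOUTPUTSTATE direct command written to an already-open Bluetooth output stream (the project itself reused the Bluetooth protocols of the "NXT Remote Control" application). The gain constant, power limits, and motor port assignments are illustrative assumptions.

```java
import java.io.IOException;
import java.io.OutputStream;

// Hedged sketch of the correction step: the horizontal offset of the vanishing
// point is mapped to differential motor power and written to the NXT over an
// already-open Bluetooth socket. The packet layout follows LEGO's published
// NXT direct-command specification; gain, ports, and limits are assumptions.
public final class NxtSteering {

    private static final double GAIN = 0.3;   // illustrative pixels-to-power gain

    /** Sends a differential-drive correction derived from the pixel offset. */
    public static void correct(OutputStream nxt, double pixelOffset) throws IOException {
        int turn = (int) Math.max(-40, Math.min(40, GAIN * pixelOffset));
        setMotor(nxt, 1, 50 - turn);   // port B (left motor, assumed)
        setMotor(nxt, 2, 50 + turn);   // port C (right motor, assumed)
    }

    /** Writes one SETOUTPUTSTATE direct command (no reply requested). */
    private static void setMotor(OutputStream out, int port, int power) throws IOException {
        byte[] cmd = {
            0x0C, 0x00,              // Bluetooth length header: 12 payload bytes
            (byte) 0x80, 0x04,       // direct command (no response), SETOUTPUTSTATE
            (byte) port,             // output port (A=0, B=1, C=2)
            (byte) power,            // power set point, -100..100
            0x01 | 0x04,             // mode: MOTORON | REGULATED
            0x01,                    // regulation mode: motor speed
            0x00,                    // turn ratio
            0x20,                    // run state: RUNNING
            0x00, 0x00, 0x00, 0x00   // tacho limit 0 = run forever
        };
        out.write(cmd);
        out.flush();
    }
}
```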
Results
The algorithm was tested in a variety of environments. APG Control (Graph 1) was the first testing environment and was chosen because it was unobstructed and contained well-defined linear features. The next environment tested was the top floor of AHS (Graph 2), chosen to test the robot's proficiency in a curved hallway, where the robot cannot "see" the end, only the path. After the curved hallway, another AHS hallway was chosen to test the robot's proficiency in a cluttered environment (Graph 3). Normally this hallway would be similar to APG Control, but it was filled with various objects and supplies that could obstruct the algorithm's ability to detect lines; it is also noteworthy that in this environment, once full functionality peaked, the algorithm stopped functioning altogether. Testing was then conducted on the concrete path connecting AHS to the CEO building to test the algorithm's ability to find vanishing points outside (Graph 4).

Following the outdoor testing, AHS Control (Graph 5) was selected for its resemblance to APG Control, the chief difference being that it was much wider, further testing the robot's ability to correct its path completely. This environment also had no range of partial functionality. The final environment tested was the planetarium hallway on the AHS campus (Graph 6). The hallway itself is standard except for its extremely low lighting. During testing, the most common reasons for failure were the robot stalling when it could not find further vanishing points, or being unable to correct its path in time and running into a wall.

[Graphs 1-6: proficiency of the algorithm in varying environments, plotted as full and partial functionality against angle offset (°). Graph 1: APG Control; Graph 2: AHS Curved Hall; Graph 3: AHS Cluttered; Graph 4: AHS Outside; Graph 5: AHS Control; Graph 6: AHS Planetarium.] Full functionality means the robot was able to detect an initial vanishing point and complete the required distance without failure. Partial functionality means the robot was able to detect an initial vanishing point but was unable to complete the required distance.

Conclusions
The project was successful in that autonomous navigation was achieved using only an Android™ smartphone and a Lego® NXT kit. With this method, the cost of autonomous navigation was reduced from tens of thousands of dollars to under one thousand dollars. However, in its current state this form of autonomous navigation can only satisfy the "great new toy" category, as it is too slow and not accurate enough for any practical use, military or otherwise. Even so, the results act as a proof of concept that, with a faster camera and processor, vanishing point recognition could serve as a versatile form of autonomous navigation at a fraction of the current cost.

Acknowledgements
Special thanks to Erik Beltran, a contractor at CERDEC, who assisted with Android™ Studio development. Recognition must also be given to Jacek Fedorynski, author of the open-source "NXT Remote Control" application, which was used in the project for its NXT Bluetooth communication protocols.

[Picture 1: APG Control hallway. Picture 2: natural lines found in APG Control.]

References
OpenCV dev team (2015, February 25). Hough Line Transform. Retrieved from

