Robotic Guidance.

1 Robotic Guidance

2 Project Description
Teach a robot to guide a person to a predefined destination. General requirements:
1. Check out the Carleton College computer science senior comps for more information and ideas.
2. The robot must use a camera and one or more vision algorithms as the main guidance.
3. Sensors such as bump, infrared, and ultrasound could provide direction and safety for the robot.
4. Consider using the SURF algorithm to detect known environments.
5. The solution should be easily modified for alternative routes.
6. Check out Alexander Popov’s 2011 senior project for techniques for driving the robot.
My project assignment is to “teach a robot to guide a person to a predefined destination.” The motivation for this project comes from assisted living facilities: with fewer staff members and larger facilities, these places are turning to robots to guide residents from one area to another for appointments and similar trips. Of the general requirements above, I am mainly focusing on 2 and 3 (using a Kinect sensor, as described later on) and 5 (using a Turtlebot, which can create and save maps for different locations, also described later on).

3 My Project
Robot Operating System (ROS)
Turtlebot
Kinect Sensor: 3D depth sensor, RGB camera, microphones (not used)
Laptop with WiFi
Gyro Sensor
Create Sensors: cliff sensor, bump sensor
After talking to Dr. Pankratz and Sasha, we decided to use ROS (Robot Operating System) and a Turtlebot, which pairs a Kinect sensor with an onboard laptop that communicates with a workstation laptop. ROS provides stacks for things like driving the robot, using the Kinect sensor, reading the robot’s sensors, and using the gyroscope; other modules can subscribe to topics published by these different sensors so everything works together. The Kinect has a 3D depth sensor (an infrared projector paired with a monochrome CMOS, or complementary metal-oxide semiconductor, sensor that lets it “see” a room in 3D) and an RGB camera; both run at 30 frames per second. The Turtlebot also has a gyroscope that measures orientation, which aids the Turtlebot’s odometry. In addition, the Create base itself has cliff and bump sensors. I am using the Turtlebot to create a map using SLAM so that I can have the robot go to a predefined spot on the map and essentially “guide” a person there. The Turtlebot also comes with other apps, such as a follower app, where it follows an object in front of it, and an Android app for controlling the robot from a mobile device.
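The publish/subscribe pattern described above can be illustrated with a minimal, ROS-free Python sketch. This is only an analogy: real ROS nodes would use rospy.Publisher and rospy.Subscriber, and the topic name here is hypothetical.

```python
# Minimal sketch of the publish/subscribe pattern ROS is built on.
# Plain Python, not real ROS; topic names are made up for illustration.

class TopicBus:
    def __init__(self):
        self.subscribers = {}  # topic name -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, message):
        # Deliver the message to every callback registered on this topic.
        for callback in self.subscribers.get(topic, []):
            callback(message)

bus = TopicBus()
commands = []

# A "driver" module subscribes to bump-sensor messages...
bus.subscribe("/bump", lambda pressed: commands.append("stop" if pressed else "go"))

# ...and the robot base publishes sensor readings to that topic.
bus.publish("/bump", False)
bus.publish("/bump", True)
print(commands)  # → ['go', 'stop']
```

The point is decoupling: the module reading the bump sensor never needs to know which other modules consume its readings, which is why the ROS stacks can be mixed and matched.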

4 My Plan
Using the Turtlebot and SLAM (simultaneous localization and mapping)
Builds a map of the environment while keeping track of the current location (driven by keyboard)
Can then use this map to drive around the environment
I am using a Turtlebot and SLAM, as mentioned earlier, running on the Robot Operating System on Linux. Here is an introduction to the Turtlebot. One application in the Turtlebot stacks lets me create a virtual map by driving the robot around and then save the map, so I can have the Turtlebot navigate to anywhere on the map with a click of the mouse. SLAM builds a map of the environment while keeping track of the current location, with the keyboard or a joystick used to navigate around an area. It uses the built-in sensor to detect blobs and objects, as well as walls, to get a general map of an area. When navigating this area, the robot uses the map in addition to its sensors, so it can adapt to changing settings (for example, when a person or object is in the way).
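The map that SLAM builds is an occupancy grid, and its core update can be sketched in a few lines of plain Python. This is a toy version: a real SLAM system (such as ROS gmapping) also estimates the robot's pose from odometry and scan matching, whereas here the pose is assumed known, and the ray walk is simplified to the axis-aligned case for brevity.

```python
# Toy occupancy-grid update: the data structure behind SLAM mapping.
# Cells start UNKNOWN; a sensor ray marks cells it passes through FREE
# and the cell where it hits an obstacle OCCUPIED.

UNKNOWN, FREE, OCCUPIED = -1, 0, 1

def update_grid(grid, robot_x, robot_y, hit_x, hit_y):
    """Mark cells between the robot and an obstacle free, the obstacle
    occupied. Simplified to horizontal rays; a real mapper walks any
    ray with Bresenham's line algorithm."""
    if robot_y == hit_y:
        step = 1 if hit_x > robot_x else -1
        for x in range(robot_x, hit_x, step):
            grid[robot_y][x] = FREE
    grid[hit_y][hit_x] = OCCUPIED

grid = [[UNKNOWN] * 5 for _ in range(3)]
# Robot at (0, 1) sees an obstacle at (3, 1): cells in between are free.
update_grid(grid, 0, 1, 3, 1)
print(grid[1])  # → [0, 0, 0, 1, -1]
```

Repeating this update for every depth reading as the robot drives around is what gradually fills in the walls and blobs visible in the map interface.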

5 This is an example of the interface for creating a map and what the Turtlebot “sees.” You can see the blobs representing objects around the Turtlebot, and there are a variety of other views available in this user interface. I was able to do this successfully without running into any objects (for the most part). It was pretty interesting.

6 What I Have Done
Built the Turtlebot
Installed all components
Many problems
I built the Turtlebot from the kit Dr. Pankratz ordered and installed everything on two Linux machines. I have encountered numerous problems along the way, including bugs in a software release, permission issues, the SNC network, driver issues, sensor issues, latency, and calibration issues. Some of these were especially hard to diagnose because I didn’t know whether the fault was in the robot’s hardware or in the software. In addition, since the Turtlebot stacks are all open source, an update to the software once introduced many bugs that made the Turtlebot unusable, so I spent a couple of weeks with no idea what was going wrong; it ended up being fixed in a later software release. Currently I can create maps but cannot save them, and I cannot use the keyboard to teleoperate the robot while creating maps (a network or latency issue?). I plan on working on these things next. I have also found a Turtlebot project that used voice activation to drive the robot, so that may be something I look into.
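For the map-saving problem, the standard ROS route is the map_server package, whose map_saver tool writes the map currently being published to disk. The map name below is just an example; this assumes a SLAM node is running and publishing on the /map topic.

```shell
# Save the map currently being published on /map to mymap.pgm + mymap.yaml.
rosrun map_server map_saver -f mymap

# Later, serve the saved map so the navigation stack can use it:
rosrun map_server map_server mymap.yaml
```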

7 Questions/Suggestions
Network issues? Weight balance issues? What to add to make this better?
I cannot use the SNC network because it does not work for the Turtlebot’s connection between the two PCs, so I am currently using an ad hoc network. That network is much slower and could be causing the latency problems; it also means I can’t teleoperate using an Android device, because it can’t connect to an ad hoc network. Any suggestions? The Turtlebot is also very unbalanced and back-heavy, so I need to find a way to make sure it never loses its balance (and I need to make sure the laptop is secured). Anything else I can add to make this better?
