1
What I Did This Summer JEFFERSON RIDGEWAY 07 August 2017
Recent graduate of Elizabeth City State University (ECSU)
2
Background Astrobee is able to localize itself based on:
Visual features from mapping
AR tags
Handrail measurements from a depth sensor
Optical flow features
Objective: Create a regression testing script to help understand data based on different features from Astrobee, and visualize the summary of combined features in a human-readable output (i.e. a .pdf)
What is Astrobee? An autonomous robot that will eventually replace the current SPHERES robots on the ISS by addressing their limitations (SPHERES operate on gas thrusters and non-rechargeable batteries, and localize themselves using ultrasonic beacons)
3
Background: Localization for Astrobee
Three stages: map building, visual observations, and position tracking
Map building: visual features (i.e. SURF, BRISK) are detected and matched across a collection of images, then their 3D positions are jointly optimized using structure from motion. SURF and BRISK are feature descriptors: SURF is robust but slower (used in the map-building process), while BRISK is faster but less robust (used on the robot)
Visual observations: the robot detects features in an image, matches them to the pre-built map, then filters the matches with a geometric consistency check
Position tracking: an Extended Kalman Filter (EKF) keeps track of the state at the time each image was taken, to account for the lengthy delay in processing the image
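The deck describes the detect-and-match step in prose only; below is a minimal OpenCV sketch of BRISK detection and matching, with a homography-based RANSAC filter standing in for the geometric consistency check. The image names, thresholds, and the homography choice are illustrative assumptions, not the actual Astrobee localization code.

```python
# Minimal sketch of BRISK detection, matching, and a geometric consistency
# check between two images. Image names and thresholds are illustrative
# assumptions, not the actual Astrobee localization code.
import cv2
import numpy as np

img_a = cv2.imread("frame_a.png", cv2.IMREAD_GRAYSCALE)  # hypothetical frames
img_b = cv2.imread("frame_b.png", cv2.IMREAD_GRAYSCALE)

brisk = cv2.BRISK_create()
kp_a, desc_a = brisk.detectAndCompute(img_a, None)
kp_b, desc_b = brisk.detectAndCompute(img_b, None)

# Hamming distance suits BRISK's binary descriptors.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = matcher.match(desc_a, desc_b)

# Geometric consistency: keep only matches consistent with a single homography.
pts_a = np.float32([kp_a[m.queryIdx].pt for m in matches])
pts_b = np.float32([kp_b[m.trainIdx].pt for m in matches])
_, inlier_mask = cv2.findHomography(pts_a, pts_b, cv2.RANSAC, 5.0)
print(f"{int(inlier_mask.sum())} geometrically consistent matches")
```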
4
Current Control Flow Structure
What is a .bag file? Recorded data from an experiment that was run (including animations)
The position graph is one of 12 statistics displayed, which include: orientation and EKF features, VL feature Mahalanobis distances, velocity and acceleration, angular velocity, bias terms, a Mahalanobis distance histogram, visual landmarks density, and optical flow density
Step 1: Record a specific .bag file from a map run with Astrobee
Step 2: Run ekf_graph.py on the .bag file and .map file (sketched after this list)
Step 3: Generate a .pdf file with the statistics
Step 4: Manually review the generated .pdf file
Step 5: Repeat Steps 1-4 for each .bag file
Steps 1-3 are completely fine for one .bag file, but what happens when you have n .bag files and their .map files spread across different subdirectories? ekf_graph.py has to be run every time
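The manual workflow amounts to something like the following sketch for each bag; the ekf_graph.py command-line arguments and file names are assumptions for illustration, not its documented interface.

```python
# Sketch of the manual workflow for a single .bag/.map pair (Steps 2-3).
# The ekf_graph.py arguments and file names are assumed for illustration.
import subprocess

bag = "bag_one.bag"
map_file = "iss.map"  # hypothetical map file name

# Step 2: run ekf_graph.py on the .bag and .map file.
subprocess.run(["python", "ekf_graph.py", bag, map_file], check=True)

# Step 3: a .pdf of statistics is generated; Step 4 is a manual review.
# Steps 1-4 then have to be repeated by hand for every additional .bag file.
```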
5
Control Flow Structure with Regression Testing Script
Each .bag/.map pair is processed by ekf_graph.py into a *.json file; Regression_Testing.py then takes the .json files as input and produces a single Output.pdf:
Bag_one.bag and .map file -> ekf_graph.py -> *.json file
Bag_two.bag and .map file -> ekf_graph.py -> *.json file
. . .
n.bag and .map file -> ekf_graph.py -> *.json file
All *.json files -> Regression_Testing.py -> Output.pdf
6
Role of Regression Testing System
Automate the manual running of ekf_graph.py on every .bag file
Check the hash of the config file and, if it has been modified, re-run ekf_graph.py on the subdirectory of .bag files
Generate and visualize a summary of the statistics in a human-readable format (i.e. a .pdf), as sketched below
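A condensed sketch of that behavior follows: hash the config file, re-run ekf_graph.py over a directory tree of .bag files when the hash changes, and collect the per-bag .json summaries. The config file name, directory layout, and ekf_graph.py interface are assumptions, and the PDF-rendering step is left as a comment.

```python
# Sketch of the regression testing flow: re-run ekf_graph.py on every .bag
# file when the config file's hash changes, then collect the .json summaries.
# Paths, file names, and the ekf_graph.py interface are illustrative assumptions.
import hashlib
import json
import pathlib
import subprocess

CONFIG = pathlib.Path("gnc.config")        # hypothetical config file
HASH_CACHE = pathlib.Path(".config_hash")  # stores the last-seen hash
MAP_FILE = "iss.map"                       # hypothetical map file

def config_changed() -> bool:
    """Return True if the config file's SHA-256 differs from the cached value."""
    digest = hashlib.sha256(CONFIG.read_bytes()).hexdigest()
    changed = not HASH_CACHE.exists() or HASH_CACHE.read_text() != digest
    HASH_CACHE.write_text(digest)
    return changed

def run_all_bags(root: pathlib.Path) -> list:
    """Run ekf_graph.py on every .bag under root and load its .json summary."""
    summaries = []
    for bag in root.rglob("*.bag"):
        subprocess.run(["python", "ekf_graph.py", str(bag), MAP_FILE], check=True)
        summaries.append(json.loads(bag.with_suffix(".json").read_text()))
    return summaries

if __name__ == "__main__":
    if config_changed():
        stats = run_all_bags(pathlib.Path("bags"))
        # A full version would render these statistics into Output.pdf.
        print(f"Collected statistics for {len(stats)} bag files")
```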
7
Modifications to ekf_graph.py
Added two functions to handle processing the covariance and writing information to the .json file:
Process_covariance() – takes the Euclidean distance at each time step from the covariance array and computes the mean and standard deviation of those distances
Save_json() – gathers all of the statistics [mean and standard deviation of the covariance, position/angular error, runtime error, optical flow features, ML (mapped landmarks)/OF (optical flow) counts, visual landmarks density] and saves them to a .json file to be read by the regression testing system (a sketch follows)
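Based only on the description above, the two additions could look roughly like this; the covariance array layout, the lowercase function names, and the .json field names are assumptions rather than the actual ekf_graph.py code.

```python
# Sketch of the two additions described above. The covariance array layout
# (one row of x/y/z covariance terms per time step) and the .json field names
# are assumptions based on the slide text, not the actual ekf_graph.py code.
import json
import numpy as np

def process_covariance(covariance):
    """Euclidean distance of the covariance terms at each time step, then mean/std."""
    cov = np.asarray(covariance)             # shape assumed (n_steps, 3)
    distances = np.linalg.norm(cov, axis=1)  # Euclidean distance per time step
    return distances.mean(), distances.std()

def save_json(path, cov_mean, cov_std, stats):
    """Write the summary statistics so the regression testing system can read them."""
    summary = {"covariance_mean": cov_mean, "covariance_std": cov_std}
    summary.update(stats)  # e.g. position/angular error, runtime, ML/OF counts
    with open(path, "w") as f:
        json.dump(summary, f, indent=2)
```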
8
Example of Output.pdf
9
Example of Output.pdf – Mapped Landmarks (ML) & Optical Flow (OF)
What are mapped landmark features? Points matched consistently to similar points across images by the matching algorithm
What are optical flow features? Points tracked locally as things (i.e. corners) move in the image
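To make the optical flow description concrete, here is a minimal Lucas-Kanade tracking sketch with OpenCV; the frame names and parameters are illustrative assumptions, not the Astrobee feature tracker.

```python
# Minimal sketch of optical-flow point tracking between two frames.
# Frame paths and parameters are illustrative assumptions.
import cv2

prev = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

# Pick corner-like points in the previous frame, then track them locally.
pts = cv2.goodFeaturesToTrack(prev, maxCorners=200, qualityLevel=0.01, minDistance=7)
new_pts, status, err = cv2.calcOpticalFlowPyrLK(prev, curr, pts, None)
tracked = new_pts[status.flatten() == 1]
print(f"Tracked {len(tracked)} of {len(pts)} points")
```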
10
Example of Output.pdf – Visual Landmarks Delay
Visual landmark delay: the delay between when an image is taken and when it finishes processing, shown over all files
We want to get as many features as fast as possible, and with the graph we can see how long processing takes
11
Example of Output.pdf - Heatmap
Heatmap of where in the image features were detected
We see many more visual landmark features toward the center of the image (because of the distortion of the image)
Optical flow is similar to visual landmarks but more evenly distributed: optical flow finds features across the whole image, while visual landmarks appear only where there is less distortion
From this observation, Max was able to know where to look to complete his research project
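A feature-location heatmap like the one on this slide could be produced roughly as follows, assuming the pixel coordinates of detected features are available; the array name, image size, and bin counts are assumptions, and random placeholder coordinates stand in for real data.

```python
# Sketch of a feature-location heatmap, assuming an (N, 2) array of feature
# pixel coordinates (feature_xy) and the image size are known.
import matplotlib.pyplot as plt
import numpy as np

feature_xy = np.random.rand(500, 2) * [1280, 960]  # placeholder coordinates

# 2D histogram of where features were detected in the image plane.
heatmap, xedges, yedges = np.histogram2d(
    feature_xy[:, 0], feature_xy[:, 1], bins=(32, 24), range=[[0, 1280], [0, 960]]
)
plt.imshow(heatmap.T, origin="lower", extent=[0, 1280, 0, 960], cmap="viridis")
plt.colorbar(label="feature count")
plt.title("Visual landmark feature density")
plt.savefig("heatmap.png")
```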
12
Example of Output.pdf - Covariance
Mean and standard deviation were chosen as metrics because they capture the average of all of the values and how spread out the values are
13
My time in California
14
Questions?