Published by Dwight Smith. Modified over 9 years ago.
ViPER: Video Performance Evaluation Resource
University of Maryland
Problem and Motivation
A unified video performance evaluation resource, including:
– ViPER-GT: a Java toolkit for marking up videos with ground-truth data.
– ViPER-PE: a command-line tool for comparing truth data to result data.
– A set of scripts for running several sets of results with different options and generating graphs.
Solutions
Object-level matching.
– First, do matching: for each ground-truth object, find the closest output object. Alternatively, for each subset of truth objects, find the subset of output objects that minimizes the total overall distance.
– Measure of precision / recall over all objects.
– Score for each object match.
– Exponential cost, O(e^x), for the subset formulation.
Pixel/object frame-level and single-match tracking.
– For each frame, generate a series of metrics comparing the truth and result pixel and box sets.
– Using keys, i.e. the location of an object in frame k, get success rates for matching individual moving boxes.
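The first matching strategy above (each ground-truth object takes its closest output object, then precision and recall are computed over the matched pairs) can be sketched as follows. This is an illustrative reconstruction, not ViPER-PE's actual code: the box representation (x, y, w, h), the 1 − IoU distance, and the 0.8 cutoff are all assumptions.

```python
def box_distance(a, b):
    """1 - intersection-over-union of two axis-aligned boxes (x, y, w, h)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    ix = max(0, min(ax + aw, bx + bw) - max(ax, bx))
    iy = max(0, min(ay + ah, by + bh) - max(ay, by))
    inter = ix * iy
    union = aw * ah + bw * bh - inter
    return 1.0 - inter / union if union else 1.0

def greedy_match(truth, results, threshold=0.8):
    """For each ground-truth box, take the closest unused result box
    whose distance is under the threshold, then score the match set."""
    matches, used = [], set()
    for t in truth:
        best, best_d = None, threshold
        for j, r in enumerate(results):
            d = box_distance(t, r)
            if j not in used and d < best_d:
                best, best_d = j, d
        if best is not None:
            used.add(best)
            matches.append((t, results[best], best_d))
    precision = len(matches) / len(results) if results else 0.0
    recall = len(matches) / len(truth) if truth else 0.0
    return matches, precision, recall
```

The exhaustive subset-to-subset formulation replaces the greedy loop with a search over all pairings, which is what drives the exponential cost noted above.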
Pixel Graphs
Pixel-Object Graphs
Tracking Graphs
Progress
Polygons added.
Slight improvements in memory usage.
Various responses to user feedback.
– Changed the way certain metrics are calculated.
Goals and Milestones
Defining formats for tracking people, and metrics to operate on them.
Adding new types of graphs to the script output.
– Replacing or upgrading the current graph toolkit.
Reducing memory usage.
Fin
Dr. David Doermann
David Mihalcik
Ilya Makedon
& many others
Object-Level Matching
Most obvious solution: many-to-many matching.
Allows matching on any data type, at a price.
Pixel-Frame-Box Metrics
Look at each frame and ask a specific question about its contents:
– Number of pixels correctly matched.
– Number of boxes that have some overlap, or overlap greater than some threshold.
– How many boxes overlap a given box? (fragmentation)
Look at all frames and ask a question:
– Number of frames correctly detected.
– Proper number of objects counted.
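The per-frame questions above can be sketched as small counting functions. This is a minimal illustration under assumed data shapes (frames as sets of foreground pixel coordinates, boxes as (x, y, w, h) tuples); the function names are mine, not ViPER's.

```python
def pixels_matched(truth_px, result_px):
    """Number of pixels correctly matched in one frame."""
    return len(truth_px & result_px)

def overlap(a, b):
    """Intersection area of two axis-aligned boxes (x, y, w, h)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    ix = max(0, min(ax + aw, bx + bw) - max(ax, bx))
    iy = max(0, min(ay + ah, by + bh) - max(ay, by))
    return ix * iy

def boxes_with_overlap(truth_boxes, result_boxes, min_overlap=1):
    """Truth boxes overlapped by at least one result box; raise
    min_overlap to require overlap greater than some threshold."""
    return sum(1 for t in truth_boxes
               if any(overlap(t, r) >= min_overlap for r in result_boxes))

def fragmentation(truth_box, result_boxes):
    """How many result boxes overlap a given truth box?"""
    return sum(1 for r in result_boxes if overlap(truth_box, r) > 0)
```

The whole-sequence questions (frames correctly detected, objects correctly counted) are just these counts aggregated over all frames.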
Individual Box Tracking Metrics
Mostly useful for the retrieval problem, this solution looks at pairs of ground-truth boxes and a result box. The metrics are:
– Position
– Size
– Orientation
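One plausible reading of the three distances above, for oriented boxes represented as (cx, cy, w, h, angle_in_degrees). The exact definitions and any normalization are my assumptions; ViPER may compute them differently.

```python
import math

def position_error(t, r):
    """Euclidean distance between box centers."""
    return math.hypot(t[0] - r[0], t[1] - r[1])

def size_error(t, r):
    """Absolute difference in box area."""
    return abs(t[2] * t[3] - r[2] * r[3])

def orientation_error(t, r):
    """Smallest angle between the two orientations, in degrees."""
    d = abs(t[4] - r[4]) % 360
    return min(d, 360 - d)
```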
Questions: Ignoring Ground Truth
Assume the evaluation routine is given a set of objects to ignore (or rules for determining what type of object to ignore). How does this affect the output?
– For pixel measures, just don't count pixels in ignored regions.
– For object matches, do the complete match; when finished, ignore result data that matches ignored truth.
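The two strategies above can be sketched directly. Data shapes are assumed (pixels as coordinate sets, object matches as (truth_id, result_id) pairs), and the names are illustrative, not ViPER's.

```python
def count_pixels_ignoring(result_px, truth_px, ignored_px):
    """Pixel measures: count correct pixels, but don't count any
    that fall in ignored regions."""
    return len((result_px & truth_px) - ignored_px)

def filter_matches(matches, ignored_truth_ids):
    """Object measures: do the complete match first, then drop
    result data that matched an ignored truth object."""
    return [(t, r) for (t, r) in matches if t not in ignored_truth_ids]
```

Doing the full match before filtering matters: removing ignored truth objects up front could let a result object that really belongs to an ignored region be matched against (and penalize) a different truth object.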
Questions: Presenting the Results