ViSEvAl: ViSualisation and EvAluation
Overview
- What is evaluation?
  - Evaluation process
  - Metric definition
- ViSEvAl
  - Description
  - Installation
  - Configuration
  - Functionalities
Evaluation process: general overview
Metric definition
- Metric = distance + filter + criteria
- Distance: associates detected and annotated objects
  - Spatial: compare bounding box areas
  - Temporal: compare time intervals
- Filter: selects which objects to evaluate (specific type, distance to the camera, ...)
- Criteria: how similar are the properties of the detected and annotated objects?
- 4 tasks: detection, classification, tracking, event detection
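For example, the DiceCoefficient spatial distance (one of the plugins listed on the following slides) can be read as the standard Dice coefficient over bounding-box areas. This formula is an assumption based on the plugin name; the slides do not define it:

\[
\mathrm{Dice}(A, D) = \frac{2\,\lvert A \cap D \rvert}{\lvert A \rvert + \lvert D \rvert}
\]

where A is the annotated bounding box, D the detected one, and |.| denotes area. A detected object would then be associated with an annotated object when the score passes a threshold (plausibly the 0.5 appearing in the configuration example below).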
ViSEvAl platform description
[Architecture diagram] The platform is organised in three layers:
- Tools: ViSEvAlGUI, ViSEvAlEvaluation
- Core library: all shared functionalities (synchronisation, display, ...)
- Plugin interfaces: Loading Video, Distance, Object Filter, Frame Metric, Temporal Metric, Event Metric
ViSEvAl plugins (1/2)
- Loading video: ASF-Videos, Caviar-Images, JPEG-Images, Kinect-Images (hospital), OpenCV-Videos (Vanaheim), PNG-Images
- Distance: Bertozzi, DiceCoefficient, Overlapping
- Filter: CloseTo, FarFrom, Identity, TypeGroup
ViSEvAl plugins (2/2)
Criteria:
- Detection (M1.X), 2 criteria: M1.1: area, M1.2: silhouette
- Classification (M2.X), 2 criteria: M2.1: type, M2.2: sub-type
- Tracking (M3.X), 6 criteria: M3.1: F2F, M3.2: persistence, M3.4: tracking time, M3.5: confusion, M3.6 and M3.7: confusion + tracking time, M3.8: frame detection accuracy
- Event (M4.X), 4 criteria: M4.1 and M4.2: begin and end time, M4.3: common frame, M4.4: common time
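As a concrete reading of M4.2 (begin and end time), a detected event would match a ground-truth event when both boundary times fall within a tolerance Delta. This formulation is an assumption; Delta plausibly corresponds to the Duration:10 parameter of the MetricEvent line in the configuration example below:

\[
\lvert t^{\mathrm{det}}_{\mathrm{begin}} - t^{\mathrm{gt}}_{\mathrm{begin}} \rvert \le \Delta
\quad \text{and} \quad
\lvert t^{\mathrm{det}}_{\mathrm{end}} - t^{\mathrm{gt}}_{\mathrm{end}} \rvert \le \Delta
\]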
ViSEvAl: inputs
A set of XML files:
- Detection: XML1 file (from the SUP platform)
- Recognised events: XML3 file (from the SUP platform)
- Ground truth: .xgtf file (from the ViPER tool)
- Time stamp file for time synchronisation: XML file (from the createTimeStampFile.sh script provided with ViSEvAl)
ViSEvAl installation
- Get the sources from the sup svn repository: cd sup/evaluation/ViSEvAl/
- Run install.sh at the root of the ViSEvAl folder
- Dependencies:
  - Libraries: Qt4 (graphical user interface, plugin management), gl and glu (OpenGL 3D view), xerces-c (XML reading), opencv (video reading)
  - Tool: xsdcxx (automatically generates C++ classes for reading the XML files)
- cd bin/appli; setenv LD_LIBRARY_PATH ../../lib
- Run ./ViSEvAlGUI chu.conf (full session below)
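Put together, the steps above read as the following shell session (commands taken from this slide; setenv is csh syntax, under bash use export LD_LIBRARY_PATH=../../lib):

    cd sup/evaluation/ViSEvAl/            # sources checked out from the sup svn repository
    ./install.sh                          # build, run from the root of the ViSEvAl folder
    cd bin/appli
    setenv LD_LIBRARY_PATH ../../lib      # make the core library and plugins visible
    ./ViSEvAlGUI chu.conf                 # launch the GUI with a configuration file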
ViSEvAl folder organisation
- src: appli, plugins (Cdistance, CeventMetric, CframeMetric, CloadingVideoInterface, CobjectFilter, CTemporalMetric)
- include: header files
- install.sh, clean.sh
- doc: documentation
- lib: core library, plugins
- scripts: createTimeStampFile.sh, makeVideoFile.sh, splitxml1-2-3file.sh
- bin: ViSEvAlGUI, ViSEvAlEvaluation
- tools: CaviarToViseval, QuasperToViseval
- xsd: XML schemas
ViSEvAl: configuration file
Configuration file based on keyword-parameter pairs:

    SequenceLoadMethod "JPEG-Images" #"ASF-Videos"
    SequenceLocation "0:../../example/CHU/Scenario_02.vid"
    TrackingResult "0:../../example/CHU/Scenario_02_Global_XML1.xml"
    EventResult "../../example/CHU/Scenario_02_Global_XML3.xml"
    GroundTruth "0:../../example/CHU/gt_a_mp.xgtf"
    XMLCamera "0:../../example/CHU/jai4.xml"
    MetricTemporal "Mono:M3.4:M3.4:DiceCoefficient:0.5:TypeGroup"
    MetricEvent "M4.2:M4.2.1:Duration:10"
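A field-by-field reading of the MetricTemporal line, shown as comments in the configuration syntax itself. These interpretations are guesses aligned with the metric = distance + filter + criteria decomposition given earlier; they are not documented in the slides:

    MetricTemporal "Mono:M3.4:M3.4:DiceCoefficient:0.5:TypeGroup"
    # Mono            -> evaluation mode (assumed: mono-camera)
    # M3.4:M3.4       -> criterion plugin and instance label (assumed)
    # DiceCoefficient -> distance plugin used to associate objects
    # 0.5             -> association threshold (assumed)
    # TypeGroup       -> object filter plugin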
ViSEvAl run trace

    Mon, 11:15> ./ViSEvAlGUI
    Load all the plugins
    Loading video interfaces: ASF-Videos Caviar-Images JPEG-Images Kinect-Images OpenCV-Videos PNG-Images
    Loading distance: 3DBertozzi 3DDiceCoefficient 3DOverlapping Bertozzi DiceCoefficient Overlapping
    Loading object filter: CloseTo FarFrom Identity TypeGroup
    Loading frame metric: M1.1 M1.2 M2.1 M2.2
    Loading temporal metric: M3.2 M3.4 M3.5 M3.6 M3.7 M3.8
    Loading event metric: M4.1 M4.2 M4.3 M4.4
ViSEvAl: two tools
- ViSEvAlGUI
  - Graphical user interface
  - Visualises detections and ground truth on the images
  - The user can easily select parameters (e.g. distance, threshold, ...)
  - Frame metric results are computed live
- ViSEvAlEvaluation
  - Generates a .res file containing the results of the metrics
  - Frame, temporal and event metrics are computed
  - The user can evaluate several experiments
- The same configuration file is used for both tools (see the example run below)
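A typical batch run might therefore look as follows; the exact command line is an assumption, the slides only state that ViSEvAlEvaluation shares the GUI's configuration file:

    cd bin/appli
    ./ViSEvAlEvaluation chu.conf    # computes frame, temporal and event metrics, writes a .res file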
ViSEvAl: result file (.res)

    camera: 0
    Tracking result file: /user/bboulay/home/work/svnwork/sup/evaluation/ViSEvAl/example/vanaheim/res.xml1.xml
    Fusion result file:
    Event result file:
    Ground truth file: /user/bboulay/home/work/svnwork/sup/evaluation/ViSEvAl/example/vanaheim/Tornelli T07_00_01_groups.xgtf
    Common frames with ground-truth:
    Detection results:
    *****
    ====================================================
    Metric M1.1.1
    ====================================================
    Frame;Precision;Sensitivity 0;True Positive;False Positive;False Negative 0;Couples
    8004; ; ;1;1;0;(100;170; )
    8005; ; ;1;1;0;(100;170; )
    8006; ; ;1;1;0;(100;170; )
    8007; ; ;1;1;0;(100;170; )
    ====================================================
    Final Results:
    Global results:
    Number of True Positives : 1789
    Number of False Positives : 1597
    Number of False Negatives 0: 2254
    Precision (mean by frame) :
    Sensitivity 0 (mean by frame) :
    Precision (global) :
    Sensitivity 0 (global) :
    Results for GT Object 2
    Number of True Positives : 0
    Number of False Positives : 0
    Number of False Negatives 0: 0
    Precision (global) :
    Sensitivity 0 (global) :
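Assuming the standard definitions of precision and sensitivity (an assumption; the slides do not define them), the global values follow from the reported counts:

\[
\text{Precision} = \frac{TP}{TP + FP} = \frac{1789}{1789 + 1597} \approx 0.53,
\qquad
\text{Sensitivity} = \frac{TP}{TP + FN} = \frac{1789}{1789 + 2254} \approx 0.44
\]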