Slide 1: User-Centric Design of a Vision System for Interactive Applications
Stanislaw Borkowski, Julien Letessier, François Bérard, and James L. Crowley
ICVS'06, New York, NY, USA, January 5, 2006
Slide 2: Academic context
- PRIMA group (GRAVIR lab, INRIA): "Perception, Recognition and Integration for Interactive Environments"
- IIHM group (CLIPS lab, Univ. Grenoble): "Engineering in Human-Computer Interactions"
Slide 3: Outline
- Context: augmented surfaces
- User-centric approach in vision systems
- User-centric requirements
- Implementation: SPODs, VEIL, support services
- Conclusions & future work
Slide 4: Context: augmented surfaces
Interacting with projected images...
- direct manipulation
- user collaboration
- mobility
...is not realistic today:
- limited, controlled conditions
- operator requirement
- software integration issues
(image credit: F. Bérard, J. Letessier)
Slide 5: Objectives
Propose a client-centric approach to the design of perceptive input systems, with two classes of clients:
- end users, who realize an interaction task
- developers, who create an interactive application
Application: design an input system that
- addresses simple augmented surfaces
- features vision-based, WIMP-like widgets (e.g. press-buttons)
- achieves the usability of a physical input device
Slide 6: Approach overview
Top-down design: determine client requirements
- consequences of HCI and SOA requirements
- user-centric / developer-centric
- functional / non-functional
Service-oriented:
- definition: a service adds value to information
- an SOA is a collection of communicating services
Slide 7: Developer requirements
Abstraction: be relevant
- make computer vision invisible
- generalize the input
Isolation: allow integration
- permit service distribution
- support remote access to services
- offer code reuse
Contract: offer quality of service
- specify usage conditions
- determine service latency, precision, etc.
Slide 8: End-user requirements
Typical for "real-time" interaction:
Latency limits
- upper bound: 50 ms for coupled interaction
- lower bound: 1 s for monitoring applications
Autonomy
- ideally, no setup or maintenance
- in practice, minimize task disruption
Reliability / predictability
- either real-time or unusable
- reproducible user experience
Slide 9: Pragmatic approach
Black-box services: BIP (Basic Interconnection Protocol)
- a BIP implementation ≈ SOA middleware
- service/service and service/application communication
Goal 1: performance
- connection-oriented (TCP-based)
- low latency (UDP extensions)
Goal 2: easy integration
- service discovery (standards-based)
- implementations provided (C++, Java, Tcl)
- interoperability in ≤ 100 lines of code
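The slides describe BIP only at the level of its design goals (connection-oriented, TCP-based, easy to integrate); its actual wire format is not given. As a purely hypothetical sketch of such a connection-oriented service protocol, here is a minimal TCP service with 4-byte length-prefixed framing; all names and the framing scheme are assumptions, not BIP itself:

```python
import socket
import struct
import threading

def send_msg(sock, payload: bytes):
    # Length-prefixed framing over TCP: 4-byte big-endian size, then payload.
    sock.sendall(struct.pack(">I", len(payload)) + payload)

def _recv_exact(sock, n: int) -> bytes:
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("peer closed connection")
        buf += chunk
    return buf

def recv_msg(sock) -> bytes:
    (size,) = struct.unpack(">I", _recv_exact(sock, 4))
    return _recv_exact(sock, size)

def run_service(server_sock):
    # Accept one client and echo each framed message back (a stand-in for
    # a real service that would add value to the information it receives).
    conn, _ = server_sock.accept()
    with conn:
        try:
            while True:
                send_msg(conn, recv_msg(conn))
        except ConnectionError:
            pass

server = socket.socket()
server.bind(("127.0.0.1", 0))        # ephemeral port
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=run_service, args=(server,), daemon=True).start()

client = socket.create_connection(("127.0.0.1", port))
send_msg(client, b"button.pressed")
reply = recv_msg(client)
client.close()
print(reply.decode())  # button.pressed
```

Framing every message with an explicit length keeps latency low (no delimiter scanning) and makes interoperable clients short, in the spirit of the "≤ 100 lines of code" goal.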
Slide 10: Our approach
- Abstraction, isolation: use BIP; advice to service developers
- Contract: nothing enforced; recommend evaluation of HCI-centric criteria
- This common ground allows creating SOA-based prototypes
Slide 11: Interactive widgets projected on a portable display surface
Slide 12: Luminance-based button widget
S. Borkowski, J. Letessier, and J. L. Crowley. Spatial Control of Interactive Surfaces in an Augmented Environment. In Proceedings of EHCI'04. Springer, 2004.
Slide 13: Touch detection
- Locate the widget in the camera image
- Calculate the mean luminance over the widget
- Update the widget state
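The three steps above can be sketched directly: crop the widget region, average its luminance, and flip the state when a hand darkens the projected widget. This is a minimal illustration, not the paper's implementation; the function name, threshold value, and baseline calibration are assumptions:

```python
import numpy as np

def widget_pressed(frame, bbox, baseline, threshold=0.25):
    """Decide a luminance-based button's state from one camera frame.

    frame:     2-D grayscale image as a numpy array
    bbox:      (x, y, w, h) of the widget in camera coordinates
    baseline:  mean luminance of the unoccluded widget (calibration value)
    threshold: relative luminance drop that counts as an occlusion
    """
    x, y, w, h = bbox
    region = frame[y:y + h, x:x + w]
    # A hand over the projected widget darkens it, lowering mean luminance.
    return region.mean() < baseline * (1.0 - threshold)

# Toy frame: a bright 20x20 widget at (10, 10) on a dark background.
frame = np.full((50, 50), 40.0)
frame[10:30, 10:30] = 200.0
print(widget_pressed(frame, (10, 10, 20, 20), baseline=200.0))  # False

frame[10:30, 10:30] = 90.0     # simulate an occluding hand
print(widget_pressed(frame, (10, 10, 20, 20), baseline=200.0))  # True
```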
Slides 14-15: Robustness to clutter
Slides 16-17: Assembling occlusion detectors
Slides 18-20: Striplet – the occlusion detector
[figure: striplet gain profile over the x and y axes]
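The slides show the striplet only as a figure (a gain profile over x and y), so the decision rule below is an assumed sketch, not the paper's exact detector: sample luminance along a short strip of pixels, compare against a calibrated reference profile, and report occlusion when the gain-normalized deviation grows large.

```python
import numpy as np

class Striplet:
    """A strip-shaped occlusion detector (illustrative sketch)."""

    def __init__(self, points, reference, threshold=0.3):
        self.points = points          # list of (x, y) pixel positions
        self.reference = np.asarray(reference, dtype=float)
        self.threshold = threshold

    def occluded(self, frame):
        # Sample luminance along the strip.
        profile = np.array([frame[y, x] for x, y in self.points], float)
        # Gain-normalized deviation from the calibrated reference profile.
        deviation = np.abs(profile - self.reference) / (self.reference + 1e-6)
        return deviation.mean() > self.threshold

# Reference profile: a uniformly bright projected strip.
pts = [(x, 5) for x in range(10)]
frame = np.full((10, 20), 180.0)
s = Striplet(pts, reference=[180.0] * 10)
print(s.occluded(frame))           # False

frame[5, 2:8] = 60.0               # darken part of the strip
print(s.occluded(frame))           # True
```

Because a striplet only averages a handful of pixels, it is cheap enough to evaluate many of them per frame, which is what makes assembling them into compound widgets practical.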
Slide 21: Striplet-based SPOD
SPOD – Simple-Pattern Occlusion Detector
Slide 22: Striplet-based button
Slide 23: Striplet-based slider
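How a slider SPOD aggregates its striplets is not spelled out on the slides; one plausible sketch is to lay striplets along the slider track and read the value at the centroid of the occluded ones. The function name and normalization are assumptions:

```python
def slider_value(occlusions):
    """Map per-striplet occlusion flags to a slider position in [0, 1].

    occlusions: list of booleans, one per striplet laid out in order
    along the slider track.
    """
    hits = [i for i, occ in enumerate(occlusions) if occ]
    if not hits:
        return None                     # nothing touching the slider
    # Take the position at the centroid of the occluded striplets, so a
    # finger covering several adjacent striplets yields one stable value.
    centroid = sum(hits) / len(hits)
    return centroid / (len(occlusions) - 1)

print(slider_value([False] * 8))                               # None
print(slider_value([False, False, True, True] + [False] * 4))  # ≈ 0.357
```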
Slide 24: SPOD software components
[architecture diagram: Camera, Client Application, Calibration, GUI rendering, Striplets Engine, VEIL, SPOD]
Slides 25-27: VEIL – Vision Events Interpretation Layer
Inputs:
- widget coordinates
- scale and UI-to-camera mapping matrix
- striplet occlusion events
Outputs:
- interaction events
- striplet coordinates
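The "UI to camera mapping matrix" input suggests a planar homography between UI coordinates and camera coordinates, which VEIL would use to place striplets for the engine. A sketch of that mapping follows; the 3x3 matrix is a toy example, not calibration output:

```python
import numpy as np

def ui_to_camera(H, ui_points):
    """Map UI (widget) coordinates into camera coordinates with a 3x3
    homography H, applied in homogeneous coordinates."""
    pts = np.asarray(ui_points, dtype=float)
    ones = np.ones((pts.shape[0], 1))
    homogeneous = np.hstack([pts, ones]) @ H.T
    # Divide out the homogeneous scale to return to 2-D pixel coordinates.
    return homogeneous[:, :2] / homogeneous[:, 2:3]

# Toy mapping: scale UI coordinates by 2 and translate by (10, 5).
H = np.array([[2.0, 0.0, 10.0],
              [0.0, 2.0,  5.0],
              [0.0, 0.0,  1.0]])
# Maps (0, 0) -> (10, 5) and (50, 20) -> (110, 45).
print(ui_to_camera(H, [(0, 0), (50, 20)]))
```

A real calibration matrix would also encode perspective (nonzero bottom row), which the homogeneous division handles for free.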
Slides 28-30: Striplets Engine service
Inputs:
- striplet UI coordinates
- UI-to-camera mapping matrix
- images from the camera service
Outputs:
- occlusion events
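A per-frame loop consistent with the listed inputs and outputs might look as follows. This is an assumed sketch of the engine's structure, emitting occlusion events only on state changes so clients see events, not raw pixels:

```python
import numpy as np

def engine_step(frame, striplets, states, emit):
    """Evaluate all striplets on one camera frame.

    striplets: dict id -> (points, reference, threshold)
    states:    dict id -> last known occlusion state (mutated in place)
    emit:      callback receiving (striplet_id, occluded) change events
    """
    for sid, (points, reference, threshold) in striplets.items():
        profile = np.array([frame[y, x] for x, y in points], float)
        deviation = np.abs(profile - reference) / (reference + 1e-6)
        occluded = bool(deviation.mean() > threshold)
        if occluded != states.get(sid):
            states[sid] = occluded     # only report transitions
            emit(sid, occluded)

events = []
frame = np.full((10, 10), 150.0)
striplets = {"s0": ([(x, 4) for x in range(6)], np.full(6, 150.0), 0.3)}
states = {}
engine_step(frame, striplets, states, lambda sid, occ: events.append((sid, occ)))
frame[4, 0:6] = 30.0                   # occlude the strip
engine_step(frame, striplets, states, lambda sid, occ: events.append((sid, occ)))
print(events)  # [('s0', False), ('s0', True)]
```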
Slide 32: SPOD-based calculator
Video available at: http://www-prima.inrialpes.fr
Slide 33: Conclusions
We have presented:
- a service-oriented approach
- an implementation
Future work:
- different detector types
- a more intelligent VEIL
- integration into GML
Slide 34: Thank you for your attention