1 Steerable Interfaces for Interactive Environments
Stanislaw Borkowski
Thesis director: James L. Crowley
Jury: Andreas Butz (UM), Joëlle Coutaz (UJF), Alex Pentland (MIT), Pierre Wellner (IDIAP)
Institut National de Recherche en Informatique et en Automatique (INRIA) Rhône-Alpes, June 26, 2006
2 What is a user interface?
User Interface (UI): an aggregate of physical entities, or of information bound to these entities.
3 Steerable UI: can be relocated in space; its position is mediated by the computer system.
Portable UI: can be relocated; its position is directly controlled through physical contact.
[Diagram: mobile UIs comprise portable UIs and steerable UIs]
4 Mobility in current IT
Steerable interfaces: conventional GUI (steerable output), X11 session teleporting [Richardson93]
Portable interfaces: wearable computers, cell phones, Personal Digital Assistants, laptops, …
5 Mobility in ambient computing
Multiple displays embedded in the environment
Large-size displays
Mobile interaction resources, both portable and steerable
[Pinhanez01] [Streitz99] [Arias00]
6 Why steerable?
Flexibility in resource usage
New forms of human-computer interaction
New forms of human-human interaction
7 Current situation – summary
Problem: need for steerable UIs; no predictive models.
Solution: provide enabling technology, explore interaction techniques, evaluate the value of steerable UIs.
8 Outline
Mobility in IT
Steerable UIs
Mobile projected UI
Mobile UIs for collaborative work
Conclusions
9 State of the art: EasyLiving [Brumitt00], Tic-Tac-Toe [Pinhanez05]
10 State of the art – limitations
UI is observable only at standstill
Limited spatial controllability: only predefined locations, planar surfaces only
Requirements for steerable UIs: continuous observability and controllability
11 Outline
Mobility in IT
Steerable UIs
Mobile projected UI
  Prototype implementation [in collaboration with J. Letessier]
  Evaluation – latency estimation
Mobile UIs for collaborative work
Conclusions
12 The Steerable Camera Projector (2002)
Other steerable projection systems:
The Everywhere Display (IBM 2000)
Fluid Beam (Fluidum consortium 2002)
SCP from Karlsruhe (UKA 2004)
13 Steerable display (2002)
14 User-centric approach
End-users: latency limits < 50 ms; easy setup, no maintenance; reliability / predictability.
Developers: abstraction (be relevant); isolation (allow integration); contract (offer quality of service).
15 Pragmatic approach
Black-box services
BIP (Basic Interconnection Protocol)
BIP implementation: SOA middleware; service/service and service/application communication; service discovery (standards-based)
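The slide names the middleware layers without showing code. As a rough illustration of the black-box service style only (this is not the actual BIP wire format, and all names here are invented), a service can hide its internals behind a small message protocol:

```python
import socketserver

# Toy black-box service: clients only see a newline-delimited text
# protocol; everything behind it is opaque and replaceable.
class EchoService(socketserver.StreamRequestHandler):
    def handle(self):
        for line in self.rfile:                # one request per line
            self.wfile.write(b"ack: " + line)  # contract-level reply

if __name__ == "__main__":
    # A discovery layer (standards-based in the thesis) would advertise
    # this host/port instead of it being hard-coded.
    with socketserver.TCPServer(("localhost", 9999), EchoService) as server:
        server.serve_forever()
```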
16 Interactive system
[Diagram: the application sends display orders to the SCP software and receives interaction events from it; the SCP software senses and acts on the human and the environment]
18 Interactive system
[Diagram: the SCP software comprises SCPdisplay, SCPcontroller, frame grabber, interaction detector, and SCPcalibrator, mediating between the application and the human and environment]
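To make the dataflow concrete, here is a hypothetical in-process main loop over these components. Class and method names are assumptions for illustration; the thesis decomposes them into communicating BIP services rather than objects:

```python
# Hypothetical wiring of the SCP components (illustrative names).
def run(app, frame_grabber, interaction_detector, scp_display,
        scp_controller, scp_calibrator):
    while True:
        frame = frame_grabber.grab()                      # camera input
        for event in interaction_detector.detect(frame):  # interaction events...
            app.handle(event)                             # ...go to the application
        image, surface = app.next_display_order()         # display orders come back
        homography = scp_calibrator.ui_to_projector(surface)
        scp_display.render(image, homography)             # pre-warped projection
        scp_controller.steer(*surface.pan_tilt())         # orient the SCP
```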
19 Projection on arbitrarily oriented planar surfaces
[Diagram: video projector (light source), source image, screen, and the user's perception]
20 Projection on arbitrarily oriented planar surfaces
[Diagram: SCPdisplay warps the source image into the image to project, so that the user's perception on the screen is undistorted]
21 Projection on arbitrarily oriented planar surfaces
[Diagram: image to project vs. the users' view]
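The undistorted view is obtained by pre-warping: the source image is warped with a homography so that the physical projection onto the oblique plane cancels the distortion. A minimal sketch with OpenCV, using made-up corner coordinates where the real system would use calibration data from SCPcalibrator:

```python
import cv2
import numpy as np

# Corners of the source image, and where those corners must land in
# projector space so the projection appears rectangular to the user.
# These coordinates are illustrative only.
src = np.float32([[0, 0], [640, 0], [640, 480], [0, 480]])
dst = np.float32([[40, 25], [610, 60], [590, 450], [20, 430]])

H = cv2.getPerspectiveTransform(src, dst)  # source -> projector homography

image = np.full((480, 640, 3), 255, np.uint8)  # stand-in for the UI frame
cv2.putText(image, "UI", (260, 260), cv2.FONT_HERSHEY_SIMPLEX, 3, (0, 0, 0), 5)
to_project = cv2.warpPerspective(image, H, (640, 480))  # frame sent to the projector
```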
22 Interactive system
[Diagram: the SCP software comprises SCPdisplay, SCPcontroller, frame grabber, interaction detector, and SCPcalibrator, mediating between the application and the human and environment]
23 Sensor-centric environment model
[Diagram: numbered surfaces (1, 2, 3) in the environment and in the sensor's view]
24 Display surface detection
[Diagram: screen]
25 The Portable Display Surface
26 Interactive system
[Diagram: the SCP software comprises SCPdisplay, SCPcontroller, frame grabber, interaction detector, and SCPcalibrator, mediating between the application and the human and environment]
27 Interactive widgets projected on a portable display surface
28 Luminance-based button widget
29 Touch detection
Locate the widget in the camera image
Estimate occlusion
Update the widget state
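One plausible reading of this pipeline in code, assuming the UI-to-camera homography and the widget's expected (unoccluded) luminance are known from calibration; names and thresholds are illustrative:

```python
import cv2
import numpy as np

def occlusion_ratio(gray_frame, H_ui2cam, widget_rect, expected_lum):
    """Fraction by which observed luminance falls below the widget's
    expected projected luminance (0.0 = unoccluded)."""
    x, y, w, h = widget_rect
    corners = np.float32([[x, y], [x + w, y], [x + w, y + h],
                          [x, y + h]]).reshape(-1, 1, 2)
    cam_pts = cv2.perspectiveTransform(corners, H_ui2cam)  # locate widget in camera image
    mask = np.zeros(gray_frame.shape, np.uint8)
    cv2.fillConvexPoly(mask, cam_pts.astype(np.int32), 255)
    observed = cv2.mean(gray_frame, mask=mask)[0]          # estimate occlusion
    return max(0.0, 1.0 - observed / expected_lum)

PRESS, RELEASE = 0.5, 0.3  # hysteresis so noise does not toggle the state

def update_state(pressed, ratio):
    """Update the widget state from the occlusion ratio."""
    return ratio > (RELEASE if pressed else PRESS)
```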
30 Robustness to clutter
31 Assembling occlusion detectors
32 Striplet – the occlusion detector
[Diagram: a striplet along the x/y axes]
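A sketch of the striplet idea: a thin strip of pixels whose luminance profile is compared against a reference captured when the strip is unoccluded. The class below is an assumption about the mechanism, not the thesis code:

```python
import numpy as np

class Striplet:
    """Thin oriented strip of pixels used as an occlusion detector."""

    def __init__(self, points, threshold=30.0):
        self.points = points        # (N, 2) integer pixel coords along the strip
        self.reference = None       # unoccluded luminance profile
        self.threshold = threshold  # mean deviation that signals occlusion

    def calibrate(self, gray):
        self.reference = self._profile(gray)

    def occluded(self, gray):
        deviation = np.abs(self._profile(gray) - self.reference)
        return float(deviation.mean()) > self.threshold

    def _profile(self, gray):
        xs, ys = self.points[:, 0], self.points[:, 1]
        return gray[ys, xs].astype(np.float32)
```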
33 Striplet-based SPOD (Simple-Pattern Occlusion Detector)
34 Striplet-based button
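One plausible assembly (the exact thesis pattern may differ): a button whose center striplet must be occluded while surrounding guard striplets stay mostly free, so a hand or object lying across the whole widget does not fire it. This is also what gives the robustness to clutter shown two slides earlier:

```python
class StripletButton:
    """Button built from one center striplet plus guard striplets."""

    def __init__(self, center, guards, max_guards_occluded=1):
        self.center = center
        self.guards = guards
        self.max_guards_occluded = max_guards_occluded

    def pressed(self, gray):
        # A fingertip occludes the center but only brushes the guards;
        # large-scale clutter occludes the guards too and is rejected.
        occluded_guards = sum(g.occluded(gray) for g in self.guards)
        return (self.center.occluded(gray)
                and occluded_guards <= self.max_guards_occluded)
```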
35 SPOD-based calculator [accelerated video]
36 Outline
Mobility in IT
Steerable UIs
Mobile projected UI
  Prototype implementation
  Evaluation – latency estimation
Mobile UIs for collaborative work
Conclusions
37 Latency estimation
Pipeline: PCI A/D converter → frame grabber → CPU (Imalab shell, image processing) → graphics card (OpenGL render)
38 Latency estimation
Setup: a plastic bar rotated by a fan on a regulated power supply; video sequence capture of both the bar and the projection of the bar, fed through the same pipeline (PCI A/D converter → frame grabber → CPU with Imalab shell and image processing → graphics card with OpenGL render)
39 Latency estimation – results
Pipeline with image processing (PCI A/D converter → frame grabber → CPU: Imalab shell, image processing → graphics card: OpenGL render): ~70 ms
Pipeline without image processing: ~32 ms
[Diagram annotations: ~17 ms; plus up to 51 ms!]
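With the bar spinning at a known, regulated speed, end-to-end latency can be read off each captured frame as the angular lag of the projected bar behind the physical one. A small worked sketch; the fan speed value is an assumption:

```python
FAN_SPEED_DEG_PER_S = 360.0  # assumed; fixed by the regulated power supply

def latency_ms(bar_angle_deg, projected_angle_deg):
    """Latency = angular lag of the projection / angular speed."""
    lag = (bar_angle_deg - projected_angle_deg) % 360.0
    return 1000.0 * lag / FAN_SPEED_DEG_PER_S

# A 25 deg lag at one revolution per second is ~69 ms of latency:
print(latency_ms(130.0, 105.0))  # -> 69.44...
```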
40 Interactive system
[Diagram: the SCP software comprises SCPdisplay, SCPcontroller, frame grabber, interaction detector, and SCPcalibrator, mediating between the application and the human and environment]
41 Outline
Mobility in IT
Steerable UIs
Mobile projected UIs
Mobile UIs for collaborative work
  ContAct application
  User study – comparison of different take-over techniques
Conclusions
42 ContAct – a system for authoring presentations
Collaboration through interface mobility
43 ContAct application setup
Wide-angle camera, tabletop camera, Steerable Camera Projector, Portable Display Surface
44 ContAct application GUI
45 Outline
Mobility in IT
Steerable interface prototype
Mobile UIs for collaborative work
  ContAct application
  Taking control: a comparative user study [in collaboration with J. Maisonnasse and J. Letessier]
Conclusions
46 Evaluation of techniques for taking control
Objectives:
Determine the preferred technique for taking control
Evaluate the impact on task-completion performance
Evaluate user acceptance of steerable interfaces
47 Experimental setup
Hardware: Steerable Camera Projector, microphone headsets, Portable Display Surface
Software: speech detector [D. Vaufreydaz], conversation modeling [J. Maisonnasse], finger tracking [J. Letessier], PDS tracking, drawing application
[Diagram: GUI and users]
48 The User Interface
49 The task: collaborative reconstruction of a graph
50 The task: collaborative reconstruction of a graph
[Diagram: User 1, User 2, User 3]
51 Experimental conditions
Proposed techniques for taking control:
Baseline: fixed interface
Portable: PDS
Steerable: touch-based steering
Steerable: voice-based steering
52 Fixed interface
[Diagram: GUI and users]
53 Explicit direct manipulation
54 Explicit touch-based steering
55 Implicit voice-based steering
Rules controlling the interface location (sketched below):
The interface is steered toward the main speaker
Interruptions are ignored
Drawing inhibits vocal steering
Conflicts result in loss of interface control
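The rules read naturally as a small steering policy. The function below is a hypothetical rendering of them, assuming a conversation model that reports the main speaker, interruptions, drawing activity, and speaker conflicts:

```python
def steering_target(main_speaker, is_interruption, drawing_active,
                    speakers_in_conflict):
    """Return the user to steer the interface toward, or None to
    leave it in place / release control."""
    if drawing_active:         # drawing inhibits vocal steering
        return None
    if is_interruption:        # interruptions are ignored
        return None
    if speakers_in_conflict:   # conflicts: nobody gains control
        return None
    return main_speaker        # steer toward the main speaker
```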
56 Subjects
12 groups of 3 people: 13 women, 23 men; average age 27.7
19 IT experts, 17 subjects familiar with IT
57 Results – user preference rank
Rank scale: 1 = most liked, 4 = least liked
58 Results – PDS
Fun to use; predictable; less intuitive; less reactive; not well suited for the task
[Chart: ratings split by experts vs. non-experts]
59 Results – voice-based control
Intimidating; limits collaboration; fun to use; enhances collaboration; modified their behaviour; least predictable
[Chart: ratings split by experts vs. non-experts]
60 Example result
61 User performance – ability to duplicate
[Chart: % of remembered elements]
62 Outline
Mobility in IT
Steerable UIs
Mobile projected UI
Mobile UIs for collaborative work
Conclusions
63 Conclusions 1/2
A steerable camera-projector pair enables mobile UIs: portable UIs (the PDS) and steerable UIs
64 Conclusions 2/2
UI mobility can enhance the collaborative experience
Explicit control is preferred over implicit control
65 Future directions 1/2
The SCP: adapting to display-surface texture
The PDS: tracking and interaction with multiple PDSs; high frame-rate tracking
Vision-based projected widgets: integration of multiple occlusion detectors
66 Future directions 2/2
Steerable interfaces:
Other applications for steerable interfaces
Alternative methods for controlling the location
Exploring links with plastic interfaces – dynamic interface adaptation
Creation of a space manager
67 Thank you for your attention
69 Results
Preference ranking: #1 the PDS, #2 button-based control, #3 voice-based control, #4 fixed interface
70 Sensor-centric environment model
[Diagram: numbered surfaces (1, 2, 3)]
71 SPOD software components
[Diagram: frame grabber, client application, calibration, GUI rendering, GUI; the SPOD comprises the Striplets Engine and VEIL]
72 Striplet – the occlusion detector
[Diagram: gain profile along x]
73 VEIL – Vision Events Interpretation Layer (the SPOD layer above the Striplets Engine)
Inputs: widget coordinates; scale and UI-to-camera mapping matrix; striplet occlusion events
Outputs: interaction events; striplet coordinates
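As a sketch of what this layer does, the snippet below maps widget coordinates into camera space with the UI-to-camera matrix and collapses per-striplet occlusion events into widget-level interaction events. The data shapes, event vocabulary, and names are assumptions:

```python
import numpy as np

def ui_to_camera(points_ui, M):
    """Apply a 3x3 UI-to-camera mapping matrix to (N, 2) points."""
    pts = np.hstack([points_ui, np.ones((len(points_ui), 1))])
    cam = pts @ M.T
    return cam[:, :2] / cam[:, 2:3]  # back from homogeneous coordinates

def interpret(occlusion_events, widget_of_striplet):
    """Collapse (striplet_id, occluded) events into per-widget events."""
    occluded_widgets = {}
    for striplet_id, occluded in occlusion_events:
        widget = widget_of_striplet[striplet_id]
        occluded_widgets[widget] = occluded_widgets.get(widget, False) or occluded
    return [(w, "press" if occ else "release")
            for w, occ in occluded_widgets.items()]
```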
76 Striplets Engine Service
Inputs: striplet UI coordinates; UI-to-camera mapping matrix; images from the camera service
Outputs: occlusion events
80 Striplet-based slider
81 Tracking the PDS
Tracking edges in Hough space: + naturally robust to partial occlusions; − high computation cost
Line-segment-based tracking: + efficient quadrilateral detection; − difficulties in handling occlusions
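For the line-segment branch, generic OpenCV building blocks already give efficient quadrilateral detection. A minimal sketch; the thesis tracker differs in its details:

```python
import cv2

def find_quads(gray, min_area=2000.0):
    """Detect convex quadrilaterals (PDS candidates) in a gray frame."""
    edges = cv2.Canny(gray, 80, 160)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    quads = []
    for contour in contours:
        approx = cv2.approxPolyDP(contour, 0.02 * cv2.arcLength(contour, True), True)
        if (len(approx) == 4 and cv2.isContourConvex(approx)
                and cv2.contourArea(approx) > min_area):
            quads.append(approx.reshape(4, 2))
    return quads
```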
82 Pushing vs. pulling the UI
83 Results – time performance
[Chart: normalized trial time vs. trial number]
84 Performance
Pan: ±180°, 1600 discrete positions (resolution), 90°/s max speed, reached in 0.75 s
Tilt: 90°, 500 discrete positions, 80°/s max speed, reached in 0.60 s
[Diagram: video projector, camera, pan stepper motor, tilt stepper motor, control and power supply]
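These figures imply roughly 0.225° of pan resolution (360°/1600) and 0.18° of tilt resolution (90°/500). A hypothetical angle-to-step conversion using only the numbers above:

```python
PAN_STEPS_PER_DEG = 1600 / 360.0  # ~4.4 positions per degree
TILT_STEPS_PER_DEG = 500 / 90.0   # ~5.6 positions per degree

def pan_position(angle_deg):
    """Map a pan angle in [-180, 180] to a discrete stepper position."""
    assert -180.0 <= angle_deg <= 180.0
    return round((angle_deg + 180.0) * PAN_STEPS_PER_DEG)

def tilt_position(angle_deg):
    """Map a tilt angle in [0, 90] to a discrete stepper position."""
    assert 0.0 <= angle_deg <= 90.0
    return round(angle_deg * TILT_STEPS_PER_DEG)
```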