Context-aware / Multimodal UI Breakout Summary
James A. Landay et al.
HCC Retreat, July 7, 2000
Participants
- James Landay
- Anoop Sinha
- Jimmy Lin
- Trevor Perring
- Greg Heinzinger
- Chris Long
- Ed Chi
- Christine Halverson
- Gian Gonzaga
- Ken Fishkin
- John Lowe
- Adam Janin
- Russell Eames
- Elin Pedersen
Applications
- Alert management
  - Sites beacon context
    - "This is a quiet place, no interruptions please"
    - e.g., a movie theater or restaurant
  - Devices use context to avoid interruptions (see the sketch after this list)
  - A wearable t-shirt that jams local cell phones!
- "Elvis has left the meeting"
  - Easily share documents from meetings
  - Beam tokens of documents to participants, or
  - Use shared context to find the docs later
    - e.g., "I was in a meeting with Ken at Lake Tahoe; find those docs"
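A minimal sketch of the alert-management idea, assuming a device that reads context beaconed by its surroundings and routes alerts accordingly; SiteContext, Delivery, and route_alert are illustrative names, not from the presentation.

```python
# Hypothetical sketch only: route an alert using context beaconed by the site.
from dataclasses import dataclass
from enum import Enum, auto


class Delivery(Enum):
    RING = auto()      # interrupt the user now
    VIBRATE = auto()   # low-intrusion signal
    DEFER = auto()     # log the alert and deliver it later


@dataclass
class SiteContext:
    place: str    # e.g., "movie theater" or "restaurant"
    quiet: bool   # "this is a quiet place, no interruptions please"


def route_alert(urgent: bool, site: SiteContext) -> Delivery:
    """Use the beaconed site context to decide how an alert is delivered."""
    if site.quiet:
        return Delivery.VIBRATE if urgent else Delivery.DEFER
    return Delivery.RING


# A routine alert arriving in a movie theater is deferred, not rung.
print(route_alert(urgent=False, site=SiteContext("movie theater", quiet=True)))
```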
Context Events
- Signal changes
  - Like a windowing system
- Can be used as triggers to cause other actions (see the sketch after this list)
  - Change my phone forwarding when I change locations
- Can be immediate, or logged for later tacit-information mining
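A minimal sketch, assuming a publish/subscribe "context bus": context changes are signalled like windowing-system events, can fire immediate triggers, and are also logged for later mining. ContextBus and forward_phone are hypothetical names.

```python
from collections import defaultdict
from datetime import datetime
from typing import Callable


class ContextBus:
    def __init__(self) -> None:
        self._handlers: dict[str, list[Callable[[str], None]]] = defaultdict(list)
        self.log: list[tuple[datetime, str, str]] = []   # kept for tacit-information mining

    def on_change(self, kind: str, handler: Callable[[str], None]) -> None:
        """Register a trigger to run whenever context of this kind changes."""
        self._handlers[kind].append(handler)

    def publish(self, kind: str, value: str) -> None:
        """Signal a context change: log it and run any immediate triggers."""
        self.log.append((datetime.now(), kind, value))
        for handler in self._handlers[kind]:
            handler(value)


def forward_phone(location: str) -> None:
    print(f"Forwarding calls to the phone nearest {location}")


bus = ContextBus()
bus.on_change("location", forward_phone)
bus.publish("location", "conference room")   # fires the trigger and logs the event
```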
Context Implementation Issues
- Apps need to share context easily
  - Built into apps, like cut & paste
  - A context "dial tone," i.e., shared infrastructure (see the sketch after this list)
- Global file system
  - Easier to share context without having to transfer it
  - Just use pointers
- How to search / browse?
  - Computers are good at searching large spaces
  - Humans are good at making associations
- Why not search with Google instead of the browser history?
  - Google is easier to get at and seems to work well
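A sketch under stated assumptions: a shared context service (the "dial tone") stores each context record once and hands applications pointers (IDs) rather than copies, in the spirit of a global file system. ContextService and ContextRecord are invented for illustration.

```python
import uuid
from dataclasses import dataclass, field


@dataclass
class ContextRecord:
    who: list[str]                                   # people present
    where: str                                       # place
    docs: list[str] = field(default_factory=list)    # documents tied to this context


class ContextService:
    """Shared store; apps exchange record IDs instead of transferring the data."""

    def __init__(self) -> None:
        self._records: dict[str, ContextRecord] = {}

    def put(self, record: ContextRecord) -> str:
        pointer = str(uuid.uuid4())
        self._records[pointer] = record
        return pointer

    def get(self, pointer: str) -> ContextRecord:
        return self._records[pointer]

    def find(self, person: str, place: str) -> list[ContextRecord]:
        """Associative search: 'I was in a meeting with Ken at Lake Tahoe.'"""
        return [r for r in self._records.values()
                if person in r.who and place == r.where]


svc = ContextService()
svc.put(ContextRecord(who=["Ken", "James"], where="Lake Tahoe", docs=["meeting-notes"]))
print(svc.find("Ken", "Lake Tahoe")[0].docs)   # -> ['meeting-notes']
```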
Context Toolkits / APIs / Refs
- Bill Schilit's Columbia / PARC Ph.D.
- Georgia Tech GVU (Anind Dey)
- IBM (Maria Ebling)
- MIT (?)
- ESPRIT projects have looked at context
  - A German(?) project, according to Elin Pedersen
Interface Between Context & Multimodal UIs
- "Context is just another kind of input"
  - Different to the user, but similar to the system
  - User input is caused by an EXPLICIT user action
  - Context is IMPLICIT or DERIVED
- Context and multimodal UIs have similar privacy problems
  - Natural inputs are human-"readable"
  - I may not want to share my context or my input
Interface Between Context & Multimodal UIs (continued)
- Context to choose the output modality (see the sketch after this list)
  - e.g., the user is in a meeting, so don't use speech
- Context to disambiguate input(s)
  - Help fusion: there is noise, so don't rely on speech
  - "The clutching problem": infer user intent
- Modality used to help infer context
  - e.g., talking to the device -> the user is alone?
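An illustrative sketch (not from the slides) of both directions discussed above: context choosing the output modality, and context re-weighting noisy input modalities during fusion. The Context fields and the thresholds are assumptions.

```python
from dataclasses import dataclass


@dataclass
class Context:
    in_meeting: bool
    ambient_noise_db: float


def choose_output_modality(ctx: Context) -> str:
    """If the user is in a meeting, avoid speech output."""
    return "visual" if ctx.in_meeting else "speech"


def fusion_weights(ctx: Context) -> dict[str, float]:
    """In a noisy environment, rely less on speech and more on pen/gesture input."""
    if ctx.ambient_noise_db > 70.0:
        return {"speech": 0.2, "pen": 0.8}
    return {"speech": 0.6, "pen": 0.4}


ctx = Context(in_meeting=True, ambient_noise_db=78.0)
print(choose_output_modality(ctx))   # -> visual
print(fusion_weights(ctx))           # -> {'speech': 0.2, 'pen': 0.8}
```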
Initial Design for a Multimodal UI Design Tool
- Create "rough cuts"
  - Informal (sketching / "Wizard of Oz")
  - Iterative design (user testing / fast modifications)
- Infer models from the design
  - The designer can augment the model over time
- Generate initial prototypes
  - UIs for multiple devices
  - The designer adds detail / improves the UI
    - or even removes detail
(slide diagram label: "Model")