1 Stanford HCI Group / CS376: Research Topics in Human-Computer Interaction
http://cs376.stanford.edu
Multimodal Interfaces — Scott Klemmer, 15 November 2005

2 Some HCI definitions (09 November 2004)
- Multimodal generally refers to an interface that can accept input from two or more combined modes
- Multimedia generally refers to an interface that produces output in two or more modes
- The vast majority of multimodal systems have been speech + pointing (pen or mouse) input, with graphical (and sometimes voice) output
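As a concrete illustration of combining speech and pointing input, here is a minimal "put-that-there"-style late-fusion sketch. The event types, the fusion window, and the rule of aligning deictic words to pointing events by temporal order are all simplifying assumptions for illustration; they are not from the slides.

```python
from dataclasses import dataclass

@dataclass
class SpeechEvent:
    text: str        # recognized utterance, e.g. "put that there"
    timestamp: float  # seconds

@dataclass
class PointEvent:
    x: float
    y: float
    timestamp: float

def fuse(speech, points, window=1.5):
    """Resolve each deictic word ("this", "that", "here", "there") in the
    utterance to a pointing event near it in time, within `window` seconds.
    Deictics and points are paired in temporal order - a crude stand-in for
    real word-level timing."""
    deictics = [w for w in speech.text.split()
                if w in ("this", "that", "here", "there")]
    candidates = sorted(
        (p for p in points if abs(p.timestamp - speech.timestamp) <= window),
        key=lambda p: p.timestamp)
    return [(word, (p.x, p.y)) for word, p in zip(deictics, candidates)]

# "put that there" with two pointing gestures around the utterance:
s = SpeechEvent("put that there", 10.0)
pts = [PointEvent(5, 6, 10.4), PointEvent(1, 2, 9.8)]
resolved = fuse(s, pts)  # "that" -> earlier point, "there" -> later point
```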

3 Canonical app: maps
- Why are maps so well-suited?
- A visual artifact for computation (Hutchins)

4 What is an interface?
- Is it an interface if there's no method for a user to tell whether they've done something? What might an example be?
- Is it an interface if there's no method for explicit user input? Example: health-monitoring apps

5 Sensor fusion
- Multimodal = multiple human channels
- Sensor fusion = multiple sensor channels
- Example app: tracking people (one human channel) might use RFID + vision + keyboard activity + …
- I disagree with the Oviatt paper: speech + lips is sensor fusion, not multimodality
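The people-tracking example above combines several sensor channels into one estimate. One common way to do this (an assumption here; the slide does not name a method) is naive-Bayes-style log-odds fusion, treating each sensor's report as conditionally independent evidence:

```python
import math

def fuse_presence(readings):
    """Fuse per-sensor estimates of P(person present | sensor reading)
    into one probability, assuming a uniform prior and conditional
    independence across sensors (the naive Bayes assumption)."""
    log_odds = 0.0
    for p in readings.values():
        p = min(max(p, 1e-6), 1 - 1e-6)  # clamp to avoid log(0)
        log_odds += math.log(p / (1 - p))
    return 1 / (1 + math.exp(-log_odds))

# Three weak-to-moderate channels agree, so the fused estimate is
# stronger than any single one:
fused = fuse_presence({"rfid": 0.7, "vision": 0.6, "keyboard": 0.9})
```

Each channel that leans toward "present" adds positive log-odds, so agreement compounds; a single uninformative sensor (p = 0.5) contributes nothing.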

6 What constitutes a modality?
- To some extent, it's a matter of semantics
- Is a pen a different modality than a mouse?
- Are two mice different modalities if one controls a GUI and the other controls a tablet-like UI?
- Is a captured modality the same as an input modality? How does the Audio Notebook fit into this?

7 Input modalities
- Mouse
- Pen: recognized or unrecognized
- Speech
- Non-speech audio
- Tangible object manipulation
- Gaze, posture, body tracking
- Each of these experiences has different implementing technologies, e.g. gaze tracking could be laser-based or vision-based

8 Output modalities
- Visual displays: raster graphics, oscilloscope, paper printer, …
- Haptics: force feedback
- Audio
- Smell
- Taste

9 Dual-Purpose Speech

10 Why multimodal?
- Hands busy / eyes busy
- Mutual disambiguation
- Faster input
- "More natural"
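Mutual disambiguation, listed above, means each recognizer's errors can be corrected by the other's evidence. A minimal sketch: jointly rescore the n-best lists from two recognizers, keeping only semantically compatible pairs. The hypotheses, scores, and compatibility rule below are invented for illustration.

```python
def mutually_disambiguate(speech_nbest, gesture_nbest, compatible):
    """Pick the (speech, gesture) hypothesis pair that maximizes the
    product of recognizer scores, restricted to compatible pairs.
    Returns None if no pair is compatible."""
    best, best_score = None, -1.0
    for s_hyp, s_score in speech_nbest:
        for g_hyp, g_score in gesture_nbest:
            if compatible(s_hyp, g_hyp) and s_score * g_score > best_score:
                best, best_score = (s_hyp, g_hyp), s_score * g_score
    return best

# The speech recognizer slightly prefers the wrong transcript, but the
# gesture (a selection of a ship object) rules it out:
speech_nbest = [("move sheep", 0.55), ("move ship", 0.45)]
gesture_nbest = [("ship-3", 0.7), ("pen-1", 0.3)]

def compatible(s, g):
    # Toy rule: a ship-selection gesture requires "ship" in the utterance.
    return ("ship" in s) == g.startswith("ship")

choice = mutually_disambiguate(speech_nbest, gesture_nbest, compatible)
```

Here the top speech hypothesis alone would be wrong; combining modes recovers the intended command, which is exactly the mutual-disambiguation effect.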

11 On anthropomorphism
- The multimodal community grew out of the AI and speech communities
- Should human communication with computers be as similar as possible to human-human communication?

12 Multimodal software architectures
- OAA, AAA, OOPS
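The slide only names these architectures. As a rough intuition for the facilitator/agent style that OAA (Open Agent Architecture) popularized, here is a toy hub where agents register capabilities and requests are routed to whichever agents claim them. The class and method names are illustrative only; this is not the real OAA API.

```python
class Facilitator:
    """Toy facilitator: agents register handlers for named capabilities;
    a request for a capability is dispatched to every registered handler,
    and all their answers are collected."""

    def __init__(self):
        self.providers = {}  # capability name -> list of handler callables

    def register(self, capability, handler):
        self.providers.setdefault(capability, []).append(handler)

    def solve(self, capability, *args):
        return [h(*args) for h in self.providers.get(capability, [])]

# Two "recognizer agents" offer the same capability; the facilitator
# fans the request out to both:
hub = Facilitator()
hub.register("recognize", lambda audio: audio.upper())
hub.register("recognize", lambda audio: audio[::-1])
answers = hub.solve("recognize", "hello")
```

The appeal of this style for multimodal systems is loose coupling: recognizers for different modalities can be added or swapped without the requesting code knowing which agents exist.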

