Input and Interaction
Dr. Yingcai Xiao
A good user interface allows users to perform interaction tasks with ease and joy.
WYSIWYG: what you see is what you get.
Four basic interaction tasks: position, select, quantify, text.
Basic design principle: look and feel.
Hardware Locators:
relative devices: mice, trackballs, joysticks
absolute devices: data tablets, touch screens, Kinect
direct devices: light pens, touch screens
indirect devices: mice, trackballs, joysticks
continuous devices: mice, trackballs, joysticks
discrete devices: control keys
Keyboards:
QWERTY (layout slows down typing)
Dvorak (keys ordered by frequency of use)
alphabetic order
Valuators:
bounded: volume control on a radio
unbounded: clock
Choice Devices:
function keys
foot switches
Haptic Devices:
pressure-sensitive stylus
force-feedback controls
3D Interaction Devices:
joysticks with a shaft that twists for a third dimension
Kinect 3D camera
Polhemus 3D sensors
VR (virtual reality): immersive, head-mounted displays, data gloves
The Evolution of User Interfaces:
CLI (Command Line Interface): keyboard
GUI (Graphical User Interface): mouse
NUI (Natural User Interface): Kinect sensor (A/V)
Basic Interaction Tasks (BITs):
position: by pointing (graphics)
selection: by name (database) or by pointing; in a GUI: hierarchical pull-down menus, radio buttons, e.g. Format->Paragraph…
text interaction: keyboard --> text string
quantify interaction: dials, sliders
3D interaction tasks: Z value via multiple views or shift + button-down; gestures; voice
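The sketch below is a minimal Java Swing example (class, label, and layout choices are illustrative, not from the slides) of three of these basic interaction tasks handled as toolkit events: position via a mouse click, quantify via a slider, and text via a text field.

    // Minimal sketch of three basic interaction tasks in Java Swing:
    // position (mouse click), quantify (slider), text (text field).
    import javax.swing.*;
    import java.awt.*;
    import java.awt.event.*;

    public class BitDemo {
        public static void main(String[] args) {
            SwingUtilities.invokeLater(() -> {
                JFrame frame = new JFrame("Basic Interaction Tasks");
                JLabel status = new JLabel("Interact with the window");

                // Position task: report where the user points and clicks.
                JPanel canvas = new JPanel();
                canvas.setPreferredSize(new Dimension(300, 150));
                canvas.addMouseListener(new MouseAdapter() {
                    @Override public void mouseClicked(MouseEvent e) {
                        status.setText("Position: (" + e.getX() + ", " + e.getY() + ")");
                    }
                });

                // Quantify task: a bounded valuator (slider from 0 to 100).
                JSlider slider = new JSlider(0, 100, 50);
                slider.addChangeListener(e ->
                    status.setText("Quantify: " + slider.getValue()));

                // Text task: keyboard input becomes a text string.
                JTextField text = new JTextField(15);
                text.addActionListener(e ->
                    status.setText("Text: " + text.getText()));

                JPanel controls = new JPanel();
                controls.add(slider);
                controls.add(text);

                frame.add(controls, BorderLayout.NORTH);
                frame.add(canvas, BorderLayout.CENTER);
                frame.add(status, BorderLayout.SOUTH);
                frame.pack();
                frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
                frame.setVisible(true);
            });
        }
    }

Each listener is a callback registered with the toolkit; the toolkit converts raw device input into events and routes them to the program.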
Composite Interaction Tasks (CITs):
dialogue boxes
rubber-banding
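As a sketch of rubber-banding (again assuming Java Swing; the class and variable names are illustrative), the fragment below anchors one corner of a rectangle on mouse-press and redraws the stretched rectangle on every drag event, giving continuous visual feedback.

    // Minimal rubber-banding sketch: press to anchor, drag to stretch.
    import javax.swing.*;
    import java.awt.*;
    import java.awt.event.*;

    public class RubberBandPanel extends JPanel {
        private Point anchor;       // corner fixed at mouse-press
        private Rectangle band;     // rectangle being stretched

        public RubberBandPanel() {
            setPreferredSize(new Dimension(400, 300));
            MouseAdapter handler = new MouseAdapter() {
                @Override public void mousePressed(MouseEvent e) {
                    anchor = e.getPoint();            // start the rubber band
                    band = new Rectangle(anchor);
                }
                @Override public void mouseDragged(MouseEvent e) {
                    // Recompute the rectangle between the anchor and the cursor.
                    band.setFrameFromDiagonal(anchor, e.getPoint());
                    repaint();                        // immediate feedback
                }
            };
            addMouseListener(handler);
            addMouseMotionListener(handler);
        }

        @Override protected void paintComponent(Graphics g) {
            super.paintComponent(g);
            if (band != null) {
                g.drawRect(band.x, band.y, band.width, band.height);
            }
        }

        public static void main(String[] args) {
            SwingUtilities.invokeLater(() -> {
                JFrame frame = new JFrame("Rubber-banding");
                frame.add(new RubberBandPanel());
                frame.pack();
                frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
                frame.setVisible(true);
            });
        }
    }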
Interaction-Technique Toolkits:
get user input (control) ==> events
produce graphical output (feedback) ==> display
graphics device interfaces (GDI): X Window Toolkit (UNIX), Windows API (PC), Java Swing
standard APIs for input hardware: not yet available; standards are needed
NUI: OpenNI, Microsoft Kinect SDK
haptics: OpenHaptics Toolkit
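A minimal sketch of the control/feedback pattern such toolkits provide, once more using Java Swing with illustrative names: the button press arrives as an event (control), and the registered callback updates the display (feedback).

    // Control ==> event ==> callback ==> feedback on the display.
    import javax.swing.*;
    import java.awt.BorderLayout;
    import java.awt.event.ActionEvent;

    public class ToolkitPattern {
        public static void main(String[] args) {
            SwingUtilities.invokeLater(() -> {
                JFrame frame = new JFrame("Control and Feedback");
                JLabel feedback = new JLabel("Clicks: 0");
                JButton control = new JButton("Click me");

                final int[] clicks = {0};
                // Control: the toolkit turns the button press into an event
                // and dispatches it to this registered callback.
                control.addActionListener((ActionEvent e) -> {
                    clicks[0]++;
                    // Feedback: the program changes what is displayed.
                    feedback.setText("Clicks: " + clicks[0]);
                });

                frame.add(control, BorderLayout.CENTER);
                frame.add(feedback, BorderLayout.SOUTH);
                frame.pack();
                frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
                frame.setVisible(true);
            });
        }
    }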