Trends in HCI
Mark W. Newman (mwnewman@umich.edu)
University of Michigan School of Information
Human Factors Engineering Short Course, 2011
What does HCI study? People, Design, Task, Technology
What do people do? How can we support them?
How do people interact? What are the human constraints?
How do advances in technology expand the kinds of tasks that can be supported?
How do we use our understanding of people, their tasks, and technical capabilities to design better systems?
People Trends: Wider Social Context
From classic HCI to CSCW and social computing: the scope of "people" under study keeps widening.
Task Trends: New Contexts, New Activities
Tech Trends: Bigger, Smaller, More Pervasive
Bigger: walls, tables
Smaller: mobile, wearables
Multiple devices: multi-display environments, tangible computing
Context-aware computing: worn, carried, or embedded sensing used as an interaction input
Sidebar: How long do innovations take from concept to widespread adoption?
[Fenn10]
Sidebar: Adoption of the Mouse
From concept to mass market: 30 years
1965: Doug Engelbart creates the first mouse
1969: Engelbart's "mother of all demos"
1973: Work begins on the Xerox Alto
1978: Xerox Star goes to "market"
1984: Apple Macintosh launched
1995: Windows 95 launched
Adapted from [Buxton07]
HCI Technology Drivers
Computing: cheaper, smaller, faster
Networking: ubiquitous and faster
Displays: cheaper and bigger; cheaper and smaller
Sensing: cheaper and smaller; ubiquitous
Large Displays: Multitouch
Video: large multitouch screen interaction
Bimanual direct manipulation
Multiple simultaneous users [Han05]
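As a concrete illustration of what multi-point input enables, here is a minimal, self-contained sketch (not code from [Han05]; the function name and example values are invented for illustration) of how an application layer can turn two tracked touch points into a direct-manipulation scale-and-rotate gesture:

```python
import math

def two_finger_transform(p1_start, p2_start, p1_now, p2_now):
    """Derive scale and rotation from two tracked touch points.

    Each argument is an (x, y) tuple. Scale is the ratio of the current
    finger spread to the starting spread; rotation is the change in the
    angle of the line joining the two fingers, in radians.
    """
    def spread(a, b):
        return math.hypot(b[0] - a[0], b[1] - a[1])

    def angle(a, b):
        return math.atan2(b[1] - a[1], b[0] - a[0])

    scale = spread(p1_now, p2_now) / max(spread(p1_start, p2_start), 1e-6)
    rotation = angle(p1_now, p2_now) - angle(p1_start, p2_start)
    return scale, rotation

# Example: the fingers move apart and twist slightly.
print(two_finger_transform((0, 0), (100, 0), (-10, 0), (120, 20)))
```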
Very Large Displays: OptiPortals
30–100 million pixels; used for scientific visualization. [Pieper09]
Large Displays: Tabletops
Encourage multi-user interaction
Integrate with new environments
Raise new issues: multi-user coordination, rotation independence, private/public data sharing
Tabletops are interesting because they are more democratic: no perspective is privileged and everything is within reach. It is no coincidence that tables feature prominently in nearly every meeting room, dining room, and restaurant; they are a natural form for bringing people together. What happens when the table becomes interactive? It is not just a wall-mounted display tipped over: new interactions are needed, and new contexts are supported. Commercial vendors: Circle Twelve, SMART Technologies. [ITS10]
Large Displays: Applications
Personal workspaces
Group sharing
Tight collaboration
Peripheral information
Large Displays: Usability Issues
Control from a distance
Losing track of the cursor
Window management
Task management
Configuration problems
Leveraging the periphery
Multi-user coordination
Large Display Input: Selecting and Directing
Drag-and-Pop: extending familiar interactions to a new platform. A general problem with large displays: how do you work with things that are far apart? [Baudisch03]
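A rough sketch of the idea behind drag-and-pop, in which distant drop targets temporarily get proxy copies near the cursor when a drag begins; the distance threshold and proxy-ring radius below are illustrative values, not parameters from [Baudisch03]:

```python
import math

# When a drag begins, far-away drop targets get temporary proxies placed on a
# ring around the cursor, in the direction of their real positions, so the
# spatial layout is preserved. Threshold and radius are illustrative only.

FAR_THRESHOLD = 600.0   # targets farther than this get a proxy (pixels)
PROXY_RADIUS = 120.0    # proxies appear on a ring this far from the cursor

def make_proxies(drag_origin, targets):
    """Return (target, proxy_position) pairs for far-away targets."""
    ox, oy = drag_origin
    proxies = []
    for tx, ty in targets:
        dx, dy = tx - ox, ty - oy
        dist = math.hypot(dx, dy)
        if dist > FAR_THRESHOLD:
            px = ox + PROXY_RADIUS * dx / dist
            py = oy + PROXY_RADIUS * dy / dist
            proxies.append(((tx, ty), (px, py)))
    return proxies

# Example: dragging from the lower-left region of a wide wall display.
print(make_proxies((200, 1000), [(3600, 200), (400, 900)]))
```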
Large Display Input: Cameraphones
Most people already have an interactive device with them
Use the built-in capabilities of "personal interactors" to control public devices
Selection and control at a distance [Ballagas05]
Large Display Interaction: Sensing Distance
Use precise location sensing to determine the user's proximity; offer different functionality based on distance. [Vogel04]
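The core mechanism is a mapping from sensed distance to interaction behavior. A minimal sketch follows; the phase names echo [Vogel04], but the distance thresholds are invented for illustration:

```python
# Proximity-dependent behavior in the spirit of [Vogel04], whose public
# ambient displays move through phases (ambient, implicit, subtle, personal)
# as a user approaches. The thresholds below are made up for illustration.

PHASES = [
    (4.0, "ambient display"),       # > 4 m: public, glanceable content
    (2.5, "implicit interaction"),  # 2.5-4 m: react to presence and movement
    (1.0, "subtle interaction"),    # 1-2.5 m: show options, allow gestures
    (0.0, "personal interaction"),  # < 1 m: direct touch, private details
]

def phase_for_distance(distance_m: float) -> str:
    """Map a sensed user-to-display distance (meters) to an interaction phase."""
    for threshold, phase in PHASES:
        if distance_m > threshold:
            return phase
    return PHASES[-1][1]

for d in (6.0, 3.0, 1.5, 0.4):
    print(f"{d:.1f} m -> {phase_for_distance(d)}")
```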
Tabletop Interaction: Rotation and Personal Space
There is no longer a "privileged" vantage point. Users' content needs to be distinguished, especially when sharing. [Shen06]
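One common response to the missing privileged viewpoint is to orient content toward whoever is closest. The sketch below illustrates that policy in a few lines; it is not code from [Shen06], and the table geometry and seat positions are made up:

```python
import math

# Orient a tabletop item so that its top edge faces the nearest seated user.

def orient_toward_nearest(item_xy, users_xy):
    """Return the rotation (radians) that points the item's top at the
    closest user position. 0 means the item's top faces the +y direction."""
    ix, iy = item_xy
    ux, uy = min(users_xy, key=lambda u: math.hypot(u[0] - ix, u[1] - iy))
    # Angle from the item to that user, measured from the +y axis.
    return math.atan2(ux - ix, uy - iy)

# Example: two users on opposite long sides of a 1.5 m x 1.0 m table (meters).
users = [(0.75, 0.0), (0.75, 1.0)]
print(math.degrees(orient_toward_nearest((0.3, 0.8), users)))  # rotates toward the user at (0.75, 1.0)
```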
Small Devices: NanoTouch
Video: [Baudisch09]
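A minimal sketch of the coordinate mapping that back-of-device input requires, assuming the rear touch pad reports positions in its own as-seen-from-behind frame (so the x axis must be mirrored to line up with the front display); this is a geometric illustration, not code from [Baudisch09]:

```python
# Back-of-device interaction (NanoTouch) keeps tiny screens unoccluded by
# sensing touch on the rear and drawing a pointer on the front. Assumption:
# the rear pad reports coordinates as seen from behind the device.

def back_touch_to_front(x_back: float, y_back: float, width: float) -> tuple:
    """Map a rear-pad touch to front-screen coordinates (same units as width)."""
    return (width - x_back, y_back)

# Example: a touch 5 mm from the rear pad's left edge on a 30 mm-wide device
# appears 5 mm from the right edge as seen on the front.
print(back_touch_to_front(5.0, 12.0, width=30.0))  # -> (25.0, 12.0)
```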
Wearable Computing [Mann97]
A lot less geeky than it used to be: "normal" people might actually wear the gear in the lower right. Mann described this work, "a first step toward Personal Imaging" (IEEE Computer, Vol. 30, No. 3), as a summary of his 20 years as a "photographic cyborg".
Wearable Computing: Skinput
Video: [Harrison10]
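Skinput localizes taps by classifying bio-acoustic signals from an armband of sensors. The toy nearest-centroid classifier below only illustrates the "features in, tap location out" shape of that problem; the feature values and locations are invented, and the real system in [Harrison10] uses trained machine-learning models over many acoustic features:

```python
import math

# Hypothetical per-location centroids of a 3-dimensional acoustic feature
# vector (e.g., energy in three frequency bands), invented for illustration.
CENTROIDS = {
    "wrist":   (0.9, 0.3, 0.1),
    "forearm": (0.5, 0.6, 0.2),
    "elbow":   (0.2, 0.4, 0.7),
}

def classify_tap(features):
    """Return the tap location whose centroid is closest to the feature vector."""
    return min(CENTROIDS, key=lambda loc: math.dist(features, CENTROIDS[loc]))

print(classify_tap((0.8, 0.35, 0.15)))  # -> "wrist"
```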
Small Device Usability Issues
Information density
Selection target size
Text entry
Gesture set size
Gesture set learnability
Integration with larger devices
Multimodal interaction
Context-Aware Computing: The Active Badge (1992)
[Want92]
Context-Aware Computing: Cyberguide (1997)
The first (of many) location-aware tour guides. The client detects its position through GPS or IR beacons. [Abowd97]
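A small sketch of the position-source arbitration a Cyberguide-style client needs: prefer a recently seen indoor IR beacon, fall back to GPS outdoors. The data types, beacon IDs, room names, and timeout are assumptions for illustration, not details from [Abowd97]:

```python
from dataclasses import dataclass
from typing import Optional

# Indoor IR beacon sightings map to known rooms; GPS is used when no recent
# beacon has been seen. All constants below are illustrative.

BEACON_ROOMS = {0x1A: "Lobby", 0x2B: "Demo Room", 0x3C: "Atrium"}
BEACON_TIMEOUT_S = 10.0

@dataclass
class BeaconSighting:
    beacon_id: int
    timestamp: float

def current_position(now: float,
                     last_beacon: Optional[BeaconSighting],
                     gps_fix: Optional[tuple]) -> str:
    """Prefer a fresh IR beacon (indoors), else GPS (outdoors), else unknown."""
    if last_beacon and now - last_beacon.timestamp < BEACON_TIMEOUT_S:
        room = BEACON_ROOMS.get(last_beacon.beacon_id, "unmapped room")
        return f"indoors: {room}"
    if gps_fix:
        return f"outdoors: lat/lon {gps_fix}"
    return "position unknown"

print(current_position(105.0, BeaconSighting(0x2B, 100.0), (33.775, -84.396)))
print(current_position(130.0, BeaconSighting(0x2B, 100.0), (33.775, -84.396)))
```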
Location-Aware Computing
Now in stores! Location-awareness has entered the mainstream
Location-Aware Applications: Navigation for the Visually Impaired
Video: [Stewart08]
Activity-Aware Computing: UbiFit Garden
Inferred activities: walking, running, cycling, playing soccer, doing yoga, … [Consolvo08]
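UbiFit infers these activities from a worn sensing platform using trained statistical models. The hand-written rule below is not that model; it only sketches the shape of the inference step (windowed sensor features in, activity label out), with invented features and thresholds:

```python
# Toy activity labeling over per-window features; real systems like UbiFit
# learn these decisions from labeled sensor data rather than fixed rules.

def classify_window(step_rate_hz: float, accel_variance: float, speed_mps: float) -> str:
    """Map crude per-window features to an activity label (illustrative thresholds)."""
    if speed_mps > 4.0 and step_rate_hz < 0.5:
        return "cycling"          # fast movement with few footfalls
    if step_rate_hz > 2.3:
        return "running"
    if step_rate_hz > 1.2:
        return "walking"
    if accel_variance > 2.0:
        return "other exercise"   # soccer or yoga would need a richer model
    return "sedentary"

for features in [(1.6, 0.8, 1.3), (2.8, 3.1, 3.0), (0.1, 0.5, 6.5)]:
    print(features, "->", classify_window(*features))
```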
Activity-Aware Computing: The CareNet Display
Monitor a loved one's activity
Broadcast awareness to the support network
Support for aging in place [Consolvo04]
Activity-Awareness: Detecting Interruptibility
[Fogarty07]
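[Fogarty07] describes toolkit support for building sensor-based statistical models of situations such as interruptibility. The snippet below sketches the idea with a hand-weighted logistic score over a few plausible features; the features and weights are illustrative assumptions, whereas real deployments learn them from labeled sensor data:

```python
import math

# Hand-weighted logistic score for "is this person interruptible right now?"
# Features and weights are invented; a deployed model would be trained.

WEIGHTS = {
    "speech_detected": -1.8,   # talking strongly predicts "do not interrupt"
    "keyboard_active": -0.6,
    "door_open": +0.9,
    "calendar_free": +1.1,
}
BIAS = 0.4

def p_interruptible(features: dict) -> float:
    """Probability-like score that the person is interruptible right now."""
    z = BIAS + sum(WEIGHTS[name] * float(value) for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

print(p_interruptible({"speech_detected": 1, "keyboard_active": 1,
                       "door_open": 0, "calendar_free": 0}))
print(p_interruptible({"speech_detected": 0, "keyboard_active": 0,
                       "door_open": 1, "calendar_free": 1}))
```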
Context-Aware Computing: HCI Research Issues
Information sharing and privacy
Configuring and monitoring implicit interactions
Intelligibility of system inference
Integration of personal, public, and infrastructure devices
Training and personalization
Tech Trends (recap): Bigger, Smaller, More Pervasive
Bigger: walls, tables
Smaller: mobile, wearables
Multiple devices: multi-display environments, tangible computing
Context-aware computing: worn, carried, or embedded sensing used as an interaction input
Summary
HCI evolves: a wider focus on "people", a wider focus on "tasks", and new technical capabilities.
New technology drives new interactions, new tasks, and new questions about human needs and capabilities.
References
[Abowd97] G.D. Abowd, C.G. Atkeson, J. Hong, S. Long, R. Kooper, and M. Pinkerton, "Cyberguide: A mobile context-aware tour guide," Wireless Networks, vol. 3, 1997.
[Ballagas05] R. Ballagas, M. Rohs, and J.G. Sheridan, "Sweep and point and shoot: Phonecam-based interactions for large public displays," CHI '05 Extended Abstracts on Human Factors in Computing Systems, Portland, OR, USA: ACM, 2005.
[Baudisch03] P. Baudisch, E. Cutrell, D. Robbins, and M. Czerwinski, "Drag-and-pop and drag-and-pick: Techniques for accessing remote screen content on touch- and pen-operated systems," Proceedings of INTERACT '03 (IFIP TC13 International Conference on Human-Computer Interaction), Zurich, Switzerland, September 1-5, 2003, p. 57.
[Baudisch09] P. Baudisch and G. Chu, "Back-of-device interaction allows creating very small touch devices," Proceedings of the 27th International Conference on Human Factors in Computing Systems (CHI '09), Boston, MA, USA: ACM, 2009.
[Buxton07] W. Buxton, Sketching User Experiences: Getting the Design Right and the Right Design, Morgan Kaufmann, 2007.
[Consolvo04] S. Consolvo, P. Roessler, and B.E. Shelton, "The CareNet Display: Lessons learned from an in-home evaluation of an ambient display," Proceedings of the 6th International Conference on Ubiquitous Computing (UbiComp '04), 2004.
[Consolvo08] S. Consolvo, D.W. McDonald, T. Toscos, M.Y. Chen, J. Froehlich, B. Harrison, P. Klasnja, A. LaMarca, L. LeGrand, R. Libby, I. Smith, and J.A. Landay, "Activity sensing in the wild: A field trial of UbiFit Garden," Proceedings of the 26th Annual SIGCHI Conference on Human Factors in Computing Systems (CHI '08), Florence, Italy: ACM, 2008.
[Fenn10] J. Fenn, "2010 Emerging Technologies Hype Cycle is Here," Mastering the Hype Cycle blog, September 7, 2010.
[Fogarty07] J. Fogarty and S.E. Hudson, "Toolkit support for developing and deploying sensor-based statistical models of human situations," Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '07), San Jose, CA, USA: ACM, 2007.
[Han05] J.Y. Han, "Low-cost multi-touch sensing through frustrated total internal reflection," Proceedings of the 18th Annual ACM Symposium on User Interface Software and Technology (UIST '05), Seattle, WA, USA: ACM, 2005.
[Harrison10] C. Harrison, D. Tan, and D. Morris, "Skinput: Appropriating the body as an input surface," Proceedings of the 28th International Conference on Human Factors in Computing Systems (CHI '10), Atlanta, GA, USA: ACM, 2010.
[ITS10] ACM Interactive Tabletops and Surfaces Conference (ITS 2010), Saarbrücken, Germany, November 7-10, 2010.
[Mann97] S. Mann, "Wearable computing: A first step toward personal imaging," IEEE Computer, vol. 30, no. 3, 1997.
[Pieper09] G.W. Pieper et al., "Visualizing science: The OptIPuter Project," SciDAC Review, no. 12, Spring 2009.
[Shen06] C. Shen, K. Ryall, C. Forlines, A. Esenther, K. Everitt, M. Hancock, M.R. Morris, F. Vernier, D. Wigdor, and M. Wu, "Interfaces, interaction techniques and user experience on direct-touch horizontal surfaces," IEEE Computer Graphics and Applications, September/October 2006.
[Stewart08] J. Stewart, S. Bauman, M. Escobar, J. Hilden, K. Bihani, and M.W. Newman, "Accessible contextual information for urban orientation," Proceedings of the 10th International Conference on Ubiquitous Computing (UbiComp 2008), Seoul, Korea, 2008.
[Vogel04] D. Vogel and R. Balakrishnan, "Interactive public ambient displays: Transitioning from implicit to explicit, public to personal, interaction with multiple users," Proceedings of the 17th Annual ACM Symposium on User Interface Software and Technology (UIST '04), Santa Fe, NM, USA: ACM, 2004.
[Want92] R. Want, A. Hopper, V. Falcão, and J. Gibbons, "The Active Badge location system," ACM Transactions on Information Systems, vol. 10, 1992.