Presentation transcript: "user-defined gestures for surface computing"

1 user-defined gestures for surface computing
jacob o. wobbrock, university of washington
meredith ringel morris & andrew d. wilson, microsoft research

2 motivation

3 real life gestures?

4 video

5 i can think of a few gestures
but… what do non-actors, non-researchers, “everyday people” do? beyond finger painting and resizing photos? “The user is not like me.”

6 contributions
quant. & qual. characterization of users’ gestures
taxonomy of surface gestures
user-defined gesture set
implications for surface computing technology
mental model insights

7 related work

8 studies of discursive gesture (efron 1941, kendon 1988, mcneill 1992)
studies of table & object use (tang 1991, liu et al. 2006)
speech and gestures (mignot et al. 1993, robbe 1998, robbe-reiter et al. 2000, beringer 2002, tse et al. 2006)
eliciting input from users (epps et al. 2006, good et al. 1984, nielsen et al. 2004, wobbrock et al. 2005)

9 room planner (wu & balakrishnan 2003)
registration, relaxation, reuse (wu et al. 2006)
(moscovich & hughes 2006)
walls from a distance (malik et al. 2005)
The gestures defined by designers and system builders were often chosen based on what was “cool” or easy to recognize. In contrast, we wanted to see what gestures everyday people made.

10 smartskin (rekimoto 2002)
cooperative gestures (morris et al. 2006)
surface physics (wilson et al. 2008)
under the table (wigdor et al. 2006)

11 method

12 signs, referents
the word “dog” is the sign (or symbol); the dog it refers to is the referent

13 referents

14 guessability (wobbrock et al. 2005): for one referent, participants propose signs such as “dog” and “god”

15 guessability: for another referent, the proposed signs include “god”, “dog”, “ram”, and “yak”

16 conflict: the same sign can be proposed for different referents. here “dog” is proposed three times for the first referent and once for the second, so the larger group wins and “dog” is assigned to the first referent

17 conflict: for the second referent, “god” wins, and the non-conflicting “ram” and “yak” proposals are also kept; its lone “dog” proposal loses to the first referent’s larger “dog” group
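To make the winner-take-all rule concrete, here is a minimal sketch in Python (not the authors' code; the referent labels "A" and "B" and the proposal lists are the hypothetical example from these slides):

```python
from collections import Counter

# hypothetical proposals from the slides: the sign each of five
# participants proposed, for two referents (labels "A"/"B" are ours)
proposals = {
    "A": ["dog", "dog", "dog", "god", "god"],
    "B": ["god", "god", "dog", "ram", "yak"],
}

# each referent is assigned the sign proposed for it most often; when the
# same sign is claimed by two referents, the larger group keeps it
winners = {r: Counter(signs).most_common(1)[0] for r, signs in proposals.items()}
print(winners)  # {'A': ('dog', 3), 'B': ('god', 2)}
```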

18 coverage: the percentage of guesses that are retained in the final symbol set
the first referent keeps its 3 “dog” proposals; the second keeps “god” (2/5), “ram” (1/5), and “yak” (1/5) but loses its conflicting “dog” (1/5)
retained: 3 + 4 = 7 of 10 guesses = 70%
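Continuing the same hypothetical example, a short sketch that reproduces the 70% coverage figure; treating a proposal as conflicting when it matches a sign assigned to a different referent is our reading of the slide:

```python
proposals = {
    "A": ["dog", "dog", "dog", "god", "god"],
    "B": ["god", "god", "dog", "ram", "yak"],
}
assignment = {"A": "dog", "B": "god"}  # winners from the conflict step

# retain every proposal except those that duplicate a sign
# assigned to a *different* referent
retained = sum(
    1
    for referent, signs in proposals.items()
    for sign in signs
    if not any(sign == s and r != referent for r, s in assignment.items())
)
total = sum(len(signs) for signs in proposals.values())
print(retained, total, retained / total)  # 7 10 0.7 -> 70% coverage
```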

19 agreement: for each referent, sum the squared fractions of identical proposals, then average over referents
first referent (“dog” ×3, “god” ×2): (3/5)² + (2/5)² = 0.52
second referent (“god” ×2, “dog”, “ram”, “yak”): (2/5)² + (1/5)² + (1/5)² + (1/5)² = 0.28
overall agreement: ½(0.52 + 0.28) = 0.40
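The agreement score on this slide can be reproduced the same way; a sketch using the same hypothetical proposals:

```python
from collections import Counter

proposals = {
    "A": ["dog", "dog", "dog", "god", "god"],   # groups of size 3 and 2
    "B": ["god", "god", "dog", "ram", "yak"],   # groups of size 2, 1, 1, 1
}

def referent_agreement(signs):
    # sum over groups of identical proposals of (group size / total)^2
    n = len(signs)
    return sum((count / n) ** 2 for count in Counter(signs).values())

scores = {r: referent_agreement(signs) for r, signs in proposals.items()}
print(scores)                              # A: 0.52, B: 0.28 (up to float rounding)
print(sum(scores.values()) / len(scores))  # overall agreement ≈ 0.40
```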

20 upshot
“i would touch the corner to rotate like this”: the referent is the command (rotate); the sign is the gesture the participant performs

21 bias

22 shapes world ?

23 procedure

24 20 participants × 27 referents × 2 conditions (1 hand, 2 hands) = 1080 gestures
think aloud; logged all touch and likert data

25 results

26 agreement: 0.28 for two-hand gestures vs. 0.32 for one-hand gestures; one hand preferred for 25 of 27 referents

27 agreement correlation with conceptual complexity: r = -.52, F(1,25) = 9.51, p < .01
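As a worked check of the reported statistic (an addition beyond the slide): with 27 referents the regression F has (1, 25) degrees of freedom and relates to Pearson's r via F = r²(n−2)/(1−r²):

```python
# n = 27 referents; r as reported (rounded) on the slide
r, n = -0.52, 27
F = r**2 * (n - 2) / (1 - r**2)   # F statistic with (1, n-2) degrees of freedom
print(F)  # ≈ 9.26; the slide's 9.51 follows from the unrounded r
```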

28 coverage: 57.0% of all proposed gestures ended up in the final gesture set


30 properties
48 gestures in all
22 of 27 referents assigned gestures
reversibility
conflict free

31 taxonomy

32 nature: physical, symbolic, metaphorical, abstract

33 breakdown

34 correlations
conceptual complexity & planning time
planning time & goodness ratings
agreement & goodness ratings
conceptual complexity & goodness ratings

35 observations
number of fingers
a windows world
acting above the table
land beyond screen
Show clips in order along with these observations.

36 discussion

37 implications
sense outside boundaries, from side or down from top
avoid depending on exact no. of fingers
participants can be used to “design” this way
use widgets where there’s low agreement
Methodological implication is that we have good evidence that participants can be used to design this way. The gesture set emerged coherent, reversible where appropriate, and with generally good agreement.

38 limitations
cannot revise (“monologue”)
shapes world
culture specific?
users aren’t designers
being at ?

39 future work
implement recognizer, i.e., create “dialogue”
validate guessability, i.e., reverse process
user-specific gesture sets, defined on-the-fly
physics and physical gestures?

40 conclusion

41 contributions
quant. & qual. characterization of users’ gestures
taxonomy of surface gestures
user-defined gesture set
implications for surface computing technology
mental model insights

42 thank you
jacob o. wobbrock, university of washington
meredith ringel morris & andrew d. wilson, microsoft research

