Crisp Answers to Fuzzy Questions: Design lessons for crowdsourcing decision inputs Alex Quinn, Ben Bederson
“Market research firm J.D. Power and Associates says […] more than 80% of buyers have already spent an average of 18 hours online researching models and prices, according to Google data.”
Wall Street Journal, 2/26/2013
R. L. Polk & Co. / Autotrader.com, Automotive Buyer Study, 2011
DATA-DRIVEN DECISIONS
Vacation itinerary
Location for headquarters
Grad school applications
Pediatrician
Car
Smartphone
Building blocks for Mechanical Turk: HITs (Human Intelligence Tasks)
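A HIT is posted programmatically through the MTurk API. The sketch below assembles the parameters for MTurk's CreateHIT operation, wrapping an HTML form in MTurk's HTMLQuestion XML envelope; the title, reward, and form content are hypothetical examples, and the actual `boto3` client call is left commented out so the snippet runs without AWS credentials.

```python
# Minimal sketch of assembling a HIT for MTurk's CreateHIT operation.
# All concrete values (title, reward, form HTML) are illustrative.

def build_hit_params(title, description, reward_usd, question_html,
                     assignments=3, duration_s=600, lifetime_s=86400):
    """Assemble keyword arguments for MTurk's CreateHIT operation."""
    # MTurk expects the question as XML; an HTMLQuestion envelope
    # carries the HTML form inside a CDATA section.
    question_xml = (
        '<HTMLQuestion xmlns="http://mechanicalturk.amazonaws.com/'
        'AWSMechanicalTurkDataSchemas/2011-11-11/HTMLQuestion.xsd">'
        f'<HTMLContent><![CDATA[{question_html}]]></HTMLContent>'
        '<FrameHeight>450</FrameHeight>'
        '</HTMLQuestion>'
    )
    return {
        "Title": title,
        "Description": description,
        "Reward": f"{reward_usd:.2f}",          # MTurk wants a USD string
        "MaxAssignments": assignments,           # redundant judgments per HIT
        "AssignmentDurationInSeconds": duration_s,
        "LifetimeInSeconds": lifetime_s,
        "Question": question_xml,
    }

params = build_hit_params(
    title="Check a pediatrician's insurance coverage",
    description="Confirm whether this doctor accepts the listed plan.",
    reward_usd=0.10,
    question_html="<html><body><form>...</form></body></html>",
)
print(params["Reward"])  # 0.10

# With credentials configured, the HIT would be posted roughly as:
# import boto3
# mturk = boto3.client("mturk", region_name="us-east-1")
# hit = mturk.create_hit(**params)
```

HITs created from the same parameter template form a group, which matters for the grouping and pricing lessons below.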
Keep instructions short.
Input labels should be unambiguous.
HITs must be grouped by common templates.
See the Mechanical Turk Requester Best Practices Guide.
Example #1: Find a pediatrician
Requirements:
Accepts my insurance
≥4 stars at RateMDs.com
>80% positive at HealthGrades.com
≤15 minutes' drive from home
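Requirements like these are what make a fuzzy question crisp: once crowd workers report each field, candidates can be checked mechanically. A hypothetical sketch of that check follows; the field names (`insurance_ok`, `ratemds_stars`, etc.) and sample records are illustrative, not from the talk.

```python
# Hypothetical sketch: the pediatrician requirements encoded as a
# predicate over crowd-collected fields. Field names and sample data
# are illustrative.

def meets_requirements(candidate):
    """True iff the candidate satisfies all four slide requirements."""
    return (candidate["insurance_ok"]
            and candidate["ratemds_stars"] >= 4
            and candidate["healthgrades_pct_positive"] > 80
            and candidate["drive_minutes"] <= 15)

doctors = [
    {"name": "Dr. A", "insurance_ok": True, "ratemds_stars": 4.5,
     "healthgrades_pct_positive": 92, "drive_minutes": 10},
    {"name": "Dr. B", "insurance_ok": True, "ratemds_stars": 3.5,
     "healthgrades_pct_positive": 95, "drive_minutes": 5},
]
matches = [d["name"] for d in doctors if meets_requirements(d)]
print(matches)  # ['Dr. A']
```

Each field in the predicate maps naturally to one unambiguous input label on the HIT form, which is exactly the design lesson above.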
Effort should be proportional to the reward. HITs in a group share a common base price. Information sources should be traceable. See Mechanical Turk Requester Best Practices Guide
Example #2: Buy a stroller
Requirements:
Fits a 30-pound baby
Reclines for sleeping
Medium/large soft tires
Can be purchased online in the US
Bonus offers allow reward to scale with effort.
Find creative ways to track sources.
Design lessons
1) Consider effort-reward balance from the start.
2) Look for implicit ways of capturing sources.
3) Use word economy to conserve vertical space.
4) Choose unambiguous input labels.
Alex Quinn, aq@cs.umd.edu