Crowdsourcing for Dataset Selection
User Working Group 2013, 12 March
Outline
- Problem Overview
- Some Background
- A Proposal / An Experiment
- Discussion and Activity
Problem: So Many Datasets…
How to choose? Ask the crowd? Consider just a sample of the dataset IDs:
JPL-L2P-MODIS_A, JPL-L2P-MODIS_T, JPL-L4UHfnd-GLOB-MUR, OSDPD-L2P-MSG02, NAVO-L2P-AVHRR19_L, OSDPD-L2P-GOES13, NAVO-L2P-AVHRR18_G, NAVO-L2P-AVHRR19_G, JPL_OUROCEAN-L4UHfnd-GLOB-G1SST, EUR-L2P-AVHRR_METOP_A, etc.
Problem
- Can crowdsourcing help in dataset selection?
- What kind of interface do we need to engage the crowd?
Background + Proposal
- We have worked on dataset assessment & characterization and generated use cases on accountability.
- This is a very quality-heavy endeavor; a really big problem!
- As an intermediate step, we think crowdsourcing (leveraging the collective intelligence) could help.
- (It seems like a worthwhile experiment, anyway.)
Crowdsourcing Challenges*
- Need to find the right crowd
- Need to ask the right questions
- Need to motivate the crowd
- Need to filter and screen
- Need to provide the infrastructure

Goal: Identify Crowdsourcing Approaches → Apply Crowdsourcing Techniques → Provide Information to Users → Improve Selection & Usage

* Elliott & Tauer (2010), "Joining the Fray: Arriving at a Decision Model for Employing Crowdsourcing."
A Worthwhile Experiment?
We already have:
- Infrastructure (e.g., portal, forum)
- Filtering and screening (e.g., moderated forum)
- A crowd to find (e.g., the user community)

The experiment: can we come up with a model that yields:
- A motivated crowd?
- Generation of the right information to help improve selection?
UWG
We want the UWG to help us design the experiment and the right interface.
Interfacing with the Crowd
- There are lots of ways to get the crowd involved.
- Most techniques should be familiar.
- We're going to look at examples and get a feel for the pros and cons.
- Be ready to design…
Which Techniques Can Help with Dataset Selection?
Analytics, Scoring, Helpful votes, Comments, Discussion, Speaker ID, Speaker Credibility, Value of the Annotation, Weighting, Tagging, Also's (or make up your own!)
“Going on the Record”: Comments, Speaker ID
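As a concrete illustration (class and field names are assumptions, not an existing portal API), a minimal sketch of the on-the-record idea: every comment about a dataset carries a speaker ID, so readers can judge the source.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Comment:
    dataset_id: str   # e.g. "JPL-L4UHfnd-GLOB-MUR"
    speaker_id: str   # a registered portal/forum user, never anonymous
    text: str
    posted: datetime

# Illustrative sample comment, not real user feedback.
comments = [
    Comment("JPL-L4UHfnd-GLOB-MUR", "user_42",
            "Useful for my region; note occasional gaps near coastlines.",
            datetime.now(timezone.utc)),
]
for c in comments:
    print(f"[{c.dataset_id}] {c.speaker_id}: {c.text}")
```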
“Real-Time Re-Org”: Tagging / Folksonomy
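A minimal sketch, assuming tags arrive as free-form (dataset, tag) pairs, of how a folksonomy could be aggregated into per-dataset tag counts; the tags shown are illustrative.

```python
from collections import Counter, defaultdict

# (dataset_id, tag) pairs as users might submit them; illustrative data.
user_tags = [
    ("JPL-L2P-MODIS_A", "swath"),
    ("JPL-L2P-MODIS_A", "cloud-gaps"),
    ("JPL-L4UHfnd-GLOB-MUR", "gap-free"),
    ("JPL-L4UHfnd-GLOB-MUR", "gridded"),
    ("JPL-L4UHfnd-GLOB-MUR", "gap-free"),
]

# Aggregate per dataset so the most common labels surface first,
# "re-organizing" the catalog in real time as tags come in.
tag_counts = defaultdict(Counter)
for dataset_id, tag in user_tags:
    tag_counts[dataset_id][tag] += 1

for dataset_id, counts in tag_counts.items():
    top = ", ".join(f"{t} ({n})" for t, n in counts.most_common())
    print(f"{dataset_id}: {top}")
```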
“Back and Forth”: Discussion
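For illustration only, a minimal sketch of the reply-tree structure behind a threaded forum discussion (class names and sample posts are made up).

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    text: str
    replies: list = field(default_factory=list)

    def reply(self, author, text):
        """Attach a child post, building the back-and-forth thread."""
        child = Post(author, text)
        self.replies.append(child)
        return child

def show(post, depth=0):
    # Indent each reply level to render the thread.
    print("  " * depth + f"{post.author}: {post.text}")
    for r in post.replies:
        show(r, depth + 1)

root = Post("user_a", "Which L2P product works best at high latitudes?")
r1 = root.reply("user_b", "I have had good results with the AVHRR products.")
r1.reply("user_a", "Thanks; any known issues near the ice edge?")
show(root)
```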
“Democratic Prestige”: Helpful votes, Speaker Credibility, Value of the Annotation
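One hedged sketch of how helpful votes and a speaker's track record might combine into an annotation's value. The 0.7/0.3 blend weights, and the idea of storing past helpfulness ratios per author, are illustrative assumptions, not a recommendation.

```python
def helpfulness(up_votes: int, down_votes: int) -> float:
    """Fraction of voters who found the annotation helpful."""
    total = up_votes + down_votes
    return up_votes / total if total else 0.0

def annotation_value(up, down, author_helpful_history):
    """Blend this annotation's votes with the author's past ratios.

    author_helpful_history: helpfulness ratios of the author's
    previous annotations (assumed bookkeeping, 0.0 to 1.0 each).
    """
    credibility = sum(author_helpful_history) / len(author_helpful_history)
    return 0.7 * helpfulness(up, down) + 0.3 * credibility  # weights assumed

print(annotation_value(up=8, down=2, author_helpful_history=[0.9, 0.8, 1.0]))
```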
“Seeing Stars”: Weighting, Scoring
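A minimal sketch of star-rating aggregation using a damped (Bayesian-style) average, so a dataset with two 5-star votes does not outrank one with two hundred slightly lower votes. The prior mean and prior weight are assumptions, not tuned values.

```python
def damped_mean(ratings, prior_mean=3.0, prior_weight=5):
    """Average star rating, pulled toward prior_mean for small samples."""
    n = len(ratings)
    return (prior_mean * prior_weight + sum(ratings)) / (prior_weight + n)

sparse = [5, 5]                 # few votes: score stays near the prior
dense = [4, 5, 4, 5, 5] * 40    # many votes: data dominates the prior
print(round(damped_mean(sparse), 2), round(damped_mean(dense), 2))
```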
“The Relatives”: Also's
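A minimal sketch of an "users who used this dataset also used…" feature, built from co-occurrence counts over per-user download histories; the histories shown are illustrative, not real usage data.

```python
from collections import Counter
from itertools import combinations

histories = [  # one set of dataset IDs per user; illustrative data
    {"JPL-L2P-MODIS_A", "JPL-L2P-MODIS_T"},
    {"JPL-L2P-MODIS_A", "JPL-L4UHfnd-GLOB-MUR"},
    {"JPL-L2P-MODIS_A", "JPL-L2P-MODIS_T", "NAVO-L2P-AVHRR19_G"},
]

# Count how often each pair of datasets appears in the same history.
co_use = Counter()
for h in histories:
    for a, b in combinations(sorted(h), 2):
        co_use[(a, b)] += 1

def also_used(dataset_id, top_n=3):
    """Datasets most often co-used with the given one."""
    related = Counter()
    for (a, b), n in co_use.items():
        if a == dataset_id:
            related[b] += n
        elif b == dataset_id:
            related[a] += n
    return related.most_common(top_n)

print(also_used("JPL-L2P-MODIS_A"))
```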
“Analyze This…”: Analytics
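A minimal sketch of passive analytics: counting downloads per dataset from access logs, information the crowd generates just by using the archive, with no extra effort asked of them. The log format here is assumed for illustration.

```python
from collections import Counter

log_lines = [  # "timestamp user dataset_id"; assumed log format
    "2013-03-01T10:00Z user_1 JPL-L4UHfnd-GLOB-MUR",
    "2013-03-01T10:05Z user_2 JPL-L4UHfnd-GLOB-MUR",
    "2013-03-01T11:12Z user_3 OSDPD-L2P-GOES13",
]

# Tally the trailing dataset ID on each log line.
downloads = Counter(line.split()[-1] for line in log_lines)
for dataset_id, n in downloads.most_common():
    print(f"{dataset_id}: {n} downloads")
```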
Team Up…
1. Brainstorm in teams of 2 (5 min max)
2. Lightning round (2 min max)
3. Rehash (10-15 min)
4. Submit
5. Done.
Get Cooking…
Analytics, Scoring, Helpful votes, Comments, Discussion, Speaker ID, Speaker Credibility, Value of the Annotation, Weighting, Tagging, Also's (or make up your own!)
Next Steps?
- We'll review the inputs.
- We'll define an approach.
- Additional input is always welcome.
End of Presentation