Slide 1: Crowdsourcing for Dataset Selection
User Working Group 2013
12 March 2013
http://podaac.jpl.nasa.gov
Slide 2: Outline
Problem overview
Some background
A proposal / an experiment
Discussion and activity
Slide 3: Problem
So many datasets… How to choose? Ask the crowd?
JPL-L2P-MODIS_A
JPL-L2P-MODIS_T
JPL-L4UHfnd-GLOB-MUR
OSDPD-L2P-MSG02
NAVO-L2P-AVHRR19_L
OSDPD-L2P-GOES13
NAVO-L2P-AVHRR18_G
NAVO-L2P-AVHRR19_G
JPL_OUROCEAN-L4UHfnd-GLOB-G1SST
EUR-L2P-AVHRR_METOP_A
Etc.
Slide 4: Problem
Can crowdsourcing help in dataset selection?
What kind of interface do we need to engage the crowd?
Slide 5: Background + Proposal
We have worked on dataset assessment and characterization and generated use cases on accountability.
It is a very quality-heavy endeavor, and a really big problem.
As an intermediate step, we think crowdsourcing could help.
Crowdsourcing: leveraging the collective intelligence.
(Seems like a worthwhile experiment, anyway.)
Slide 6: Crowdsourcing Challenges*
Need to find the right crowd
Need to ask the right questions
Need to motivate the crowd
Need to filter and screen
Need to provide the infrastructure
Flow: Identify Crowdsourcing Approaches → Apply Crowdsourcing Techniques → Provide Information to Users → Improve Selection & Usage
* Elliott and Tauer (2010), "Joining the Fray: Arriving at a Decision Model for Employing Crowdsourcing"
Slide 7: A Worthwhile Experiment?
We already have the infrastructure (e.g., the portal and forum), a way to filter and screen (e.g., a moderated forum), and a place to find the crowd (e.g., the user community).
The experiment: can we come up with a model that results in a motivated crowd and the generation of the right information to help improve selection?
(Flow: Identify Crowdsourcing Approaches → Apply Crowdsourcing Techniques → Provide Information to Users → Improve Selection & Usage)
Slide 8: UWG
We want the UWG to help us design the experiment and the right interface.
Slide 9: Interfacing with the Crowd
There are lots of ways to get the crowd involved, and most techniques should be familiar.
We are going to look at examples (e.g., comments) to get a feel for the pros and cons.
Be ready to design…
Slide 10: Which Techniques Can Help with Dataset Selection?
Analytics
Scoring
Helpful
Comments
Discussion
Speaker ID
Speaker credibility
Value of the annotation
Weighting
Tagging
Also's
(Make up your own!)
Slide 11: "Going on the Record"
Slide 12: Comments / Speaker ID
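To make "going on the record" concrete, here is a minimal sketch (not an existing PO.DAAC interface) of a comment that is always attributed to an identified speaker and tied to a specific dataset. The field names and the example user are hypothetical; the dataset ID comes from the Problem slide.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Comment:
    """A dataset comment that is always attributed to a named speaker."""
    dataset_id: str          # e.g. "JPL-L4UHfnd-GLOB-MUR"
    speaker_id: str          # registered portal user, never anonymous
    text: str
    posted_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

# Illustrative usage: a user goes on the record about a dataset.
note = Comment(
    dataset_id="JPL-L4UHfnd-GLOB-MUR",
    speaker_id="ocean_user_42",
    text="Worked well for coastal SST fronts; watch the cloud mask near shore.",
)
print(f"[{note.dataset_id}] {note.speaker_id}: {note.text}")
```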
Slide 13: "Real-Time Re-Org"
Slide 14: Tagging / Folksonomy
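A folksonomy can be as simple as a tag-to-dataset index that users grow freely and that re-sorts the catalog as counts change, which is the "real-time re-org". The sketch below is illustrative only: the function name and tag vocabulary are made up, and the dataset IDs are taken from the Problem slide.

```python
from collections import Counter, defaultdict

# tag_index maps each free-form tag to the datasets users applied it to.
tag_index = defaultdict(Counter)

def tag_dataset(tag: str, dataset_id: str) -> None:
    """Record one user-applied tag; the counts drive the re-organization."""
    tag_index[tag.lower()][dataset_id] += 1

# Illustrative usage with dataset IDs from the Problem slide.
tag_dataset("sst", "JPL-L4UHfnd-GLOB-MUR")
tag_dataset("sst", "NAVO-L2P-AVHRR19_G")
tag_dataset("gap-free", "JPL-L4UHfnd-GLOB-MUR")

# Browse the catalog by tag, most-tagged datasets first.
print(tag_index["sst"].most_common())
```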
Slide 15: "Back and Forth"
Slide 16: Discussion
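A "back and forth" discussion is essentially a tree of attributed posts. The following sketch shows one hypothetical way to model such a thread; it does not describe any particular forum software.

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    """One message in a threaded, back-and-forth discussion about a dataset."""
    author: str
    text: str
    replies: list["Post"] = field(default_factory=list)

    def reply(self, author: str, text: str) -> "Post":
        child = Post(author, text)
        self.replies.append(child)
        return child

# Illustrative thread on a hypothetical forum topic.
topic = Post("new_user", "Which L2P product is best for the Gulf Stream?")
answer = topic.reply("data_steward", "Try MODIS_A; AVHRR has coarser coverage there.")
answer.reply("new_user", "Thanks, that worked.")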
Slide 17: "Democratic Prestige"
Slide 18: Helpful / Speaker Credibility / Value of the Annotation
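One hedged way to turn "helpful / not helpful" votes into a value score, and to roll those scores up into a speaker-credibility number, is sketched below. The smoothing and the averaging rule are assumptions for illustration, not formulas from the presentation.

```python
def annotation_value(helpful: int, not_helpful: int) -> float:
    """Fraction of 'helpful' votes, smoothed so a single vote cannot dominate.
    (Laplace smoothing: a new annotation starts near 0.5 and moves with votes.)"""
    return (helpful + 1) / (helpful + not_helpful + 2)

def speaker_credibility(annotation_scores: list[float]) -> float:
    """A speaker's credibility as the mean value of their past annotations."""
    return sum(annotation_scores) / len(annotation_scores) if annotation_scores else 0.5

# Illustrative usage: two annotations by the same speaker.
scores = [annotation_value(12, 1), annotation_value(3, 4)]
print(round(speaker_credibility(scores), 2))
```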
Slide 19: "Seeing Stars"
Slide 20: Weighting / Scoring
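"Seeing stars" usually means aggregating star ratings, optionally weighted (for example by rater credibility). The sketch below assumes a simple weighted average; the weights and where they come from are hypothetical.

```python
def weighted_star_score(ratings: list[tuple[int, float]]) -> float:
    """Aggregate 1-5 star ratings, weighting each by the rater's credibility.
    `ratings` is a list of (stars, weight) pairs; returns a 1-5 score."""
    total_weight = sum(w for _, w in ratings)
    if total_weight == 0:
        return 0.0
    return sum(stars * w for stars, w in ratings) / total_weight

# Illustrative usage: a credible reviewer counts more than a brand-new account.
print(round(weighted_star_score([(5, 0.9), (2, 0.2)]), 2))  # ~4.45
```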
Slide 21: "The Relatives"
Slide 22: Also's
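An "also's" feature ("people who used this dataset also used…") can be derived from co-occurrence in download logs. The sketch below assumes per-user sets of downloaded dataset IDs; the log structure is hypothetical, and the dataset IDs come from the Problem slide.

```python
from collections import Counter

def also_downloaded(download_logs: dict[str, set[str]], dataset_id: str, top_n: int = 3):
    """'People who used this also used…' from per-user download sets."""
    co_counts = Counter()
    for datasets in download_logs.values():
        if dataset_id in datasets:
            for other in datasets - {dataset_id}:
                co_counts[other] += 1
    return co_counts.most_common(top_n)

# Illustrative logs keyed by (anonymized) user.
logs = {
    "u1": {"JPL-L2P-MODIS_A", "JPL-L2P-MODIS_T"},
    "u2": {"JPL-L2P-MODIS_A", "JPL-L4UHfnd-GLOB-MUR"},
    "u3": {"JPL-L2P-MODIS_A", "JPL-L2P-MODIS_T"},
}
print(also_downloaded(logs, "JPL-L2P-MODIS_A"))
```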
Slide 23: "Analyze This…"
Slide 24: Analytics
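Basic analytics for dataset selection can be as simple as ranking datasets by views or downloads from portal access logs. The event format below is an assumption for illustration, not the actual portal log schema.

```python
from collections import Counter

def top_datasets(access_log: list[dict], metric: str = "download", top_n: int = 5):
    """Rank datasets by a usage metric pulled from portal access-log events.
    Each event is assumed to look like {"dataset": ..., "action": "view" | "download"}."""
    counts = Counter(e["dataset"] for e in access_log if e["action"] == metric)
    return counts.most_common(top_n)

# Illustrative events such as a web portal might log.
events = [
    {"dataset": "JPL-L4UHfnd-GLOB-MUR", "action": "download"},
    {"dataset": "JPL-L4UHfnd-GLOB-MUR", "action": "download"},
    {"dataset": "OSDPD-L2P-GOES13", "action": "view"},
]
print(top_datasets(events))
```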
Slide 25: Team Up…
1. Brainstorm in teams of 2 (5 min max)
2. Lightning round (2 min max)
3. Rehash (10-15 min)
4. Submit
5. Done
Slide 26: Get Cooking…
Analytics, Scoring, Helpful, Comments, Discussion, Speaker ID, Speaker Credibility, Value of the Annotation, Weighting, Tagging, Also's (or make up your own!)
Slide 27: Next Steps?
We'll review the inputs.
We'll define an approach.
Additional input is always welcome.
Slide 28: End of Presentation