WG2 Task Force "Crowdsourcing"
Tobias Hoßfeld, Matthias Hirth, Bruno Gardlo, Michal Ries, Sebastian Egger, Raimund Schatz, Katrien de Moor, Christian Keimel, Martin Varela, Lea Skorin-Kapov
WG2 Mechanisms and Models
Agenda
– Report on Task Force progress by Tobias Hoßfeld
– STSM report by Bruno Gardlo: "Improving Reliability for Crowdsourcing-Based QoE Testing"
– "QualityCrowd for video quality assessment" by Christian Keimel
– TF crowdsourcing discussion
"Tools" for Collaboration
Wiki
– Idea: a living document for discussion, collaboration, and information for all
– E.g. collecting and storing knowledge on experiments, design of experiments, etc.
– Currently: mainly used for describing activities
Mailing list
– Used for monthly reports and for asking for input
– Used for announcing crowdsourcing tests
STSMs
– 3 STSMs related to crowdsourcing
– Great! Successfully leads to joint work and joint results (by definition)
In summary
– Collaboration and discussion work very well within the group of active TF members
– Direct communication between members interested in a certain topic
Efforts in 2012: Joint Studies and Experiments
– Video quality and the impact of crowdsourcing platform and screening techniques (Bruno Gardlo, Tobias Hoßfeld)
– Waiting times, especially for YouTube video streaming (Tobias Hoßfeld, Raimund Schatz, Sebastian Egger)
– Crowdsourced multidimensional Web QoE test campaign: performance, visual appeal, ease of use (Martin Varela, Lea Skorin-Kapov)
– Visual Privacy (Pavel Korshunov, EPFL)
Efforts in 2012: Joint Publications
YouTube QoE with crowdsourcing tests
– 3 publications: Tobias Hoßfeld, Raimund Schatz, Sebastian Egger
– Initial delay vs. stalling for Internet video streaming
– Similar results in lab and crowdsourcing (after filtering); a minimal comparison sketch follows below
Crowdsourcing for audio-visual QoE tests
– 2 publications: Bruno Gardlo, Michal Ries, Tobias Hoßfeld, Raimund Schatz
– Impact of Screening Technique on Crowdsourcing
– Microworkers vs. Facebook: The Impact of Crowdsourcing Platform Choice on Experimental Results
QualityCrowd: Platform for Video Crowdsourcing Tests
– 3 publications: Christian Keimel, Julian Habigt, Clemens Horch, Klaus Diepold
– Framework for conducting tests and own experiences
Details in the wiki: https://www3.informatik.uni-wuerzburg.de/qoewiki/qualinet:crowd
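As a rough illustration of how "similar results in lab and crowdsourcing" can be checked after filtering, the sketch below compares per-condition mean opinion scores (MOS) from a lab test and a crowdsourcing test via Pearson correlation and RMSE. The arrays lab_mos and crowd_mos are hypothetical placeholders, not data from the studies above.

```python
import numpy as np
from scipy import stats

# Hypothetical per-condition MOS values: the same test conditions rated
# in the lab and via crowdsourcing, after filtering unreliable users.
lab_mos   = np.array([4.5, 3.8, 3.1, 2.4, 1.9, 4.1, 2.8])
crowd_mos = np.array([4.3, 3.9, 3.0, 2.6, 2.0, 4.0, 2.7])

# Linear agreement between the two test setups.
pearson_r, p_value = stats.pearsonr(lab_mos, crowd_mos)

# Average deviation of the crowd MOS from the lab MOS.
rmse = np.sqrt(np.mean((lab_mos - crowd_mos) ** 2))

print(f"Pearson correlation: {pearson_r:.3f} (p = {p_value:.3g})")
print(f"RMSE between lab and crowd MOS: {rmse:.3f}")
```

A high correlation together with a small RMSE would indicate that the crowd test reproduces the lab ranking and scale of the conditions; the thresholds for "high" and "small" depend on the rating scale and study design.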
Planning 2013
Joint Qualinet papers
– Best practices for QoE testing with crowdsourcing: Tobias Hoßfeld, Bruno Gardlo, Christian Keimel, Matthias Hirth, Julian Habigt
– QoE evaluation via crowdsourcing and comparison of lab / crowd tests: Bruno Gardlo, Katrien de Moor, Raimund Schatz, Michal Ries, Tobias Hoßfeld
– Web QoE results: Lea Skorin-Kapov, Martin Varela
– …
Further experiments
– E.g. expectation tests via crowdsourcing in the context of authentication in social networks: Tobias Hoßfeld, Markus Fiedler
– Further Web QoE tests: Lea Skorin-Kapov, Martin Varela
– Dropbox tests: Raimund Schatz, Tobias Hoßfeld
– …
Reflecting: Goals of this Task Force
– to identify the scientific challenges and problems of QoE assessment via crowdsourcing, but also its strengths and benefits,
– to derive a methodology and setup for crowdsourcing in QoE assessment,
– to challenge the crowdsourcing QoE assessment approach against established "lab" methodologies, i.e. a comparison of QoE tests,
– to develop mechanisms and statistical approaches for identifying reliable ratings from remote crowdsourcing users (see the sketch below),
– to define requirements for crowdsourcing platforms for improved QoE assessment.
Joint activities and collaboration within Qualinet
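As a minimal sketch of one possible statistical screening approach (not the task force's agreed method), the code below flags crowdsourcing users whose ratings correlate poorly with the provisional per-condition means computed from all other users. The ratings matrix and the threshold MIN_CORRELATION are illustrative assumptions.

```python
import numpy as np

# Hypothetical ratings matrix: rows = users, columns = test conditions,
# values = opinion scores on a 5-point scale. For simplicity this sketch
# assumes every user rated every condition.
ratings = np.array([
    [5, 4, 3, 2, 1],   # consistent user
    [4, 4, 3, 2, 2],   # consistent user
    [1, 5, 2, 5, 1],   # random clicker
    [5, 5, 4, 3, 1],   # consistent user
], dtype=float)

MIN_CORRELATION = 0.5  # illustrative reliability threshold

reliable = []
for u in range(ratings.shape[0]):
    # Condition means computed from all *other* users (leave-one-out),
    # so a user cannot validate their own ratings.
    others = np.delete(ratings, u, axis=0)
    condition_means = others.mean(axis=0)
    r = np.corrcoef(ratings[u], condition_means)[0, 1]
    reliable.append(r >= MIN_CORRELATION)
    print(f"user {u}: correlation with others = {r:.2f}, "
          f"{'kept' if r >= MIN_CORRELATION else 'filtered out'}")

# MOS per condition based on the screened users only.
filtered_mos = ratings[np.array(reliable)].mean(axis=0)
print("MOS per condition after screening:", np.round(filtered_mos, 2))
```

In practice such a correlation filter would typically be combined with other checks (e.g. content or consistency questions and monitoring of the test environment) rather than used on its own.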
Open Issues
Experiences with crowdsourcing
– What are the main problems?
– Reliability, environment monitoring, technical implementation, language problems, …?
– Cartography of crowdsourcing use cases and mechanisms
Incentive design for QoE tests
– Improves reliability, complementary to filtering techniques
– E.g. tests designed as a game
– E.g. different payment schemes
Requirements for crowdsourcing platforms for improved QoE assessment
– Open API available soon for Microworkers
Database with crowdsourcing results
– As part of WG4? Available in the crowdsourcing wiki?
– E.g. analyze fake user ratings
– E.g. compare lab and crowd results for certain apps
– E.g. impact of context factors on QoE: country, habits, …
Framework for crowdsourcing tests available to Qualinet?
– E.g. using Facebook
Thank you
https://www3.informatik.uni-wuerzburg.de/qoewiki/qualinet:crowd
cs.wg2.qualinet@listes.epfl.ch
WG2 Task Force "Crowdsourcing"