Presentation transcript:

CrowdColor: Crowdsourcing Color Perceptions using Mobile Devices
Jaejeung Kim¹, Sergey Leksikov¹, Punyotai Thamjamrassri², Uichin Lee¹, Hyeon-Jeong Suk²
¹Dept. of Knowledge Service Engineering, ²Dept. of Industrial Design

Overview
- Motivation
- Related Work
- CrowdColor Design
- Experiment 1: CrowdColor Generation
- Experiment 2: CrowdColor User Evaluation
- Limitations
- Conclusion & Future Work

Motivation
Color is a primary purchase criterion in fashion. Inaccurate colors lead to negative shopping experiences (Parker et al. 2009).

Motivation
(Figure: Expectation vs. Reality, an image comparison of the same product)

Motivation
Customer reviews are a key source of subjective opinions about a product. However, it is not easy to convey the precise meaning of a color in a written review.

Motivation
Color is affected at every stage of the pipeline:
- Image generation: capturing HW/SW, lighting conditions, manual editing
- Image rendering: rendering SW
- Image viewing: display HW
- Product viewing

Related Work
(Pipeline: image generation → image rendering → image viewing → product viewing)
Color Reproduction:
- Automatic white balancing
- Display adaptive tone mapping (ACM TOG'08)
- Color Match (MobileHCI'08)
Crowdsourcing Graphical Perceptions:
- Visualization judgement task (CHI'10)
- Extracting color themes from images (CHI'13)

CrowdColor Design
We aim to design and evaluate a color generation application that:
1) Receives explicit color inputs via a color picker
2) Aggregates the inputs into a color that represents the product (the CrowdColor), presented as part of a customer review

CrowdColor Design
1) The customer perceives the color of the product
2) The customer selects the perceived color using the color picker
(Figure: color picker UI with the selected color)

CrowdColor Design
3) The selected colors are sent to the server
4) The server aggregates the RGB inputs into a representative color (a sketch of this aggregation appears below)
5) Device/light-adaptive aggregation* can also be applied
*Only color inputs collected on the same device type under the same lighting condition are aggregated and shown to customers in that same environment (e.g., an iPhone user under fluorescent light is shown the color aggregated from other iPhone users under fluorescent light).
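The slides do not specify how the server combines the inputs; the following is a minimal sketch, assuming a simple per-channel mean of the RGB picks grouped by (device, lighting). The function name, input structure, and averaging choice are illustrative assumptions, not the authors' implementation.

    from collections import defaultdict
    from statistics import mean

    def aggregate_crowd_colors(inputs):
        """Aggregate crowdsourced RGB picks into one representative color
        per (device, lighting) group. The input structure is hypothetical,
        e.g. {"device": "iPhone 5s", "lighting": "fluorescent",
              "rgb": (201, 34, 41)}.
        """
        groups = defaultdict(list)
        for entry in inputs:
            groups[(entry["device"], entry["lighting"])].append(entry["rgb"])

        # Represent each group by the per-channel mean of its RGB picks.
        return {
            key: tuple(round(mean(channel)) for channel in zip(*picks))
            for key, picks in groups.items()
        }

    if __name__ == "__main__":
        sample = [
            {"device": "iPhone 5s", "lighting": "fluorescent", "rgb": (201, 34, 41)},
            {"device": "iPhone 5s", "lighting": "fluorescent", "rgb": (189, 40, 45)},
            {"device": "Galaxy S4", "lighting": "daylight", "rgb": (210, 28, 39)},
        ]
        print(aggregate_crowd_colors(sample))

A per-channel mean in RGB is the simplest possible aggregation; averaging in a perceptual space such as CIELAB, or using a robust statistic like the median, would reduce the influence of outlier picks.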

Experiment 1: CrowdColor Generation
Goal:
- Generate CrowdColors and assess their accuracy
- Observe crowd workers' color perception and selection ability
Method:
- 31 participants (12 female), lab-controlled experiment
- 4 (colors) × 2 (lighting conditions) × 2 (devices) design

Experiment 1: CrowdColor Generation
- Color stimuli: red, blue, yellow, gray
- Lighting conditions: daylight (5400 K), incandescent (3800 K)
- Display devices: Samsung Galaxy S4 (AMOLED), iPhone 5s (IPS)

Experiment 1: CrowdColor Generation
From the user inputs, we generated two types of CrowdColors:
1) Adaptively-mapped CrowdColor (ACC): hypothetically the best case
2) Reversely-mapped CrowdColor (RCC): hypothetically the worst case
All CrowdColors were measured with a spectrophotometer to obtain color accuracy (accuracy = the color difference between the color stimulus and the CrowdColor).

Experiment 1: Results
- The best-performing case was B1 (Δ = 2.0 < JND ≈ 2.3)
- All ACCs outperformed the corresponding RCCs

Experiment 1: Results
Effect of color, light, and device on color accuracy:
1) Both color and light had a significant effect on color accuracy
2) Device did not independently affect color accuracy
3) The interaction of color and device had a marginally significant effect on color accuracy
(A sketch of this kind of factorial analysis appears below.)
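The slides do not name the statistical test used; for a 4 × 2 × 2 factorial design such as this, a factorial ANOVA is the usual analysis. Below is a minimal sketch using statsmodels, with a synthetic placeholder data frame and hypothetical column names (accuracy, color, light, device); it illustrates the kind of analysis, not the authors' actual code or data.

    import itertools
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.formula.api import ols

    # Synthetic placeholder data: one row per color pick, with the measured
    # color difference ("accuracy") and its condition labels.
    rng = np.random.default_rng(0)
    conditions = list(itertools.product(
        ["red", "blue", "yellow", "gray"],
        ["daylight", "incandescent"],
        ["GalaxyS4", "iPhone5s"],
    ))
    rows = [
        {"color": c, "light": l, "device": d,
         "accuracy": rng.normal(loc=4.0, scale=1.0)}  # placeholder values
        for c, l, d in conditions
        for _ in range(10)  # 10 replicates per cell
    ]
    df = pd.DataFrame(rows)

    # Three-way factorial ANOVA: main effects of color, light, and device,
    # plus all of their interactions.
    model = ols("accuracy ~ C(color) * C(light) * C(device)", data=df).fit()
    print(sm.stats.anova_lm(model, typ=2))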

Experiment 1: Results
Effect of color, light, and device on input time:
1) Only color had a significant effect on input time
2) Yellow, blue, and gray required more effort than red, consistent with the post-experiment interviews, in which more than 50% of users reported that yellow and blue were hard to locate in the color picker

Experiment 2: CrowdColor User Evaluation
Goal:
- Assess the level of subjective agreement on perceived color similarity
Method:
- 18 participants, using 2 devices under 2 lighting conditions
- Participants evaluated the perceived color accuracy of the previously generated CrowdColors
- Only the best-performing ACCs were evaluated
- A total of 144 responses were collected

Experiment 2: Results
CrowdColors were accepted positively overall:
- 73% of responses were positive or neutral
- Exit interviews revealed that the CrowdColor was not perceived as exactly the same as the product color
- However, participants found it more trustworthy than the images provided by sellers

Limitations
- Not all colors are locatable: display devices cannot represent all colors that exist in the real world
- Controlled settings: the current study was an in-lab experiment; an in-the-wild study could decrease accuracy
- Different types of materials need to be tested: we mainly considered standardized color papers, so further study of various materials (e.g., fabric, metal) is needed for generalizability

Conclusion & Future Work
- Explored the viability of crowdsourcing color inputs for a real-world object
- Examined the accuracy of the CrowdColor in relation to environmental factors
- The user study showed that it is received positively overall and considered trustworthy for online shopping
- A large-scale, in-situ study will be conducted in the future

Thank you. Please visit CrowdColor.net and try it for yourself!

Color Difference (Δ)
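The transcript does not include the body of this appendix slide or name the metric behind Δ. However, the JND ≈ 2.3 quoted in the Experiment 1 results matches the commonly cited just-noticeable difference for the CIE76 color difference in CIELAB; assuming that metric, the difference between a reference color (L1*, a1*, b1*) and a CrowdColor (L2*, a2*, b2*) is

    \Delta E^{*}_{ab} = \sqrt{(L_2^* - L_1^*)^2 + (a_2^* - a_1^*)^2 + (b_2^* - b_1^*)^2}

with \Delta E^{*}_{ab} \approx 2.3 corresponding to a just-noticeable difference.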