CrowdColor: Crowdsourcing Color Perceptions using Mobile Devices
Jaejeung Kim¹, Sergey Leksikov¹, Punyotai Thamjamrassri², Uichin Lee¹, Hyeon-Jeong Suk²
¹Dept. Knowledge Service Engineering, ²Dept. Industrial Design
Overview
-Motivation
-Related Work
-CrowdColor Design
-Experiment 1: CrowdColor Generation
-Experiment 2: CrowdColor User Evaluation
-Limitations
-Conclusion & Future Work
Motivation
Color is a primary purchase criterion in fashion. Inaccurate colors lead to negative shopping experiences (Parker et al. 2009).
Motivation
Expectation vs. Reality
Motivation
Customer reviews are a key source of subjective opinions about a product. However, it is not easy to convey the precise meaning of a color.
Motivation
Image generation (capturing HW/SW, lighting conditions, manual editing) → image rendering (rendering SW, display HW) → image viewing → product viewing
Related Work
Color Reproduction:
-Automatic white balancing
-Display adaptive tone mapping (ACM TOG’08)
-Color Match (MobileHCI’08)
Crowdsourcing Graphical Perceptions:
-Visualization judgement task (CHI’10)
-Extracting color themes from images (CHI’13)
CrowdColor Design
We aim to design and evaluate a color generation application that:
1) Receives explicit color inputs using a color picker
2) Aggregates the inputs into a color that can represent the product (the CrowdColor), delivered in the form of a customer review
CrowdColor Design
1) The customer perceives the color of the product
2) Selects the perceived color using the color picker
CrowdColor Design
3) Selected colors are sent to the server
4) The server aggregates the RGB inputs into a representative color
5) Device/light adaptive aggregation* can be applied (a minimal sketch follows below)
*Only color inputs from a given device under a given lighting condition are aggregated, and the result is shown to a customer with the same environmental conditions (e.g., an iPhone user under fluorescent light is shown the color aggregated from iPhone users under fluorescent light).
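The slides do not specify how the server combines the crowd's RGB inputs; the sketch below assumes a simple per-channel mean over inputs grouped by (device, lighting), matching the adaptive aggregation described above. The function name and data format are illustrative, not the authors' implementation.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical sketch of device/light adaptive aggregation: average each
# RGB channel over all inputs that share the same (device, lighting) pair.
def aggregate_crowd_colors(inputs):
    """inputs: list of dicts such as
    {"device": "iPhone 5s", "lighting": "fluorescent", "rgb": (210, 40, 35)}.
    Returns {(device, lighting): (r, g, b)} with the mean color per group."""
    groups = defaultdict(list)
    for item in inputs:
        groups[(item["device"], item["lighting"])].append(item["rgb"])
    return {
        key: tuple(round(mean(channel)) for channel in zip(*rgbs))
        for key, rgbs in groups.items()
    }

# A customer is then served the CrowdColor matching their own
# device and lighting condition:
# crowd = aggregate_crowd_colors(collected_inputs)
# color_for_user = crowd[("iPhone 5s", "fluorescent")]
```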
Experiment 1: CrowdColor Generation
Goal:
-Generate CrowdColors and assess their accuracy
-Observe crowd workers' color perception and selection ability
Method:
-31 participants (12 female), lab-controlled experiment
-4 (colors) × 2 (lighting conditions) × 2 (devices) design
Experiment 1: CrowdColor Generation
Color stimuli: red, blue, yellow, gray
Lighting conditions: daylight (5400 K), incandescent (3800 K)
Display devices: Samsung Galaxy S4 (AMOLED), iPhone 5s (IPS)
Experiment 1: CrowdColor Generation
From the user inputs, we generated two types of CrowdColors:
1) Adaptively-mapped CrowdColor (ACC): hypothetically the best case
2) Reversely-mapped CrowdColor (RCC): hypothetically the worst case
All CrowdColors were measured with a spectrophotometer to obtain color accuracies (accuracy = color difference between the color stimulus and the CrowdColor).
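Accuracy here is the color difference Δ between stimulus and CrowdColor, and the results compare Δ against a JND of roughly 2.3, the threshold usually quoted for the CIE76 ΔE*ab metric in CIELAB space. Below is a minimal sketch assuming CIE76 over Lab values like those a spectrophotometer reports; the exact formula used in the study is not stated on the slides.

```python
import math

# Hypothetical sketch: CIE76 color difference (Delta E*ab) between two colors
# given in CIELAB coordinates, e.g. as measured by a spectrophotometer.
def delta_e_cie76(lab1, lab2):
    """lab1, lab2: (L*, a*, b*) tuples. Returns their Euclidean distance."""
    return math.sqrt(sum((c1 - c2) ** 2 for c1, c2 in zip(lab1, lab2)))

# Illustrative values only: a difference just under the ~2.3 JND threshold
# would be effectively indistinguishable from the original stimulus.
stimulus = (53.2, 80.1, 67.2)
crowd_color = (54.0, 78.5, 66.5)
print(round(delta_e_cie76(stimulus, crowd_color), 2))  # ~1.92
```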
Experiment 1: Results
The best-performing case was B1 (Δ = 2.0 < JND ≈ 2.3).
All ACCs outperformed the RCCs.
Experiment 1: Results
Effect of color, light, and device on color accuracy:
1) Both color and light had a significant effect on color accuracy
2) Device did not independently affect color accuracy
3) The interaction effect of color and device on color accuracy was marginally significant
Experiment 1: Results
Effect of color, light, and device on input time:
1) Only color had a significant effect on input time
2) Yellow, blue, and gray required more effort than red
This is consistent with the post-experiment interviews, in which more than 50% of users reported that yellow and blue were hard to locate.
Experiment 2: CrowdColor User Evaluation
Goal:
-Assess the subjective level of agreement on perceived color similarity
Method:
-18 participants, given 2 devices, under 2 lighting conditions
-Evaluated the perceived color accuracy of previously generated CrowdColors
-Only the best-performing ACCs were evaluated
-A total of 144 responses were collected
Experiment 2: Results
CrowdColors were positively accepted overall:
-73% of responses were positive or neutral
-Exit interviews revealed that the CrowdColor was not perceived as exactly the same color
-But it was considered more trustworthy than the images provided by sellers
Limitations
Not all colors are locatable: a display cannot represent all colors that exist in the real world.
Controlled settings: the current study was an in-lab experiment; an in-the-wild study could decrease accuracy.
Different types of materials need to be tested: we mainly considered standardized color papers; further study of various materials (e.g., fabric, metal) is needed for generalizability.
Conclusion & Future Work
Explored the viability of crowdsourcing color inputs for a real-world object.
Examined the accuracy of the CrowdColor in relation to environmental factors.
The user study showed that CrowdColor was received positively overall and regarded as trustworthy for online shopping.
A large-scale, in-situ study will be conducted in the future.
Please visit CrowdColor.net and try it for yourself! Thank you