
CUSTOMER NEEDS ELICITATION FOR PRODUCT CUSTOMIZATION Yue Wang Advisor: Prof. Tseng Advanced Manufacturing Institute Hong Kong University of Science and Technology

Background

Axiomatic design maps Customer Needs (CNs) → Functional Requirements (FRs) → Design Parameters (DPs) → Process Variables (PVs), through product specification definition, product design, and process design. CNs are expressed in explicit product specifications.

Advanced Manufacturing Institute  Customer needs elicitation should be  Good: predictive, customer insight  Fast: for customers and for designers  Cheap: reduce market research cost  Easy: reduce drudgery and errors Introduction 3

Advanced Manufacturing Institute  Can we find what people want quickly and inexpensively?  How to avoid confusing customers with too many products? Research issues 4

Challenges

 Customers are:
  Impatient to specify a long list of items
  Unable to articulate their needs
  Unaware of latent needs
  Short of information about the available options
 Preferences for different attributes interlock with one another.

Advanced Manufacturing Institute  Research framework  Bayesian network based preferences representation  Adaptive specification definition procedure  Recommendation for customized product Approach 6

Preference Representation

 Uncertainty in purchasing choices:
  Customers are heterogeneous
  Choice decisions differ across situations
  The context of purchase differs
 Dependency among preferences for different attributes

Advanced Manufacturing Institute  Bayesian network Preferences Representation 8

Advanced Manufacturing Institute  The important considerations in this phase:  Customers are not patient enough to specify a long list of items.  The items differ a lot in terms of the amount of information they can provide. Specification Definition 9

Advanced Manufacturing Institute  Basic ideas:  Present the most informative query item to customers  The value of information:  :the additional information received about X from getting the value of Y=y.  Specification Definition 10

Advanced Manufacturing Institute  The solution for f (Blachman, 1968 # ): # N. M. Blachman, “The amount of information that y gives about X,” IEEE Trans. Inform. Theory, vol. IT-14, no. 1, pp , Jan Specification Definition 11

Advanced Manufacturing Institute  Given:  Customers preferences information  Determine:  Which products should be recommended?  In what order to present the recommendations if more than one recommendations are presented? Recommendation 12

Advanced Manufacturing Institute  Probability of relevance under binary independent assumption:  Probability of relevance considering first order conditional dependency: Probabilistic relevance computation 13

Advanced Manufacturing Institute  The idea is to rank products by their estimated probability of relevance with respect to the information obtained.  Probability ranking principle is optimal, in the sense that it minimizes the expected loss. Probability ranking principle 14

Schematic framework [diagram]

Advanced Manufacturing Institute  Precision rate  Recall rate Evaluation metrics 16

Advanced Manufacturing Institute  The recommendation based on probability ranking can guarantee the highest precision and recall rate.  If customers’ preferences to all the components are independent and the potential preferences towards all the alternatives of an attribute are random, the specification definition method based on the information gain has the highest precision and recall rate. Evaluation results 17

Evaluation results

Parameter settings                                            Result (# of experiments in which precision
                                                              and recall are highest / total # of experiments)
m ~ Uniform(3, 13),  N ~ Uniform(50, 100),    |N_i| ~ Uniform       9,345 / 10,000
m ~ Uniform(5, 15),  N ~ Uniform(100, 150),   |N_i| ~ Uniform       9,325 / 10,000
m ~ Uniform(5, 15),  N ~ Uniform(1000, 2000), |N_i| ~ Uniform       9,344 / 10,000
m ~ Uniform(5, 15),  N ~ Uniform(100, 200),   |N_i| ~ Norm(1, 1)    9,560 / 10,000
m ~ Uniform(5, 15),  N ~ Uniform(100, 200),   |N_i| ~ Norm(1, 2)    9,603 / 10,000
m ~ Uniform(5, 15),  N ~ Uniform(100, 200),   |N_i| ~ Norm(1, 0.5)  9,262 / 10,000

Evaluation by utility

 Preliminaries:
  Stochastic dominance: if P(X_1 ≥ k) ≥ P(X_2 ≥ k) for all k, then approach 1 stochastically dominates approach 2.

Advanced Manufacturing Institute  The presented method  stochastically dominates other approaches.  is optimal with respect to any nondecreasing utility function. Evaluation results 20

Advanced Manufacturing Institute  An approach to elicit customers’ preference is presented.  The model can be used to adaptively improve definition of product specification for custom product design.  Based on the model, customized query sequence can be developed to reduce redundant questions.  Product recommendation approach is adopted to further improve the efficiency of custom product design Summary 21

Thank you! Your suggestions & comments are highly appreciated!

Extension to the binary independence assumption

 Theorem (Chow and Liu, 1968): a probability distribution of tree dependence P_t(x) is an optimal approximation to P(x) if and only if its dependence tree is a maximum-weight spanning tree, with edge weights given by the mutual information between the attributes they connect.
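The Chow–Liu construction can be sketched for binary attribute data: estimate pairwise mutual information empirically, then build a maximum-weight spanning tree (Kruskal-style, with union–find). Function names and the toy data set are illustrative:

```python
import math
from itertools import combinations

def mutual_information(data, i, j):
    """Empirical mutual information (bits) between binary attributes i and j."""
    n = len(data)
    mi = 0.0
    for a in (0, 1):
        for b in (0, 1):
            pij = sum(1 for row in data if row[i] == a and row[j] == b) / n
            pi = sum(1 for row in data if row[i] == a) / n
            pj = sum(1 for row in data if row[j] == b) / n
            if pij > 0:
                mi += pij * math.log2(pij / (pi * pj))
    return mi

def chow_liu_tree(data, m):
    """Maximum-weight spanning tree over m attributes, edge weights
    given by mutual information (Chow & Liu, 1968)."""
    edges = sorted(((mutual_information(data, i, j), i, j)
                    for i, j in combinations(range(m), 2)), reverse=True)
    parent = list(range(m))            # union-find forest
    def find(u):
        while parent[u] != u:
            parent[u] = parent[parent[u]]
            u = parent[u]
        return u
    tree = []
    for _, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:                   # joining two components keeps it a tree
            parent[ri] = rj
            tree.append((i, j))
    return tree
```

On data where attributes 0 and 1 are perfectly correlated, the tree keeps the (0, 1) edge, as the theorem predicts.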

Why customized product design

 Well-calibrated customized product design can integrate customers into design activities:
  Mitigates the side effects of sticky information
  Better meets customers’ requirements
  Enhances loyalty
 Helps identify latent needs and guides future product development

Advanced Manufacturing Institute  Lemma 1: Suppose approach 1 proposes n recommendations in a sequence S 1 =(r 11,r 12,…r 1n ). Each recommendation r 1i has probability p 1i to meet the customer needs. The sequence is arranged such that. Approach 2 also proposes n recommendations in a sequence S 2 =(r 21,r 22,…r 2n ). These n recommendations may be different from the ones in sequence S 1. Similarly, we also have corresponding probability serial and If for all, then X 1 stochastically dominates X 2 where X i is an indicator of the number of satisfactory recommendations by using approach i.

Advanced Manufacturing Institute  Lemma 2: Suppose approach 1 proposes n recommendations in a sequence S 1 =(r 11,r 12,…r 1n ). Each recommendation r 1i has probability p 1i to meet the customer needs. The sequence is arranged such that. Approach 2 also proposes n recommendations in a sequence S 2 =(r 21,r 22,…r 2n ) which is a permutation of S 1 =(r 11,r 12,…r 1n ). Then the distribution of satisfactory product for approach 1 is identical to approach 2.

Advanced Manufacturing Institute  Lemma 3: Let U(x) be a nondecreasing utility function where x is the number of satisfactory recommendations. Let X i be an indicator of the number of satisfactory recommendations by using approach i. If X 1 stochastically dominates X 2, then the expected utility by adopting approach 1 is greater or equal to that of approach 2, i.e.,.

Evaluation

m: the number of attributes
n_i: the number of alternatives of the ith attribute
N: the total number of configurations
P_ijk: the probability that the kth configuration is the desired one, given that the jth alternative of the ith component is selected

The entropy of the configuration space if the jth alternative of the ith component is selected:
  H_ij = − Σ_k P_ijk log P_ijk
The expected entropy of the configuration space if the ith component is proposed for a customer to specify:
  H_i = Σ_j P(alternative j is selected) · H_ij
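The expected-entropy computation can be sketched as follows; the two candidate components, their conditional distributions, and the selection probabilities are hypothetical:

```python
import math

def entropy(dist):
    """Shannon entropy (bits) of a discrete distribution."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

def expected_entropy(P_i, alt_probs):
    """H_i: expected entropy of the configuration space if component i
    is proposed for the customer to specify.
    P_i[j]      : distribution over configurations given alternative j chosen
    alt_probs[j]: probability that the customer picks alternative j"""
    return sum(q * entropy(dist) for q, dist in zip(alt_probs, P_i))

# Hypothetical 4-configuration space, two candidate components:
comp_a = [[0.97, 0.01, 0.01, 0.01], [0.01, 0.97, 0.01, 0.01]]  # informative
comp_b = [[0.25, 0.25, 0.25, 0.25], [0.25, 0.25, 0.25, 0.25]]  # uninformative
alt_probs = [0.5, 0.5]

# Ask about the component whose answer leaves the least expected entropy:
best = min([("a", comp_a), ("b", comp_b)],
           key=lambda c: expected_entropy(c[1], alt_probs))[0]
```

Here component "a" would be queried first, since either of its answers nearly pins down the desired configuration.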

Background

 Competitive and changing markets
 Shorter product development times
 Proliferation of product variety
 Greater penalty cost of failing to meet customers’ needs, or of failing to keep up with changes in those needs


Advanced Manufacturing Institute  Probability of relevance (including first order conditional dependency):  Parameters setting: Probabilistic relevance model 31

Advanced Manufacturing Institute  tailor product to different needs  how to avoid confusing customers with too many products  Can we find what people want quickly and inexpensively  how to find out if a customer is interested in a virtual which doesn't exist  reducing inconsistent preferences  good: predictive, customer insight: what people buy or how many will people buy it  fast: for them and for us: it should be fast, doesn't cost so many time  cheap: reduce market research cost: should be cheat  easy reduce drudgery and errors: should be easy for both customers and designers  That's all the questions in marketing science today. 32