
1 University of Southern California, Department of Computer Science. Bayesian Logistic Regression Model (Final Report). Graduate Student: Teawon Han. Professor: Nicolas Schweighofer. 9/23/2011

2 Introduction
1. The purpose of the project and the experiment
2. Summary of Bayesian Logistic Regression (BLR) and how I apply BLR to the BART or ART task
3. What is next?

3 The purpose of the project
1. Predict the patient's rehabilitation status accurately
   - Reduce rehabilitation time (no unnecessary training)
   - Raise the efficiency of the rehabilitation process
2. Data collection method
   - Use three days of data in my program (regression) for testing

4 The purpose of the project
3. Experiment environment (figure: the experimental setup, labeled "Success!")

5 The purpose of the project
4. Given data type (150 collected trials)
   - Success condition: Error == 0 && Hit Hand == 1 (see the labeling sketch below)
   - Diagram: day-1 data and the prior weight value go through pattern analysis to produce a new weight value; day-2 data and that new weight value go through pattern analysis again to produce an updated weight value.
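A minimal sketch (Python, with hypothetical field names `error` and `hit_hand` standing in for whatever the collection program actually records per trial) of how each trial could be turned into the binary success/fail label defined above:

```python
# Minimal sketch: turn raw trial records into binary labels for BLR.
# The field names "error" and "hit_hand" are assumptions standing in for
# whatever the actual data collection program records per trial.

def label_trial(error: int, hit_hand: int) -> int:
    """Return 1 (success) when Error == 0 and Hit Hand == 1, else 0 (fail)."""
    return 1 if (error == 0 and hit_hand == 1) else 0

# Example: a few hypothetical trials from one day of data
trials = [(0, 1), (1, 1), (0, 0), (0, 1)]
labels = [label_trial(e, h) for e, h in trials]
print(labels)  # [1, 0, 0, 1]
```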

6 Summary of Bayesian Logistic Regression (BLR)
1. What is regression? Why do we use regression?
2. Example (linear regression): regression helps represent the complete model from partially observed data (see the sketch below).
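To illustrate the linear-regression example on this slide, a minimal NumPy least-squares sketch with made-up points (not the project's data):

```python
# Sketch: fit a line to a few observed points and predict an unobserved one.
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])          # observed inputs
t = np.array([0.1, 0.9, 2.1, 2.9])          # observed targets (a noisy line)

X = np.column_stack([np.ones_like(x), x])   # design matrix [1, x]
w, *_ = np.linalg.lstsq(X, t, rcond=None)   # least-squares weights [w0, w1]

x_new = 4.0                                  # a point we never observed
print(w[0] + w[1] * x_new)                   # prediction from the fitted model
```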

7 Summary of Bayesian Logistic Regression (BLR)
3. How do I apply BLR to the project?
   - First, we have two classes for classification: Success and Fail.
   - Expression:
     a. p(C1 | φ) = y(φ) = σ(wᵀφ)  -> success
     b. p(C2 | φ) = 1 - p(C1 | φ)  -> fail
     where φ is the feature vector (data) and w is the weight vector.
   - Success condition: Error == 0 && Hit Hand == 1
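A small Python sketch of the two class probabilities above; the weight and feature values are placeholders, not values from the experiment:

```python
# Sketch: success / fail probabilities from the logistic model
# p(C1 | phi) = sigma(w^T phi),  p(C2 | phi) = 1 - p(C1 | phi).
import numpy as np

def sigmoid(a: float) -> float:
    return 1.0 / (1.0 + np.exp(-a))

w   = np.array([0.001, 0.001, 0.001])   # weight vector (placeholder values)
phi = np.array([1.0, 0.5, -0.2])        # feature vector (placeholder values)

p_success = sigmoid(w @ phi)            # p(C1 | phi)
p_fail    = 1.0 - p_success             # p(C2 | phi)
print(p_success, p_fail)
```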

8 Summary of Bayesian Logistic Regression (BLR)
3. How do I apply BLR to the project? (continued)
   - Second, to represent logistic regression I used the logistic sigmoid σ(·), where σ(α) = 1 / (1 + exp(-α)).
     a. Its range is limited to (0, 1).
     b. To keep things simple, I used the simplest formula (next page).
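For reference, standard identities of the logistic sigmoid that follow from this definition (the derivative reappears in the Newton-Raphson step later):

```latex
\sigma(\alpha) = \frac{1}{1 + e^{-\alpha}}, \qquad 0 < \sigma(\alpha) < 1,
\qquad \sigma(-\alpha) = 1 - \sigma(\alpha),
\qquad \frac{d\sigma}{d\alpha} = \sigma(\alpha)\bigl(1 - \sigma(\alpha)\bigr).
```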

9 Summary of Bayesian Logistic Regression (BLR)
3. How do I apply BLR to the project? (continued)
   b. To keep things simple, I used the simplest formula, with the smallest number of parameters (features):
      a = w0 + w1·φ1 + w2·φ2
      This feature set should later be refined using neuroscientific knowledge.
4. The goal here is finding an accurate weight vector w to predict the posterior result (next page).
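A short sketch of this two-feature activation, computed for several trials at once by prepending a constant bias column to the feature matrix; the feature values are placeholders:

```python
# Sketch: activation a = w0 + w1*phi1 + w2*phi2 for each trial,
# built by prepending a constant bias feature to the design matrix.
import numpy as np

features = np.array([[0.5, -0.2],      # [phi1, phi2] per trial (placeholder values)
                     [1.0,  0.3]])
Phi = np.column_stack([np.ones(len(features)), features])   # rows are [1, phi1, phi2]

w = np.array([0.001, 0.001, 0.001])    # [w0, w1, w2]
a = Phi @ w                            # activation for each trial
print(a)
```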

10 Summary of Bayesian Logistic Regression (BLR)
4. The goal here is finding an accurate weight vector w to predict the posterior result.
   - Process for calculating the w vector (w can be represented by a Gaussian):
     a. w_MAP (mean) and S_N (covariance): w_MAP can be calculated by the Newton-Raphson rule.
     b. Newton-Raphson rule: an iterative optimization scheme that minimizes the error of the weight vector.
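A minimal, self-contained sketch of the Newton-Raphson search for w_MAP under a Gaussian prior N(m0, S0), following the standard Bayesian logistic regression derivation; the data and prior values are illustrative, not the project's actual code:

```python
# Sketch: Newton-Raphson iterations for the MAP weight vector of
# Bayesian logistic regression with Gaussian prior N(m0, S0).
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def fit_w_map(Phi, t, m0, S0, n_iter=20):
    """Phi: (N, D) design matrix, t: (N,) binary targets (1=success, 0=fail)."""
    S0_inv = np.linalg.inv(S0)
    w = m0.copy()
    for _ in range(n_iter):
        y = sigmoid(Phi @ w)                         # predicted success probabilities
        grad = Phi.T @ (y - t) + S0_inv @ (w - m0)   # gradient of neg. log posterior
        R = np.diag(y * (1.0 - y))                   # logistic weighting matrix
        H = Phi.T @ R @ Phi + S0_inv                 # Hessian of neg. log posterior
        w = w - np.linalg.solve(H, grad)             # Newton-Raphson update
    return w

# Tiny illustrative run (placeholder data)
Phi = np.array([[1.0, 0.5, -0.2], [1.0, 1.0, 0.3], [1.0, -0.4, 0.8]])
t = np.array([1.0, 1.0, 0.0])
m0 = np.array([0.001, 0.001, 0.001])
S0 = np.eye(3)
print(fit_w_map(Phi, t, m0, S0))
```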

11 Summary of Bayesian Logistic Regression (BLR)
4. The goal here is finding an accurate weight vector w to predict the posterior result.
   - Process for calculating the posterior w vector:
     c. Equation of Newton's method for w_MAP (Pattern Recognition and Machine Learning, p. 208)
     d. Covariance of w
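The equations on this slide came through as an image; a hedged reconstruction from the cited derivation (PRML, Sec. 4.5) is:

```latex
% Newton-Raphson update for the MAP weight vector (Gaussian prior N(m_0, S_0)):
w^{\text{new}} = w^{\text{old}}
  - \bigl(\Phi^{T} R \Phi + S_0^{-1}\bigr)^{-1}
    \bigl[\Phi^{T}(y - t) + S_0^{-1}\bigl(w^{\text{old}} - m_0\bigr)\bigr],
\qquad R_{nn} = y_n (1 - y_n)

% Covariance of the (Laplace-approximated) posterior over w:
S_N^{-1} = S_0^{-1} + \sum_{n=1}^{N} y_n (1 - y_n)\, \phi_n \phi_n^{T}
```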

12 Summary of Bayesian Logistic Regression (BLR)
4. The goal here is finding an accurate weight vector w to predict the posterior result.
   - Process for calculating the w vector:
     e. Finally, we get the distribution of the posterior w.
5. Get the posterior probability of the given data using the posterior w.
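In the same notation, the Laplace-approximated posterior and the resulting approximate predictive probability (probit approximation, PRML Sec. 4.5.2) can be written as:

```latex
% Laplace approximation to the posterior over w:
q(w) = \mathcal{N}\!\bigl(w \mid w_{\text{MAP}},\, S_N\bigr)

% Approximate predictive probability of success for a new feature vector \phi:
\mu_a = w_{\text{MAP}}^{T}\phi, \qquad \sigma_a^{2} = \phi^{T} S_N \phi,
\qquad p(C_1 \mid \phi, \mathbf{t}) \approx
  \sigma\!\left(\frac{\mu_a}{\sqrt{1 + \pi\sigma_a^{2}/8}}\right)
```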

13 Summary of Bayesian Logistic Regression (BLR)
5. Get the posterior probability of the given data using the posterior w (derivation).
   - The derivation can be found in the Pattern Recognition and Machine Learning book.
   - I also attached slides from Srihari's lecture notes.
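A small Python sketch of that approximate predictive probability; w_map, S_N, and phi below are placeholder values, not results from the project:

```python
# Sketch: approximate predictive p(C1 | phi, data) under the Laplace posterior
# N(w_map, S_N), using the probit-based approximation sigma(kappa * mu_a).
import numpy as np

def predictive_success_prob(phi, w_map, S_N):
    mu_a = w_map @ phi                      # mean of the activation a = w^T phi
    var_a = phi @ S_N @ phi                 # variance of the activation
    kappa = 1.0 / np.sqrt(1.0 + np.pi * var_a / 8.0)
    return 1.0 / (1.0 + np.exp(-kappa * mu_a))

# Placeholder posterior and feature vector
w_map = np.array([0.2, 0.5, -0.1])
S_N   = 0.1 * np.eye(3)
phi   = np.array([1.0, 0.5, -0.2])
print(predictive_success_prob(phi, w_map, S_N))
```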

14-16 Summary of Bayesian Logistic Regression (BLR)
(Slides 14-16 contain only the attached derivation figures from Srihari's lecture notes; no transcript text.)

17 Summary of Bayesian Logistic Regression (BLR)
6. How do I apply BLR to the project?
   a. Initial weight vector = [0.001, 0.001, 0.001]
   b. Initial covariance matrix = [1 0 0; 0 1 0; 0 0 1] (3x3 identity)
   - Diagram: day-1 data and the prior weight distribution go through pattern analysis to give a new weight distribution; day-2 data then update it again (sequential update, see the sketch below).
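A compact, self-contained sketch of the sequential update this slide and the diagram describe: start from the stated prior (weights [0.001, 0.001, 0.001], identity covariance), fit on day-1 data, then reuse the resulting Gaussian posterior as the prior for day-2; the per-day data here are placeholders:

```python
# Sketch: day-by-day Bayesian update of the weight distribution.
# The posterior N(w_map, S_N) from one day becomes the prior for the next.
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def update(Phi, t, m0, S0, n_iter=20):
    """One day's update: return (w_map, S_N) given prior N(m0, S0)."""
    S0_inv = np.linalg.inv(S0)
    w = m0.copy()
    for _ in range(n_iter):                              # Newton-Raphson for w_map
        y = sigmoid(Phi @ w)
        grad = Phi.T @ (y - t) + S0_inv @ (w - m0)
        H = Phi.T @ np.diag(y * (1 - y)) @ Phi + S0_inv
        w = w - np.linalg.solve(H, grad)
    y = sigmoid(Phi @ w)
    S_N = np.linalg.inv(Phi.T @ np.diag(y * (1 - y)) @ Phi + S0_inv)
    return w, S_N

w, S = np.array([0.001, 0.001, 0.001]), np.eye(3)        # initial prior from the slide
days = [  # placeholder (Phi, t) pairs standing in for day-1 and day-2 data
    (np.array([[1.0, 0.5, -0.2], [1.0, 1.0, 0.3]]), np.array([1.0, 0.0])),
    (np.array([[1.0, 0.2,  0.4], [1.0, -0.3, 0.6]]), np.array([1.0, 1.0])),
]
for Phi, t in days:
    w, S = update(Phi, t, w, S)                          # posterior becomes next prior
print(w)
```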

18 Summary of Bayesian Logistic Regression (BLR)
7. Results (figure: predicted outcomes)

