CS 189 (Brian Chu). Office Hours: Cory 246, 6-7pm Mon. (hackerspace lounge). brianchu.com

Agenda
– Email me for slides
– Questions?
– Random / HW
– Why logistic regression
– Worksheet

Questions
– Any grad students? Thoughts on the final project?
– Who would be able to make my 12-1pm section? (lecture / worksheet split section)
– Questions? Concerns? Lecture pace / content / coverage?

Features: HOG, tf-idf (sklearn), bag of words, etc. (see the sketch below)
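A minimal sketch of pulling these features in Python, assuming scikit-learn and scikit-image are installed (HOG lives in skimage.feature rather than sklearn itself); the toy documents and the built-in camera image are placeholders:

    from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer
    from skimage import data
    from skimage.feature import hog

    # Toy corpus, purely for illustration.
    docs = ["the quick brown fox", "the lazy dog", "quick quick fox"]

    # Bag of words: raw term counts per document.
    bow = CountVectorizer().fit_transform(docs)

    # tf-idf: term counts reweighted by inverse document frequency.
    tfidf = TfidfVectorizer().fit_transform(docs)

    # HOG: histogram-of-oriented-gradients descriptor for an image
    # (from skimage, not sklearn). data.camera() is a built-in grayscale image.
    hog_features = hog(data.camera(), pixels_per_cell=(16, 16), cells_per_block=(2, 2))

    print(bow.shape, tfidf.shape, hog_features.shape)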

Terminology
– Shrinkage = regularization
– A variable with a hat (ŷ) → estimated/predicted
– P(Y | X) ∝ P(X | Y) P(Y), i.e. posterior ∝ likelihood × prior (worked example below)
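A tiny worked example of posterior ∝ likelihood × prior, with made-up numbers purely for illustration:

    # Hypothetical numbers for a two-class problem.
    prior = {"Y=1": 0.3, "Y=0": 0.7}          # P(Y)
    likelihood = {"Y=1": 0.8, "Y=0": 0.1}     # P(X=x | Y) for the observed x

    # Unnormalized posterior: likelihood * prior.
    unnormalized = {y: likelihood[y] * prior[y] for y in prior}

    # Normalize by the evidence P(X=x) so the posterior sums to 1.
    evidence = sum(unnormalized.values())
    posterior = {y: p / evidence for y, p in unnormalized.items()}
    print(posterior)   # {'Y=1': ~0.774, 'Y=0': ~0.226}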

Why logistic regression
Odds measure relative confidence:
– P = .9998 → odds of 4999:1
– P = .9999 → odds of 9999:1 (doubled confidence!)
– P = .5001 → .5002: odds go from about 1.0004:1 to 1.0008:1 (basically no change in confidence)
"A relative increase or decrease of a factor by one unit becomes more pronounced as the factor's absolute difference increases."
(The arithmetic is checked in the sketch below.)
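A quick check of the slide's arithmetic, using odds = P / (1 − P):

    def odds(p):
        """Odds in favor of an event with probability p."""
        return p / (1 - p)

    print(odds(0.9998))   # ~4999  -> 4999:1
    print(odds(0.9999))   # ~9999  -> 9999:1, roughly double the odds
    print(odds(0.5001))   # ~1.0004
    print(odds(0.5002))   # ~1.0008, barely any change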

Log-odds (calculations in base 10)
– Maps (0, 1) → (-∞, ∞)
– Symmetric: P = .99 → ≈ 2, P = .01 → ≈ -2
– X units of log-odds → the same change in "confidence":
  – 0.5 → 0.91 is ≈ 0 → 1
  – .999 → .9999 is ≈ 3 → 4
"Log-odds make it clear that increasing from 99.9% to 99.99% is just as hard as increasing from 50% to 91%."
Credit:
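The same numbers on the base-10 log-odds scale, reproducing the slide's calculations:

    import numpy as np

    def log_odds10(p):
        """Base-10 log-odds (logit), mapping (0, 1) onto (-inf, inf)."""
        return np.log10(p / (1 - p))

    for p in [0.01, 0.5, 0.91, 0.99, 0.999, 0.9999]:
        print(f"P = {p:<7} log-odds ≈ {log_odds10(p):+.2f}")
    # 0.01 -> -2, 0.5 -> 0, 0.91 -> ~+1, 0.99 -> +2, 0.999 -> +3, 0.9999 -> +4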

Logistic Regression
w · x = log[ P(Y=1|x) / (1 − P(Y=1|x)) ]
– Intuition: some linear combination of the features gives the log-odds that Y = 1.
– Equivalently: some linear combination of the features gives our "confidence" that Y = 1.
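A minimal sketch with scikit-learn on synthetic data: decision_function returns w · x + b, the model's log-odds (natural log in sklearn), and the sigmoid maps it back to P(Y=1|x):

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression

    # Synthetic binary classification data (placeholder).
    X, y = make_classification(n_samples=200, n_features=5, random_state=0)

    clf = LogisticRegression().fit(X, y)

    # decision_function(x) = w . x + b, i.e. the fitted log-odds.
    log_odds = clf.decision_function(X[:3])

    # The sigmoid inverts the log-odds back to P(Y=1|x).
    p_hat = 1 / (1 + np.exp(-log_odds))

    print(p_hat)
    print(clf.predict_proba(X[:3])[:, 1])   # same numbers, computed by sklearn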