Prediction of NBA games based on Machine Learning Methods

Presentation transcript:

Prediction of NBA games based on Machine Learning Methods
University of Wisconsin-Madison, ECE/CS/ME 539 Introduction to Artificial Neural Networks and Fuzzy Systems
Renato Amorim Torres, December 2013

Proposal
The objective of the project is to predict the winning team of an NBA game. The goal is to achieve a prediction rate higher than that of the very naive majority-vote classifier. This baseline looks at all previous games (in the season) of the two teams and picks the team with the fewer losses as the winner.
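The slides do not show code for this baseline; a minimal MATLAB sketch of the decision rule, assuming lossesA and lossesB hold the two teams' season-to-date loss counts, could look like this:

% Sketch of the naive majority-vote baseline (assumed inputs: season-to-date
% loss counts of team A and team B; the tie-breaking rule is an assumption).
function winner = naiveMajorityVote(lossesA, lossesB)
    if lossesA <= lossesB
        winner = 'A';   % team A has no more losses than team B
    else
        winner = 'B';   % team B has fewer losses
    end
end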

Preparation of the Data
Data was obtained from the website www.basketball-reference.com.

Preparation of the Data
Box scores were copied into a spreadsheet. Using a macro, all team names were replaced by numbers, the unnecessary columns were deleted, and .txt files were generated with the data. The .txt files were then loaded into MATLAB and used to generate the feature vectors and implement the methods.
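The loading step itself is not shown in the slides; a minimal sketch, assuming a hypothetical file name and column layout (home team id, visitor team id, home points, visitor points), could be:

% Load one season of box scores from a .txt file produced by the macro.
% The file name and column order are assumptions for illustration only.
games    = load('season_2012.txt');   % numeric matrix, one row per game
homeId   = games(:, 1);               % home team number
visitId  = games(:, 2);               % visitor team number
homePts  = games(:, 3);               % home team points
visitPts = games(:, 4);               % visitor team points
homeWon  = homePts > visitPts;        % logical label: 1 if the home team won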

Data Analysis
[Figure: blue lines show the 2012 to 2006 regular seasons; the red line shows the mean over seasons]

Feature Vectors
- Win-loss percentage of both teams
- Point differential per game of both teams
- Win-loss percentage in the previous 8 games for both teams
- Visitor team win-loss percentage as visitor
- Home team win-loss percentage at home
Total of eight features (a sketch of assembling such a vector follows below).
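The extraction code is not in the slides; a minimal sketch of building one game's eight-dimensional feature vector, assuming the per-team statistics have already been computed from games played before the one being predicted, might be:

% Hypothetical sketch: build the 8-feature vector for a single game.
% Every input is assumed to be computed only from earlier games.
x = [ homeWinPct;      % home team win-loss percentage
      visitWinPct;     % visitor team win-loss percentage
      homePtDiff;      % home team point differential per game
      visitPtDiff;     % visitor team point differential per game
      homeLast8Pct;    % home team win-loss percentage, previous 8 games
      visitLast8Pct;   % visitor team win-loss percentage, previous 8 games
      homeAtHomePct;   % home team win-loss percentage at home
      visitAwayPct ];  % visitor team win-loss percentage as visitor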

Feature Vectors
Win-loss percentage in the previous 8 games of both teams: why eight games?
[Figure: blue lines show the 2012 to 2006 regular seasons; the red line shows the mean over seasons]

Prediction
Based on the graph shown before, it was found that each team should have played at least eight games before its games are predicted. From this starting point, the naive majority vote rate was calculated in order to define the goal (a sketch of this computation follows below):

Naive Majority Vote Rate from the 9th game
Data: 2012 2011 2010 2009 2008 2007 2006 Mean
Rate (%): 61.83 65.94 64.75 66.30 63.86 65.81
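The rate computation is not shown on the slide; a minimal sketch, reusing the assumed data layout from the preparation step and counting only games in which both teams have already played at least eight games, could be:

% Sketch: naive majority-vote prediction rate from each team's 9th game on.
% games/homeId/visitId/homeWon follow the assumed layout shown earlier.
nTeams      = 30;
gamesPlayed = zeros(nTeams, 1);   % games played so far, per team
losses      = zeros(nTeams, 1);   % losses so far, per team
correct = 0;  counted = 0;
for g = 1:size(games, 1)
    h = homeId(g);  v = visitId(g);
    if gamesPlayed(h) >= 8 && gamesPlayed(v) >= 8
        predictHome = losses(h) <= losses(v);            % fewer losses wins
        correct = correct + (predictHome == homeWon(g));
        counted = counted + 1;
    end
    gamesPlayed([h v]) = gamesPlayed([h v]) + 1;
    losses(h) = losses(h) + ~homeWon(g);
    losses(v) = losses(v) +  homeWon(g);
end
rate = 100 * correct / counted;   % percentage, comparable to the table above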

Prediction: Maximum Likelihood Classifier
Not all of the features were used. In order to establish the best features, code was implemented to check the prediction rate for all combinations of features (a sketch of such a search follows below). Features used:
- Feature #2: home team win-loss percentage at home
- Feature #4: home team total win-loss percentage
- Feature #5: visitor team point differential per game in the current season
- Features #7/#8: win-loss percentage in the last 8 games for both teams
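The search code is not reproduced on the slide; a minimal sketch of an exhaustive search over feature subsets, assuming a hypothetical helper evalSubset(idx) that trains and tests the classifier on the selected feature columns and returns a prediction rate in percent, could be:

% Hypothetical exhaustive search over all non-empty subsets of the 8 features.
bestRate = 0;  bestIdx = [];
for mask = 1:(2^8 - 1)
    idx  = find(bitget(mask, 1:8));   % feature indices selected by this mask
    rate = evalSubset(idx);           % assumed helper: train/test on idx only
    if rate > bestRate
        bestRate = rate;
        bestIdx  = idx;
    end
end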

Prediction: Maximum Likelihood Classifier
The training data was a combination of the previous seasons' games, and the testing data was random games from the “current” season. The likelihood classifier achieved the following results (a sketch of one possible implementation follows below):

Likelihood Classifier
Testing Data: 2012 2011 2010 2009 2008 2007 2006 Mean
Rate (%): 67.31 64.37 67.88 67.43 69.43 65.05 67.42
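The slides do not specify the exact density model; a minimal sketch of one common maximum likelihood classifier, Gaussian class-conditional densities over the selected features (the Gaussian assumption and variable names are mine, and mvnpdf requires the Statistics Toolbox), could be:

% Sketch: maximum likelihood classification with Gaussian class models.
% Xtrain (N x d) holds training feature vectors, ytrain (N x 1) is 1 for a
% home win and 0 otherwise; Xtest/ytest are the held-out games.
mu1 = mean(Xtrain(ytrain == 1, :));   Sigma1 = cov(Xtrain(ytrain == 1, :));
mu0 = mean(Xtrain(ytrain == 0, :));   Sigma0 = cov(Xtrain(ytrain == 0, :));

p1 = mvnpdf(Xtest, mu1, Sigma1);      % likelihood under the "home win" model
p0 = mvnpdf(Xtest, mu0, Sigma0);      % likelihood under the "home loss" model
yhat = p1 > p0;                       % pick the class with higher likelihood
rate = 100 * mean(yhat == ytest);     % prediction rate in percent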

Prediction: Maximum Likelihood Classifier Results

Testing Data: 2012 2011 2010 2009 2008 2007 2006 Mean
Rate (%): 67.31 64.37 67.88 67.43 69.43 65.05 67.42

Naive Majority Vote Rate from the 8th game
Data: 2012 2011 2010 2009 2008 2007 2006 Mean
Rate (%): 61.83 65.94 64.75 66.30 63.86 65.81

For comparison, a table extracted from Matthew Beckler, Hongfei Wang, and Michael Papamichael, "NBA Oracle".

Prediction: Linear Regression / LMS Algorithm
In order to find the weights, the LMS algorithm was implemented:
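The slide's update formula does not survive in the transcript; a minimal sketch of the standard LMS weight update, with an assumed learning rate and epoch count, could be:

% Sketch of the standard LMS (least mean squares) weight update.
% X (N x d) holds the feature vectors and y (N x 1) the desired outputs
% (e.g. 1 for a home win, 0 for a home loss); mu is an assumed learning rate.
mu = 0.01;
w  = zeros(size(X, 2), 1);             % initial weights
for epoch = 1:50
    for n = 1:size(X, 1)
        e = y(n) - X(n, :) * w;        % prediction error for one training game
        w = w + mu * e * X(n, :)';     % LMS update: move weights along x
    end
end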