Understanding Business Intelligence with Neural Networks James McCaffrey Microsoft Research Labs Wednesday, May 4, 2016 3:15 – 4:00 PM Room Breakers CD.

Presentation transcript:

Understanding Business Intelligence with Neural Networks James McCaffrey Microsoft Research Labs Wednesday, May 4, 2016, 3:15 – 4:00 PM, Room Breakers CD

Agenda and Goals
What is a neural network?
How do neural networks work?
How can you create a neural network?
Q & A

What is a Neural Network?
Training data
Independent variables / predictors / signals / attributes / features / X-values
"The thing to classify (predict)" / label / dependent variable / Y

How Do Neural Networks Work?
[Diagram: example inputs 35, 49,000, M, Republican flow through input, hidden, and output layers to an output of 0.62 ("High").]
Key terms: weights and biases

How Do Neural Networks Work?
[Same diagram as the previous slide: inputs 35, 49,000, M, Republican pass through input, hidden, and output layers to produce 0.62 ("High").]
Key terms: encoding, normalization, weights and biases
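A minimal sketch of what the encoding and normalization step might look like for the example inputs above. This is my own illustration; the one-hot scheme, the list of party values, and the scaling divisors are assumptions, not details from the talk.

```python
def encode_and_normalize(age, income, sex, party):
    """Turn one raw record (35, 49000, 'M', 'Republican') into numeric,
    roughly same-scale values a neural network can consume."""
    # Normalization: rescale numeric features so no single feature dominates.
    # The divisors are illustrative choices, not values from the talk.
    norm_age = age / 100.0            # e.g. 35 -> 0.35
    norm_income = income / 100000.0   # e.g. 49000 -> 0.49

    # Encoding: map categorical features to numbers (simple one-hot style).
    sex_code = [1.0, 0.0] if sex == "M" else [0.0, 1.0]
    parties = ["Republican", "Democrat", "Other"]   # assumed category set
    party_code = [1.0 if party == p else 0.0 for p in parties]

    return [norm_age, norm_income] + sex_code + party_code

print(encode_and_normalize(35, 49000, "M", "Republican"))
# [0.35, 0.49, 1.0, 0.0, 1.0, 0.0, 0.0]
```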

Under the Hood
1) Weighted sum of the inputs: (0.1)(4.0) + (0.2)(-5.0) + (0.3)(6.0) = 1.2
2) Add the bias: 1.2 + 2.0 = 3.2
3) Apply the activation function: Activation(3.2) = the node's local output
Key terms: perceptron, weights and biases, activation function
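As a sketch, the three steps above fit in a few lines of code. The bias of 2.0 is inferred from the 1.2 -> 3.2 step on the slide, and the choice of tanh as the activation is an assumption; the slide does not say which function it uses.

```python
import math

def node_output(inputs, weights, bias):
    """Weighted sum of inputs, plus bias, passed through an activation function."""
    weighted_sum = sum(x * w for x, w in zip(inputs, weights))  # step 1
    weighted_sum += bias                                        # step 2
    return math.tanh(weighted_sum)                              # step 3 (tanh assumed)

# Reproduces the slide's arithmetic: (0.1)(4.0) + (0.2)(-5.0) + (0.3)(6.0) = 1.2
print(node_output(inputs=[4.0, -5.0, 6.0], weights=[0.1, 0.2, 0.3], bias=2.0))
# tanh(3.2) is roughly 0.997
```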

Activation Functions and Why You Don't Care
Logistic sigmoid: output in [0, 1]; y = 1.0 / (1.0 + e^(-x))
Hyperbolic tangent: output in [-1, +1]; y = tanh(x) = (e^x - e^(-x)) / (e^x + e^(-x))
Softmax: outputs in [0, 1] that sum to 1.0; y_i = e^(x_i) / Σ_j e^(x_j)
Key term: activation function
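For concreteness, a hedged sketch of these three functions in Python. The max-subtraction inside softmax is a standard numerical-stability trick I've added; it is not something from the slide.

```python
import math

def logistic_sigmoid(x):
    """Squashes any real number into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def hyperbolic_tangent(x):
    """Squashes any real number into (-1, +1)."""
    return math.tanh(x)

def softmax(xs):
    """Maps a list of real numbers to values in (0, 1) that sum to 1.0."""
    m = max(xs)                               # subtract the max to avoid overflow
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

print(logistic_sigmoid(3.2))     # ~0.961
print(hyperbolic_tangent(3.2))   # ~0.997
print(softmax([1.0, 2.0, 3.0]))  # ~[0.090, 0.245, 0.665]
```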

Training and Free Parameters – No Free Lunch
Number of weight and bias values to determine: (n_i * n_h) + (n_h * n_o) + (n_h + n_o)
Example: n_i = 10, n_h = 20, n_o = 3 gives (10 * 20) + (20 * 3) + (20 + 3) = 283
Key terms: back-propagation, free parameters
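The slide's count can be captured in a one-line helper (my own sketch; the formula itself is the one shown above).

```python
def num_free_parameters(n_input, n_hidden, n_output):
    """Total weights and biases in a fully connected input-hidden-output network."""
    return (n_input * n_hidden) + (n_hidden * n_output) + (n_hidden + n_output)

print(num_free_parameters(10, 20, 3))  # (10*20) + (20*3) + (20+3) = 283
```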

Research vs. Reality
Key terms: back-propagation, free parameters

Over-Fitting: The Biggest Challenge
Key terms: over-fitting, error & accuracy
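One common way to catch over-fitting is to hold out part of the data and compare training accuracy against test accuracy; a large gap suggests the network has memorized the training set rather than learned from it. A minimal sketch, with illustrative function names that are not from the talk:

```python
import random

def train_test_split(items, test_fraction=0.2, seed=0):
    """Shuffle and split the data so accuracy can be measured on unseen items."""
    items = list(items)
    random.Random(seed).shuffle(items)
    n_test = int(len(items) * test_fraction)
    return items[n_test:], items[:n_test]   # (training set, test set)

def accuracy(predicted, actual):
    """Fraction of items whose predicted class matches the actual class."""
    correct = sum(p == a for p, a in zip(predicted, actual))
    return correct / len(actual)

# If accuracy on the training set is much higher than accuracy on the
# held-out test set, the model is likely over-fitting.
```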

Four Ways to Actually Create a Neural Network
1) Use an existing application tool (Weka, Azure ML)
2) Hire a vendor company
3) Use a library/API to create a custom system
4) Create a custom system from scratch

Weka

Azure Machine Learning

Neural Network from Scratch
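A heavily simplified sketch of the feed-forward part of a from-scratch network, under the assumption of a fully connected input-hidden-output structure with tanh hidden activation and softmax output (consistent with the earlier slides, but the code itself is my illustration, not the demo shown in the talk). Training with back-propagation is omitted.

```python
import math
import random

def make_network(n_input, n_hidden, n_output, seed=0):
    """Create a fully connected input-hidden-output network with small random weights."""
    rnd = random.Random(seed)
    w = lambda: rnd.uniform(-0.01, 0.01)
    return {
        "ih_w": [[w() for _ in range(n_hidden)] for _ in range(n_input)],
        "h_b":  [w() for _ in range(n_hidden)],
        "ho_w": [[w() for _ in range(n_output)] for _ in range(n_hidden)],
        "o_b":  [w() for _ in range(n_output)],
    }

def feed_forward(net, xs):
    """Compute outputs: tanh at the hidden layer, softmax at the output layer."""
    hidden = [math.tanh(sum(xs[i] * net["ih_w"][i][j] for i in range(len(xs))) + net["h_b"][j])
              for j in range(len(net["h_b"]))]
    raw = [sum(hidden[j] * net["ho_w"][j][k] for j in range(len(hidden))) + net["o_b"][k]
           for k in range(len(net["o_b"]))]
    m = max(raw)
    exps = [math.exp(v - m) for v in raw]
    return [e / sum(exps) for e in exps]   # probabilities that sum to 1.0

net = make_network(n_input=10, n_hidden=20, n_output=3)   # the 10-20-3 example from earlier
print(feed_forward(net, [0.1] * 10))
```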

Summary
If your data can be put into a spreadsheet, then a neural network can be used to predict any of the columns.
You don't need to know how to implement a neural network, but you do need to know the 10 key vocabulary terms so you can communicate.
The 10 key terms: perceptron, features, encoding, normalization, weights & biases, activation function, back-propagation, free parameters, over-fitting, error & accuracy.
Neural network tools for Big Data are not quite ready for prime time.

Resources
WEKA Step-by-Step:
Technical Bible: ftp://ftp.sas.com/pub/neural/FAQ.html
Python Implementation:

Thank You! Understanding Business Intelligence with Neural Networks James McCaffrey Microsoft Research Labs Wednesday, May 4, 2016, 3:15 – 4:00 PM, Room Breakers CD