Machine Learning Methods
– Maximum entropy: Maxent is an example
– Boosting: Boosted Regression Trees
– Neural Networks

Machine Learning
– A branch of artificial intelligence
– Supervised learning: algorithms that "learn" from data
– Deals with representation and generalization
– Generalization: the model can operate on data that has not been seen before
– Rigor is provided by computational learning theory
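For concreteness, here is a minimal sketch (not from the original slides) of the supervised-learning workflow, assuming scikit-learn is available; the synthetic dataset and linear model are placeholders, and the held-out test set stands in for "data that has not been seen before".

```python
# Minimal supervised-learning sketch: fit on seen data, score on unseen data.
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=200, n_features=5, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LinearRegression().fit(X_train, y_train)     # "learn" from data
print("train R^2:", model.score(X_train, y_train))   # fit to seen data
print("test  R^2:", model.score(X_test, y_test))     # generalization to unseen data
```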

Issues
Methods we'll examine:
– Work much like linear regression but can produce far more complex models
– Fitting algorithms are "hidden"
– Parameters are harder to access and examine
– Tend to produce better "model fits"
– Tend to overfit

[Figure: Tamarisk, temperature and precipitation; model with 162 parameters]

Boosting
– Uses "weak learners" together to make a "stronger learner"
Gradient boosting:
– For regression problems
– Uses an "ensemble" of weak prediction models
Gradient tree boosting:
– Uses many tiny regression trees to build more complex models
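As a rough illustration of the idea (not code from the slides), the loop below sketches gradient boosting for regression with squared error: each small tree is fit to the residuals of the current ensemble, and its prediction is added with a small learning rate. NumPy and scikit-learn are assumed; the data, number of trees, learning rate, and tree depth are arbitrary illustration choices.

```python
# Gradient-boosting sketch: each weak learner fits the current residuals.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(300, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.2, size=300)

learning_rate = 0.1
trees = []
prediction = np.full_like(y, y.mean())          # start from the mean

for _ in range(100):                            # 100 weak learners
    residuals = y - prediction                  # negative gradient for squared error
    tree = DecisionTreeRegressor(max_depth=1).fit(X, residuals)
    trees.append(tree)
    prediction += learning_rate * tree.predict(X)

def predict(X_new, base=y.mean()):
    """Ensemble prediction: base value plus the shrunken sum of all trees."""
    out = np.full(len(X_new), base)
    for tree in trees:
        out += learning_rate * tree.predict(X_new)
    return out
```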

Boosted Regression Trees
– The "weak" learners are individual binary trees with three nodes
– There can be thousands of trees!
– The trees are hidden within the model
– Now we can really overfit our model!
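A hedged sketch of fitting boosted regression trees with scikit-learn's GradientBoostingRegressor, one common implementation (not necessarily the one used for the examples that follow). Setting max_leaf_nodes=3 approximates the three-node trees described above; the dataset and hyper-parameters are placeholders, and a large gap between train and test scores is the over-fitting the slide warns about.

```python
# Boosted regression trees: thousands of tiny trees, easy to overfit.
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=500, n_features=10, noise=15.0, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

brt = GradientBoostingRegressor(
    n_estimators=2000,        # thousands of tiny trees
    max_leaf_nodes=3,         # each weak learner is a small three-node tree
    learning_rate=0.05,
)
brt.fit(X_train, y_train)

print("train R^2:", brt.score(X_train, y_train))
print("test  R^2:", brt.score(X_test, y_test))   # watch the train/test gap
```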

Boosted Regression Trees example: relative dominance of black spruce vs. deciduous trees in post-fire Alaska

Brown Shrimp BRT Model

A Neuron (figure: Wikipedia)

Neural Network
Basically:
– Neurons "sum" charge from other neurons
– When the charge exceeds a threshold, the neuron turns "on" and sends a signal to other neurons
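A minimal sketch of this threshold model of a neuron, assuming a simple step activation; the weights and threshold values are arbitrary illustration numbers, not from the slides.

```python
# A single artificial neuron: weighted sum of inputs, "on" above a threshold.
import numpy as np

def neuron(inputs, weights, threshold):
    """Return 1 ("on") if the weighted sum of inputs exceeds the threshold, else 0."""
    activation = np.dot(inputs, weights)
    return 1 if activation > threshold else 0

# Example: two inputs feeding one neuron.
print(neuron(np.array([0.6, 0.9]), np.array([0.5, 0.5]), threshold=0.7))  # -> 1 (0.75 > 0.7)
print(neuron(np.array([0.2, 0.3]), np.array([0.5, 0.5]), threshold=0.7))  # -> 0 (0.25 < 0.7)
```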

Artificial neural network (figure: AIDA)

Artificial Neural Networks
Advantages:
– Very flexible
– Can model "fuzzy" problems
– Successes in simple visual recognition and some expert systems
Disadvantages:
– The model is hidden
– Can be slow
– Have not been able to solve a wide range of problems

Expert Systems
– Attempt to capture "expertise"
– Originally part of the promise of neural networks
– Now largely driven off very large databases
– WebMD was one attempt
– Ask Jeeves (ask.com) was another