Presentation transcript:

Publication Venues
Main Neural Network Conferences
– NIPS (Neural Information Processing Systems)
– IJCNN (Intl Joint Conf on Neural Networks)
Main Neural Network Journals
– Neural Networks
– Neural Computation
– IEEE Transactions on Neural Networks

Publication Venues
Main Machine Learning Conferences
– ICML (Intl Conf on Machine Learning)
– COLT (Computational Learning Theory)
Main Machine Learning Journals
– ML (Machine Learning)
– JMLR (J. Machine Learning Research)
– JAIR (J. Artificial Intelligence Research)

Underfit and Overfit
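This slide presumably paired the title with a figure. As a hedged illustration that is not from the original slides, the Python sketch below fits polynomials of three degrees to noisy samples of a sine curve: a low degree underfits (high error everywhere), while a high degree typically overfits (near-zero training error, larger held-out error).

# Illustrative sketch (not from the slides): underfitting vs. overfitting
# by fitting polynomials of increasing degree to noisy sine data.
import numpy as np

rng = np.random.default_rng(0)
x_train = rng.uniform(0, 1, 20)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.2, 20)
x_test = rng.uniform(0, 1, 200)
y_test = np.sin(2 * np.pi * x_test) + rng.normal(0, 0.2, 200)

for degree in (1, 3, 12):
    coeffs = np.polyfit(x_train, y_train, degree)            # least-squares fit
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree:2d}: train MSE {train_err:.3f}, test MSE {test_err:.3f}")
# Degree 1 underfits (both errors high); degree 12 typically overfits
# (very low training error, noticeably higher test error than degree 3).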

Need for Bias
There are 2^(2^n) possible Boolean functions of n inputs.
(Slide table: columns x1, x2, x3, Class, Possible Consistent Function Hypotheses; unlabeled class entries shown as "?".)
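To make the 2^(2^n) count concrete, here is a small sketch (my own illustration with hypothetical training rows, not content from the slides). It enumerates all 256 Boolean functions of three inputs and counts how many remain consistent with a few labeled examples; without an inductive bias, every consistent function is an equally valid way to fill in the remaining "?" entries.

# Illustration: of the 2^(2^3) = 256 Boolean functions of 3 inputs,
# how many are consistent with a few labeled training examples?
from itertools import product

inputs = list(product([0, 1], repeat=3))             # the 8 possible input rows
train = {(0, 0, 0): 0, (0, 1, 1): 1, (1, 0, 1): 1}   # hypothetical labeled rows

# A Boolean function is just an assignment of 0/1 to each of the 8 rows.
consistent = 0
for outputs in product([0, 1], repeat=len(inputs)):
    f = dict(zip(inputs, outputs))
    if all(f[x] == y for x, y in train.items()):
        consistent += 1

print(f"total functions: {2 ** (2 ** 3)}")                 # 256
print(f"consistent with 3 training rows: {consistent}")    # 2^(8-3) = 32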

No Free Lunch
Any chosen inductive bias has the same accuracy as any other bias when averaged over all possible functions (assuming all functions are equally likely). If a bias is correct on some cases, it must be incorrect on equally many others. Is this a problem?
– Random vs. Regular
– Anti-Bias (even though regular)
– The "Interesting" Problems – a subset of the learnable problems?
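A quick way to see the claim is to average two very different biases over every Boolean target function on three inputs. The sketch below is only an illustration under that uniform-target assumption, not code from the slides; both learners come out at exactly 50% off-training-set accuracy.

# Illustration of No Free Lunch: averaged over ALL Boolean targets on 3 inputs,
# two very different learners have identical off-training-set accuracy.
from itertools import product

inputs = list(product([0, 1], repeat=3))
train_x = inputs[:4]                 # the learner sees labels for these rows
test_x = inputs[4:]                  # and is scored on the remaining rows

def learner_zero(train_labels, x):
    return 0                         # bias 1: always predict class 0

def learner_majority(train_labels, x):
    # bias 2: predict the majority training class (ties go to 1)
    return int(sum(train_labels) * 2 >= len(train_labels))

def average_accuracy(learner):
    total, count = 0.0, 0
    for outputs in product([0, 1], repeat=len(inputs)):   # every possible target
        target = dict(zip(inputs, outputs))
        train_labels = [target[x] for x in train_x]
        correct = sum(learner(train_labels, x) == target[x] for x in test_x)
        total += correct / len(test_x)
        count += 1
    return total / count

print(average_accuracy(learner_zero))      # 0.5
print(average_accuracy(learner_majority))  # 0.5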

Automatic Discovery of Inductive Bias
Defining the set of Interesting/Learnable problems – No Free Lunch concepts
Defining the set of available inductive biases
  o Proposing novel learning algorithms
  o Analysis, comparison, and extension of current learning algorithms
  o Defining/discovering a set of biases which covers I (the Interesting problems)
  o Parameter-free learning algorithms – automatic selection of learning parameters
Automatically fitting a bias to a problem – overfit, underfit, noise issues, etc.
Automatic Feature Selection
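One concrete reading of "automatically fitting a bias to a problem" is cross-validated model selection: try several learners, each embodying a different bias, and keep whichever validates best. The sketch below illustrates that idea with scikit-learn; the specific models and dataset are my assumptions, not part of the slides.

# Illustrative sketch: choosing among several inductive biases by cross-validation.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier

X, y = load_breast_cancer(return_X_y=True)

candidate_biases = {
    "linear decision boundary": LogisticRegression(max_iter=5000),
    "axis-aligned splits": DecisionTreeClassifier(max_depth=4),
    "local similarity": KNeighborsClassifier(n_neighbors=5),
}

scores = {name: cross_val_score(model, X, y, cv=5).mean()
          for name, model in candidate_biases.items()}
for name, score in scores.items():
    print(f"{name}: {score:.3f}")
print(f"selected bias: {max(scores, key=scores.get)}")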

ADIB (Cont.)
Dynamic Inductive Biases
  o Pre-selection of an appropriate bias based on the application data set
  o Automatically selecting a bias during learning
  o Bias which adjusts dynamically in time during learning
  o Bias which adjusts dynamically in space during learning (different parts of the problem space are better learned with different biases, including differing parameters within one bias)
  o Combinations of the above
Combination of Biases
  o Linear and non-linear combinations of biases
  o Dynamic combinations of biases
  o Ensemble variants
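As a hedged example of the "linear combinations of biases" item, the sketch below averages the class-probability outputs of three differently biased classifiers using fixed weights chosen purely for illustration; it is not an implementation from the slides. scikit-learn's soft-voting VotingClassifier packages the same idea.

# Illustrative sketch: a fixed linear (weighted) combination of three biases.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = [LogisticRegression(max_iter=5000),
          DecisionTreeClassifier(max_depth=4),
          KNeighborsClassifier(n_neighbors=5)]
weights = np.array([0.5, 0.3, 0.2])          # assumed fixed weights, for illustration only

probs = np.zeros((len(y_te), 2))
for w, model in zip(weights, models):
    model.fit(X_tr, y_tr)
    probs += w * model.predict_proba(X_te)   # linear combination of the biases' outputs

accuracy = np.mean(probs.argmax(axis=1) == y_te)
print(f"combined-bias accuracy: {accuracy:.3f}")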