Stream flow rate prediction: x = weather data, f(x) = flow rate.

Similar presentations
Active Reading: “Scientific Processes”

Learning Procedural Planning Knowledge in Complex Environments Douglas Pearson March 2004.
On Sample Selection Bias and Its Efficient Correction via Model Averaging and Unlabeled Examples Wei Fan Ian Davidson.
ReverseTesting: An Efficient Framework to Select Amongst Classifiers under Sample Selection Bias Wei Fan IBM T.J.Watson Ian Davidson SUNY Albany.
When Efficient Model Averaging Out-Perform Bagging and Boosting Ian Davidson, SUNY Albany Wei Fan, IBM T.J.Watson.
On / By / With The building blocks of the Mplus language.
Intelligent Information Technology Research Lab, Acadia University, Canada 1 Getting a Machine to Fly Learn Extending Our Reach Beyond Our Grasp Daniel.
Teaching Machines to Learn by Metaphors Omer Levy & Shaul Markovitch Technion – Israel Institute of Technology.
Artificial Neural Networks (1)
Test practice Multiplication. Multiplication 9x2.
Combining Inductive and Analytical Learning. Ch. 12 in Machine Learning, Tom M. Mitchell. Korea University Natural Language Processing Lab, Han Kyung-su.
Graphical Multi-Task Learning Dan Sheldon Cornell University NIPS SISO Workshop 12/12/2008.
1 Version 3 Module 8 Ethernet Switching. 2 Version 3 Ethernet Switching Ethernet is a shared media –One node can transmit data at a time More nodes increases.
ETHEM ALPAYDIN © The MIT Press, Lecture Slides for.
Bayesian Learning Rong Jin. Outline MAP learning vs. ML learning Minimum description length principle Bayes optimal classifier Bagging.
Competitive Networks. Outline Hamming Network.
Declarative Learning. When a voluntary action is performed in response to a perceptual input A causal structural description is formed Perceptual representation.
Machine Learning CSE 473. © Daniel S. Weld Topics Agency Problem Spaces Search Knowledge Representation Reinforcement Learning InferencePlanning.
Human memory has two channels for processing information : visual & auditory Cognitive Theory How Do People Learn? Human memory has a limited capacity.
Classification of Music According to Genres Using Neural Networks, Genetic Algorithms and Fuzzy Systems.
Neural Networks Lab 5. What Is Neural Networks? Neural networks are composed of simple elements( Neurons) operating in parallel. Neural networks are composed.
INTRODUCTION TO Machine Learning ETHEM ALPAYDIN © The MIT Press, Lecture Slides for.
Learning from Multiple Outlooks Maayan Harel and Shie Mannor ICML 2011 Presented by Minhua Chen.
Radial Basis Function Networks
Training and future (test) data follow the same distribution, and are in same feature space.
Comp 5013 Deep Learning Architectures Daniel L. Silver March,
Drones Collecting Cell Phone Data in LA AdNear had already been using methods.
COMP3503 Intro to Inductive Modeling
CS 478 – Tools for Machine Learning and Data Mining The Need for and Role of Bias.
1 st Neural Network: AND function Threshold(Y) = 2 X1 Y X Y.
Machine Learning CSE 681 CH2 - Supervised Learning.
A PCB has 2 signal layers, a split power layer and one ground layer. It also has two solder mask and two silkscreen layers. Is the board classified as:
Test Review Magnetism Electricity. Format Mostly Fill in 10 multiple choice 4 Diagrams.
Transfer Learning Task. Problem Identification Dataset : A Year: 2000 Features: 48 Training Model ‘M’ Testing 98.6% Training Model ‘M’ Testing 97% Dataset.
@delbrians Transfer Learning: Using the Data You Have, not the Data You Want. October, 2013 Brian d’Alessandro.
Modern Topics in Multivariate Methods for Data Analysis.
Transfer Learning Motivation and Types Functional Transfer Learning Representational Transfer Learning References.
INTRODUCTION TO MACHINE LEARNING 3RD EDITION ETHEM ALPAYDIN © The MIT Press, Lecture.
P1801 – Merging of Power Domains Gary Delp. 2 Draft - proposal provided to P1801 (and other groups) by LSI - #include LSI Confidential Purpose Provide.
HAITHAM BOU AMMAR MAASTRICHT UNIVERSITY Transfer for Supervised Learning Tasks.
Neural Network Basics Anns are analytical systems that address problems whose solutions have not been explicitly formulated Structure in which multiple.
Image Classification for Automatic Annotation
Chapter 1 Vocabulary Quiz. Multiple Choice 1. A primary source is… A: information gathered by someone who did not take part or witness an event B: the.
Intelligent Information Technology Research Lab, Acadia University, Canada 1 Machine Lifelong Learning: Inductive Transfer with Context- sensitive Neural.
Artificial Neural Networks (ANN). Artificial Neural Networks First proposed in 1940s as an attempt to simulate the human brain’s cognitive learning processes.
Web-Mining Agents: Transfer Learning TrAdaBoost R. Möller Institute of Information Systems University of Lübeck.
Transfer and Multitask Learning Steve Clanton. Multiple Tasks and Generalization “The ability of a system to recognize and apply knowledge and skills.
IELTS Reading Test GENERAL PRESENTATION
Network Management Lecture 13. MACHINE LEARNING TECHNIQUES 2 Dr. Atiq Ahmed Université de Balouchistan.
Combining Neural Networks and Context-Driven Search for On- Line, Printed Handwriting Recognition in the Newton Larry S. Yaeger, Brandn J. Web, and Richard.
Randomness in Neural Networks
LEARNING & MEMORY Jaan Aru
Lifelong Machine Learning and Reasoning
Plan Agents Chapter 7..
CS 9633 Machine Learning Inductive-Analytical Methods
Chapter 11: Learning Introduction
Common Furniture Arrangement
Unsupervised Learning and Autoencoders
Inductive Transfer, Machine Lifelong Learning, and AGI
Machine Learning Week 1.
Educational Qualification. Kingdom of Saudi Arabia, King Abdulaziz University.
Recovery (returning to the initial state).
A Comparative Study of Convolutional Neural Network Models with Rosenblatt’s Brain Model Abu Kamruzzaman, Atik Khatri , Milind Ikke, Damiano Mastrandrea,
Depreciation and Capital Gains: Special Issues. Tehila Sasson, Adv. (CPA), September 2015.
Revealing priors on category structures through iterated learning
Discovery Learning. References:
Basic Knowledge.
Learning Incoherent Sparse and Low-Rank Patterns from Multiple Tasks
Predicting Voter Choice from Census Data
Lecture 09: Introduction Image Recognition using Neural Networks
Presentation transcript:

Stream flow rate prediction: x = weather data, f(x) = flow rate.
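The setup on this slide (learn f from weather data x so that a model h gives h(x) ~ f(x)) can be sketched with synthetic data. All numbers and the three feature names (rainfall, temperature, snowpack) are illustrative assumptions, not the presentation's data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for x = weather data, f(x) = flow rate.
# Feature names are hypothetical, chosen only for illustration.
n = 100
X = rng.uniform(0.0, 1.0, size=(n, 3))        # rainfall, temperature, snowpack
flow = X @ np.array([5.0, -1.0, 2.0]) + rng.normal(scale=0.1, size=n)

# Learn a model h from training examples (x, f(x)) by ordinary least
# squares, so that h(x) ~ f(x) on new weather observations.
Xb = np.hstack([X, np.ones((n, 1))])          # append a bias column
w, *_ = np.linalg.lstsq(Xb, flow, rcond=None)
train_mse = float(np.mean((Xb @ w - flow) ** 2))
```

Any regression model would do here; least squares just keeps the sketch short.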

2 Transfer Learning and MTL
[Diagram: an Inductive Learning System draws Training Examples (x, f(x)) from Instance Space X and produces a Model of Classifier h (short-term memory) such that h(x) ~ f(x) on Testing Examples. Domain Knowledge (long-term memory) is built through Retention & Consolidation and fed back into learning as Inductive Bias Selection, i.e. Knowledge Transfer.]

3 Transfer Learning and MTL
[Diagram: the same system, with the learner replaced by a Multiple Task Learning (MTL) network mapping inputs x1 ... xn to outputs f1(x), f2(x), ..., fk(x). The Model of Classifier h again satisfies h(x) ~ f(x); Domain Knowledge (long-term memory) is built through Retention & Consolidation and reused via Inductive Bias Selection, i.e. Knowledge Transfer.]

4 Multiple Task Learning (MTL)
Multiple hypotheses f1(x), f2(x), ..., fk(x) develop in parallel within one back-propagation network over inputs x1 ... xn [Caruana, Baxter, Silver 93-95]. An inductive bias arises through shared use of a common internal representation: a common feature layer feeds a task-specific representation for each output. Knowledge (inductive) transfer to the primary task f1(x) depends on the choice of secondary tasks.
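The shared-representation idea on this slide can be sketched as one tiny back-propagation network in NumPy: a common feature layer (W1) serves all tasks, and each task gets its own output weights (a column of W2). The data, layer sizes, and learning rate are illustrative assumptions, not the authors' experiment:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: k = 3 related tasks f1(x)..fk(x) over n = 4 inputs x1..xn.
n_in, n_hidden, n_tasks, n_samples = 4, 8, 3, 200
X = rng.normal(size=(n_samples, n_in))
Y = np.tanh(X @ rng.normal(size=(n_in, n_tasks)))    # k related targets

W1 = rng.normal(scale=0.5, size=(n_in, n_hidden))    # common feature layer
W2 = rng.normal(scale=0.5, size=(n_hidden, n_tasks)) # task-specific outputs

mse0 = float(np.mean((np.tanh(X @ W1) @ W2 - Y) ** 2))  # error before training

lr = 0.05
for _ in range(2000):
    H = np.tanh(X @ W1)        # common internal representation
    P = H @ W2                 # one prediction column per task
    E = P - Y
    # The shared layer W1 receives error signals from every task at once;
    # this shared gradient is the mechanism of the inductive bias.
    gW2 = H.T @ E / n_samples
    gW1 = X.T @ ((E @ W2.T) * (1.0 - H ** 2)) / n_samples
    W2 -= lr * gW2
    W1 -= lr * gW1

mse = float(np.mean((np.tanh(X @ W1) @ W2 - Y) ** 2))
```

Because the hidden layer is updated by all k error signals, features useful to the secondary tasks shape the representation available to the primary task f1(x), which is why the choice of secondary tasks matters.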