Transfer and Multitask Learning
Steve Clanton

Multiple Tasks and Generalization
- "The ability of a system to recognize and apply knowledge and skills learned in previous tasks to novel tasks."
- "Aims to improve the learning of the target predictive function using knowledge" from a related learning task or domain.
- "Solve learning tasks [better] using information gained from solving related tasks."

What are related tasks? (different domains)
Example 1: cross-company software defect prediction
- Projects could be nearly identical but use different metrics
- Projects could use the same metrics but have very different distributions

What are related tasks? (different tasks)
- Example 3: regression, multi-class classification, and binary classification
- Example 4: predicting driving direction, predicting the position of lane lines on the road

Different Types of Tasks

How to Transfer Domain Knowledge
- Instance transfer (e.g. instance weighting)
- Feature-representation transfer
- Parameter transfer
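A minimal sketch of instance transfer under a simple density-ratio assumption (the function names here are illustrative, not from the slides): source examples are re-weighted by an estimate of p_target(x) / p_source(x), and those weights then feed an ordinary weighted least-squares fit.

```python
import numpy as np

def importance_weights(x_src, x_tgt, bandwidth=0.5):
    """Crude density-ratio estimate p_tgt(x) / p_src(x) at each source
    point, using Gaussian kernel density estimates."""
    def kde(points, where):
        d = (where[:, None] - points[None, :]) / bandwidth
        return np.exp(-0.5 * d ** 2).mean(axis=1)
    p_src = kde(x_src, x_src)
    p_tgt = kde(x_tgt, x_src)
    return p_tgt / np.maximum(p_src, 1e-12)

def weighted_linear_fit(x, y, w):
    """Solve the weighted least-squares normal equations for y ~ a*x + b."""
    X = np.column_stack([x, np.ones_like(x)])
    return np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))

# Source data concentrated near x = -2, target data near x = +2:
# the weights up-weight the few source points that resemble the target.
rng = np.random.default_rng(0)
x_src = np.concatenate([rng.normal(-2, 0.5, 100), rng.normal(2, 0.5, 20)])
x_tgt = rng.normal(2, 0.5, 50)
w = importance_weights(x_src, x_tgt)
```

In practice the density-ratio estimate is the hard part; the point of the sketch is only that instance transfer reduces to re-weighting source examples before running an otherwise standard learner.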

How to Transfer Knowledge
Example: the pneumonia study (a backpropagation net trained on several related prediction tasks at once; see Caruana, 1997)
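One way to read the pneumonia example is as hard parameter sharing: a single hidden layer feeds several task-specific output heads, so the gradient from every task shapes the shared representation. A toy numpy sketch (the architecture, sizes, and names are illustrative assumptions, not the actual study's model):

```python
import numpy as np

class MultitaskNet:
    """Tiny backprop net: one shared tanh hidden layer, two linear heads."""

    def __init__(self, n_in, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0, 0.5, (n_in, n_hidden))
        self.w_main = rng.normal(0, 0.5, n_hidden)
        self.w_aux = rng.normal(0, 0.5, n_hidden)

    def forward(self, X):
        self.h = np.tanh(X @ self.W1)          # shared representation
        return self.h @ self.w_main, self.h @ self.w_aux

    def step(self, X, y_main, y_aux, lr=0.05):
        p_main, p_aux = self.forward(X)
        e_main, e_aux = p_main - y_main, p_aux - y_aux
        n = len(X)
        # Per-head gradients of the squared error
        g_main = self.h.T @ e_main / n
        g_aux = self.h.T @ e_aux / n
        # The shared layer receives gradient signal from BOTH tasks
        dh = np.outer(e_main, self.w_main) + np.outer(e_aux, self.w_aux)
        gW1 = X.T @ (dh * (1 - self.h ** 2)) / n
        self.w_main -= lr * g_main
        self.w_aux -= lr * g_aux
        self.W1 -= lr * gW1
        return (e_main ** 2).mean() + (e_aux ** 2).mean()

# Both targets depend on the same latent feature of X, so the auxiliary
# head pushes the shared layer toward a representation the main task needs.
rng = np.random.default_rng(1)
X = rng.normal(size=(60, 3))
y_main = np.tanh(1.5 * X[:, 0])
y_aux = 0.5 * np.tanh(1.5 * X[:, 0])
net = MultitaskNet(3, 8)
losses = [net.step(X, y_main, y_aux) for _ in range(300)]
```

Dropping the auxiliary head recovers ordinary single-task backprop; the only multitask ingredient is that `dh` sums error signals from both heads before flowing into the shared weights.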

How to Transfer Domain Knowledge
- Backpropagation nets
- K-nearest neighbor
- Kernel regression
- Decision trees
All of these work off an inductive bias.
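Each of these learners encodes a different bias; k-nearest neighbor's, for example, is that points close together in feature space share labels. A minimal sketch (the helper name is hypothetical):

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Majority label among the k training points nearest to x (Euclidean)."""
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(dists)[:k]
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]

X_train = np.array([[0.0, 0.0], [0.0, 1.0], [5.0, 5.0], [6.0, 5.0]])
y_train = np.array([0, 0, 1, 1])
```

For a query like `[0.2, 0.4]`, two of the three nearest neighbors carry label 0, so the locality bias alone decides the prediction; with an even k, ties would need an explicit tie-breaking rule.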

Benefits: What's better?
- Potential to handle noise through averaging (learn with less data)
- Potential to discover an important latent feature
- Eavesdropping: tell the learner a concept is important
- Representation bias: add stability

References
- Pan, S. J. & Yang, Q. (2010). "A Survey on Transfer Learning." IEEE Transactions on Knowledge and Data Engineering, vol. 22, no. 10.
- Caruana, R. (1997). "Multitask Learning." Machine Learning, vol. 28, no. 1.
- Hassan Mahmud, M. M. (2009). "On universal transfer learning." Theoretical Computer Science, vol. 410, no. 19.