1
Transfer for Supervised Learning Tasks. Haitham Bou Ammar, Maastricht University.
2
Motivation: a model is trained on training data and then applied to test data. Assumptions: 1. Training and test data come from the same distribution. 2. Training and test data lie in the same feature space. These assumptions are not true in many real-world applications.
3
Examples: Web-document classification. A model classifies incoming documents into Physics, Machine Learning, or Life Science.
4
Examples: Web-document classification. When the content of the documents changes, the assumption is violated, and a new model must be learned.
5
Learning a new model means: 1. Collect new labeled data. 2. Build a new model. Alternative: reuse and adapt the already learned model!
6
Examples: Image classification. Model one is learned from the features of task one.
7
Examples: Image classification. Task two: classify cars vs. motorcycles. The features learned for task one are reused to build model two for task two.
8
Traditional Machine Learning vs. Transfer Learning. Traditional machine learning: a separate learning system is trained from scratch for each of the different tasks. Transfer learning: knowledge extracted from the source task is passed to the learning system of the target task.
9
Questions?
10
Transfer Learning Definition. Notation: a domain $\mathcal{D} = \{\mathcal{X}, P(X)\}$ consists of a feature space $\mathcal{X}$ and a marginal probability distribution $P(X)$, with $X = \{x_1, \ldots, x_n\} \in \mathcal{X}$. Given a domain $\mathcal{D}$, a task $\mathcal{T} = \{\mathcal{Y}, P(Y|X)\}$ consists of a label space $\mathcal{Y}$ and a conditional distribution $P(Y|X)$.
11
Transfer Learning Definition. Given a source domain $\mathcal{D}_S$ and source learning task $\mathcal{T}_S$, and a target domain $\mathcal{D}_T$ and target learning task $\mathcal{T}_T$, transfer learning aims to help improve the learning of the target predictive function $f_T(\cdot)$ using the source knowledge, where $\mathcal{D}_S \neq \mathcal{D}_T$ or $\mathcal{T}_S \neq \mathcal{T}_T$.
12
Transfer Definition. Therefore, transfer learning is needed if either the domains differ ($\mathcal{D}_S \neq \mathcal{D}_T$) or the tasks differ ($\mathcal{T}_S \neq \mathcal{T}_T$).
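To make the two cases concrete, the conditions can be unpacked using the notation above; this expansion follows directly from the definitions of a domain and a task (subscripts $S$ and $T$ denote source and target) and is not spelled out on the slide itself:

$$\mathcal{D}_S \neq \mathcal{D}_T \iff \mathcal{X}_S \neq \mathcal{X}_T \ \text{ or } \ P_S(X) \neq P_T(X),$$
$$\mathcal{T}_S \neq \mathcal{T}_T \iff \mathcal{Y}_S \neq \mathcal{Y}_T \ \text{ or } \ P_S(Y \mid X) \neq P_T(Y \mid X).$$

The cancer-data example on the next slides illustrates both cases: the feature spaces differ (a domain difference) and the label spaces differ (a task difference).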
13
Examples: Cancer data. Source domain features: age, smoking. Target domain features: age, height, smoking (the feature spaces differ).
14
Examples: Cancer data. Source task: classify into cancer or no cancer. Target task: classify into cancer level one, cancer level two, or cancer level three (the label spaces differ).
15
Quiz: When doesn't transfer help? (Hint: on what should you condition?)
16
Questions?
17
Questions to answer when transferring: What to transfer? (Instances? A model? Features?) How to transfer? (Weight instances? Map the model? Unify features?) When to transfer? (In which situations?)
18
Algorithms: TrAdaBoost. Assumptions: the source and target tasks share the same feature space ($\mathcal{X}_S = \mathcal{X}_T$), but the marginal distributions differ ($P_S(X) \neq P_T(X)$). Consequence: not all source data might be helpful!
19
Algorithms: TrAdaBoost (Quiz). What to transfer? Instances. How to transfer? Weight instances.
20
Algorithm: TrAdaBoost. Idea: iteratively reweight source samples so as to reduce the effect of "bad" source instances and encourage the effect of "good" source instances. Requires: a labeled source-task data set, a very small labeled target-task data set, an unlabeled target data set, and a base learner.
21
Algorithm: TrAdaBoost. Main steps: (1) weight initialization, (2) hypothesis learning and error calculation, (3) weight update, as shown in the sketch below.
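The slides only name the three steps; what follows is a minimal Python sketch of one possible TrAdaBoost-style loop, assuming binary labels in {0, 1}, a scikit-learn decision stump as the base learner, and the usual error-weighted update in which misclassified source instances are down-weighted and misclassified target instances are up-weighted. Variable names (Xs, ys, Xt, yt, n_rounds) are illustrative, not from the slides.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def tradaboost(Xs, ys, Xt, yt, n_rounds=10):
    """Sketch of TrAdaBoost-style instance weighting (binary labels in {0, 1})."""
    n, m = len(Xs), len(Xt)
    X = np.vstack([Xs, Xt])
    y = np.concatenate([ys, yt])

    # Step 1: weight initialization (uniform over all source + target instances).
    w = np.ones(n + m)
    beta_src = 1.0 / (1.0 + np.sqrt(2.0 * np.log(n) / n_rounds))

    hypotheses, betas = [], []
    for _ in range(n_rounds):
        p = w / w.sum()

        # Step 2: hypothesis learning on the weighted, combined data,
        # and error calculation on the (small) labeled target set only.
        h = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=p)
        miss = np.abs(h.predict(X) - y)              # 1 where misclassified
        eps = np.sum(w[n:] * miss[n:]) / np.sum(w[n:])
        eps = min(max(eps, 1e-10), 0.499)            # keep the update well-defined
        beta_t = eps / (1.0 - eps)

        # Step 3: weight update -- down-weight misclassified source instances,
        # up-weight misclassified target instances (AdaBoost-style).
        w[:n] *= beta_src ** miss[:n]
        w[n:] *= beta_t ** (-miss[n:])

        hypotheses.append(h)
        betas.append(beta_t)

    return hypotheses, betas
```

In the original formulation the final prediction is a weighted vote over the hypotheses from the later boosting rounds; that part is omitted here for brevity.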
22
Questions?
23
Algorithms: Self-Taught Learning. The targeted problem: 1. Little labeled data (e.g., images of cars and motorcycles). 2. A lot of unlabeled data (e.g., natural scenes). Goal: build a model on the labeled data.
24
Algorithms: Self-Taught Learning. Assumptions: the source and target tasks have different feature spaces ($\mathcal{X}_S \neq \mathcal{X}_T$), the marginal distributions differ ($P_S(X) \neq P_T(X)$), and the label spaces differ ($\mathcal{Y}_S \neq \mathcal{Y}_T$).
25
Algorithms: Self-Taught Learning (Quiz). What to transfer? Features. How to transfer? 1. Discover features. 2. Unify features.
26
Algorithms: Self-Taught Learning. Framework: a source unlabeled data set (natural scenes) and a target labeled data set (build a classifier for cars and motorbikes).
27
Algorithms: Self-Taught Learning. Step one: discover high-level features from the source data by sparse coding,
$$\min_{b, a} \sum_{i} \Big\| x_S^{(i)} - \sum_{j} a_j^{(i)} b_j \Big\|_2^2 + \beta \sum_i \big\| a^{(i)} \big\|_1 \quad \text{s.t. } \|b_j\|_2 \le 1 \ \forall j,$$
where the first term is the reconstruction error, the second is the regularization term, and the constraint is on the bases $b_j$.
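As a rough illustration of step one, the sparse-coding / dictionary-learning step can be approximated with scikit-learn; this is a sketch under the assumption that the unlabeled source images have already been flattened into row vectors (X_unlabeled, n_bases, and beta are illustrative names, not from the slides).

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning

# X_unlabeled: (num_patches, num_pixels) array of unlabeled source data,
# e.g. flattened image patches from natural scenes.
rng = np.random.RandomState(0)
X_unlabeled = rng.rand(1000, 64)           # placeholder for real patches

n_bases = 128                              # number of high-level features b_j
beta = 1.0                                 # sparsity / regularization strength

# Step one: learn the bases b_j by minimizing reconstruction error
# plus an L1 penalty on the activations (sparse coding).
dico = MiniBatchDictionaryLearning(n_components=n_bases, alpha=beta, random_state=0)
dico.fit(X_unlabeled)
bases = dico.components_                   # shape: (n_bases, num_pixels)
```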
28
Algorithm: Self-Taught Learning. The unlabeled data set yields a set of high-level features (the learned bases).
29
Algorithm: Self-Taught Learning. Step two: project the target data onto the attained features by solving
$$\hat{a}\big(x_T^{(i)}\big) = \arg\min_{a} \Big\| x_T^{(i)} - \sum_j a_j b_j \Big\|_2^2 + \beta \|a\|_1.$$
Informally, find the activations in the attained bases such that: 1. the reconstruction error is minimized, and 2. the attained activation vector is sparse.
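Continuing the sketch above, step two can be expressed with scikit-learn's SparseCoder, which solves this kind of L1-regularized reconstruction problem for fixed bases (X_target and bases are the illustrative names used earlier).

```python
from sklearn.decomposition import SparseCoder

# X_target: (num_labeled_examples, num_pixels) labeled target data (cars, motorbikes),
# represented in the same raw input space as the unlabeled patches.
X_target = rng.rand(200, 64)               # placeholder for real labeled images

# Step two: compute sparse activations of the target data in the learned bases.
coder = SparseCoder(dictionary=bases,
                    transform_algorithm='lasso_lars',
                    transform_alpha=beta)
A_target = coder.transform(X_target)       # shape: (num_labeled_examples, n_bases)
```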
30
Algorithms: Self-Taught Learning. The target data are now represented in terms of the high-level features.
31
Algorithms: Self-Taught Learning. Step three: learn a classifier with the new features. Summary of the pipeline: learn new features from the source task (step 1), project the target data onto them (step 2), and learn the model for the target task (step 3).
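To close the sketch, step three is plain supervised learning on the new representation; here is one possible final step, assuming y_target holds the car/motorbike labels (an illustrative name) and using a linear SVM as an example classifier, which the slides do not prescribe.

```python
from sklearn.svm import LinearSVC

# y_target: labels for the target examples (e.g. 0 = car, 1 = motorbike).
y_target = rng.randint(0, 2, size=len(X_target))   # placeholder labels

# Step three: train a classifier on the sparse activations instead of raw pixels.
clf = LinearSVC().fit(A_target, y_target)
print(clf.score(A_target, y_target))
```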
32
Conclusions: Transfer learning is the re-use of source knowledge to help a target learner. Transfer learning is not generalization. TrAdaBoost transfers instances. Self-Taught Learning transfers features discovered from unlabeled data.
33