Dual Transfer Learning
Mingsheng Long¹,², Jianmin Wang², Guiguang Ding², Wei Cheng, Xiang Zhang, and Wei Wang
¹ Department of Computer Science and Technology, ² School of Software, Tsinghua University, Beijing 100084, China
Outline
- Motivation
- The Framework: Dual Transfer Learning
- An Implementation: Joint Nonnegative Matrix Tri-Factorization
- Experiments
- Conclusion
Notations
- Domain: a feature space $\mathcal{X}$ together with a marginal distribution $P(\mathbf{x})$. Two domains are different if their feature spaces or their marginal distributions differ.
- Task: given a feature space $\mathcal{X}$ and a label space $\mathcal{Y}$, learn or estimate $f(\mathbf{x}) = P(y \mid \mathbf{x})$, where $y \in \mathcal{Y}$. Two tasks are different if their label spaces or their conditional distributions differ.
Motivation: Exploring the Marginal Distributions
[Figure: documents from the source domain (comp.os) and the target domain (comp.hardware) expressed over latent factors such as task scheduling, performance, architecture, and power consumption. Domain-specific latent factors cause the discrepancy between domains; shared latent factors represent the commonality between domains.]
Motivation: Exploring the Conditional Distributions
[Figure: in both the source domain (comp.os) and the target domain (comp.hardware), the latent factors map to the same class label: task scheduling → comp, performance → comp, architecture → comp, power consumption → comp. These shared factor-to-class associations (the model parameters) represent the commonality between tasks.]
The Framework: Dual Transfer Learning (DTL)
- Simultaneously learn the marginal distribution and the conditional distribution:
  - Marginal mapping: learning the marginal distribution
  - Conditional mapping: learning the conditional distribution
- Explore the duality between the two distributions for mutual reinforcement: learning one distribution helps to learn the other.
Nonnegative Matrix Tri-Factorization (NMTF): $\mathbf{X} \approx \mathbf{U}\mathbf{S}\mathbf{V}^{\top}$
- $\mathbf{U}$: $k$ feature clusters; these latent factors induce the marginal mapping.
- $\mathbf{V}$: $c$ example classes, representing the categorical information.
- $\mathbf{S}$: associations between the $k$ feature clusters and the $c$ example classes; these model parameters induce the conditional mapping.
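As a concrete reference point, here is a minimal sketch of NMTF with the standard multiplicative updates for the unconstrained squared loss $\|X - USV^{\top}\|^2$; Ding et al.'s formulation adds orthogonality constraints, which this sketch omits, and the function name, ranks, and iteration budget are illustrative assumptions, not the authors' code.

```python
import numpy as np

def nmtf(X, k, c, n_iters=200, eps=1e-9, seed=0):
    """Factorize a nonnegative (m, n) matrix X as U S V^T with
    U: (m, k) feature clusters, S: (k, c) cluster-class associations,
    V: (n, c) example classes."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    U = rng.random((m, k))
    S = rng.random((k, c))
    V = rng.random((n, c))
    for _ in range(n_iters):
        # Multiplicative updates: each factor is scaled by the ratio of the
        # negative to the positive part of its gradient, which keeps all
        # entries nonnegative throughout.
        U *= (X @ V @ S.T) / (U @ S @ V.T @ V @ S.T + eps)
        V *= (X.T @ U @ S) / (V @ S.T @ U.T @ U @ S + eps)
        S *= (U.T @ X @ V) / (U.T @ U @ S @ V.T @ V + eps)
    return U, S, V
```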
An Implementation: Joint NMTF
Marginal mapping: learning the marginal distribution.
[Figure: as on the motivation slide, source (comp.os) and target (comp.hardware) documents share latent factors (task scheduling, performance, architecture, power consumption); domain-specific factors cause the discrepancy between domains, while the shared factors represent the commonality between domains.]
An Implementation: Joint NMTF
Conditional mapping: learning the conditional distribution.
[Figure: the factor-to-class associations (task scheduling → comp, performance → comp, architecture → comp, power consumption → comp) are shared between the source and target domains; these model parameters represent the commonality between tasks.]
An Implementation: Joint NMTF
- Joint Nonnegative Matrix Tri-Factorization: factorize the source and target data matrices together, sharing the common latent factors (marginal mapping) and the model parameters (conditional mapping) across domains.
- Solution to the Joint NMTF optimization problem: alternating update rules for the coupled factors.
- The shared components realize Dual Transfer Learning: refining either mapping reinforces the other.
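Below is a hedged sketch of how such a coupled factorization can be implemented, assuming a squared-loss joint objective of the form $\sum_{\pi \in \{s,t\}} \|X_\pi - [U_0, U_\pi]\,S\,V_\pi^{\top}\|^2$, with the common latent factors $U_0$ and the associations $S$ shared across domains. The paper's exact objective and update rules may differ, and all names here are illustrative.

```python
import numpy as np

def joint_nmtf(Xs, Xt, k0, k, c, n_iters=200, eps=1e-9, seed=0):
    """Jointly factorize source/target matrices Xs, Xt (each m-by-n_d),
    sharing common latent factors U0 and associations S across domains."""
    rng = np.random.default_rng(seed)
    m = Xs.shape[0]
    X = {"s": Xs, "t": Xt}
    U0 = rng.random((m, k0))                              # shared latent factors
    Ud = {d: rng.random((m, k)) for d in ("s", "t")}      # domain-specific factors
    S = rng.random((k0 + k, c))                           # shared model parameters
    V = {d: rng.random((X[d].shape[1], c)) for d in ("s", "t")}
    for _ in range(n_iters):
        num0, den0 = np.zeros_like(U0), np.zeros_like(U0)
        numS, denS = np.zeros_like(S), np.zeros_like(S)
        for d in ("s", "t"):
            U = np.hstack([U0, Ud[d]])                    # full factor matrix [U0, Ud]
            V[d] *= (X[d].T @ U @ S) / (V[d] @ S.T @ U.T @ U @ S + eps)
            num = X[d] @ V[d] @ S.T                       # negative gradient part
            den = U @ S @ V[d].T @ V[d] @ S.T             # positive gradient part
            Ud[d] *= num[:, k0:] / (den[:, k0:] + eps)    # domain-specific update
            num0 += num[:, :k0]                           # accumulate the shared
            den0 += den[:, :k0]                           #   part over both domains
            numS += U.T @ X[d] @ V[d]
            denS += U.T @ U @ S @ V[d].T @ V[d]
        U0 *= num0 / (den0 + eps)                         # shared updates couple
        S *= numS / (denS + eps)                          #   the two factorizations
    return U0, Ud, S, V
```

Because the updates for U0 and S aggregate gradient information from both domains, each factorization constrains the other, which is the mutual-reinforcement effect the DTL framework describes.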
Joint NMTF: Theoretical Analysis
- Derivation: formulate a Lagrange function for the optimization problem and apply the KKT conditions to obtain the multiplicative update rules.
- Convergence: proved via the auxiliary function approach [Ding et al., KDD'06].
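To illustrate the derivation pattern, here is a sketch for the plain tri-factor loss and the factor $U$ only; the paper's exact constraints and Lagrangian may differ.

```latex
% Sketch: KKT-based derivation of a multiplicative update
% (plain NMTF loss, factor U only; illustrative, not the paper's exact form).
\[
  L = \|X - USV^{\top}\|^2 - \operatorname{tr}(\Lambda U^{\top}),
  \qquad
  \frac{\partial L}{\partial U}
    = -2XVS^{\top} + 2USV^{\top}VS^{\top} - \Lambda = 0 .
\]
\[
  \text{Complementary slackness } \Lambda_{ij}U_{ij} = 0
  \;\Longrightarrow\;
  \bigl(-XVS^{\top} + USV^{\top}VS^{\top}\bigr)_{ij}\, U_{ij} = 0
  \;\Longrightarrow\;
  U_{ij} \leftarrow U_{ij}\,
    \frac{(XVS^{\top})_{ij}}{(USV^{\top}VS^{\top})_{ij}} .
\]
```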
Experiments
- Public data sets: 20-Newsgroups and Reuters-21578.
- Each cross-domain data set contains approximately 8,000 documents and 15,000 features.
- Evaluation criterion: classification accuracy on the target-domain documents.
Experiments: Baseline Methods
- Non-transfer methods: NMF, SVM, LR, TSVM
- Transfer learning methods:
  - Co-Clustering based Classification (CoCC) [Dai et al., KDD'07]
  - Matrix Tri-Factorization based Classification (MTrick) [Zhuang et al., SDM'10]
  - Dual Knowledge Transfer (DKT) [Wang et al., SIGIR'11]
Experiments: parameter sensitivity and algorithm convergence [figures omitted]
Conclusion
- We proposed a novel Dual Transfer Learning (DTL) framework that exploits the duality between the marginal distribution and the conditional distribution for mutual reinforcement.
- We implemented a novel Joint NMTF algorithm based on the DTL framework.
- Experimental results validated that DTL outperforms state-of-the-art single transfer learning methods.
Any Questions? Thank you!