A Survey on Transfer Learning
Sinno Jialin Pan
Department of Computer Science and Engineering, The Hong Kong University of Science and Technology
Joint work with Prof. Qiang Yang

Transfer Learning? (DARPA 05)
Transfer Learning (TL): the ability of a system to recognize and apply knowledge and skills learned in previous tasks to novel tasks (in new domains).
It is motivated by human learning. People can often transfer knowledge learned previously to novel situations:
- Chess -> Checkers
- Mathematics -> Computer Science
- Table Tennis -> Tennis

Outline
- Traditional Machine Learning vs. Transfer Learning
- Why Transfer Learning?
- Settings of Transfer Learning
- Approaches to Transfer Learning
- Negative Transfer
- Conclusion

Traditional ML vs. TL (P. Langley 06)
Traditional ML vs. TL
Learning Process of Traditional ML (diagram)
Learning Process of Transfer Learning (diagram)

Notation
Domain: consists of two components, a feature space X and a marginal probability distribution P(X). In general, if two domains are different, they may have different feature spaces or different marginal distributions.
Task: given a specific domain and a label space Y, for each x in the domain, predict its corresponding label y. In general, if two tasks are different, they may have different label spaces or different conditional distributions P(Y|X).

Notation
For simplicity, we only consider at most two domains and two tasks.
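Since the slide's mathematical symbols did not survive extraction, a compact restatement in the survey's standard notation (our reconstruction, not the slide's own rendering) is:

```latex
\mathcal{D} = \{\mathcal{X},\, P(X)\}, \qquad
\mathcal{T} = \{\mathcal{Y},\, f(\cdot)\}, \quad f(x) \text{ predicts the label } y \text{, akin to } P(y \mid x)
```

So two domains differ if the feature space or the marginal distribution differs, and two tasks differ if the label space or the conditional distribution differs.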
Source domain: D_S, with task T_S in the source domain.
Target domain: D_T, with task T_T in the target domain.

Why Transfer Learning?
- In some domains, labeled data are in short supply.
- In some domains, the calibration effort is very expensive.
- In some domains, the learning process is time-consuming.
How can knowledge learned from related domains be used to help learning in a target domain with only a few labeled data? How can it be used to speed up learning in a target domain? Transfer learning techniques may help!

Settings of Transfer Learning
- Inductive Transfer Learning: labeled data are available in the target domain.
  - Case 1: labeled data are also available in the source domain -> Multi-task Learning (source and target tasks are learned simultaneously).
  - Case 2: no labeled data in the source domain -> Self-taught Learning.
- Transductive Transfer Learning: labeled data are available only in the source domain.
  - Assumption: different domains but a single task -> Domain Adaptation.
  - Assumption: a single domain and a single task -> Sample Selection Bias / Covariate Shift.
- Unsupervised Transfer Learning: no labeled data in either the source or the target domain.

An overview of the various settings of transfer learning.
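The taxonomy above is driven entirely by where labeled data are available. A toy helper makes that decision rule explicit (the function name and return strings are ours, for illustration only):

```python
def tl_setting(labeled_target: bool, labeled_source: bool) -> str:
    """Map label availability to the transfer-learning setting
    from the taxonomy above (illustrative helper)."""
    if labeled_target:
        # Inductive TL: with source labels this is the multi-task case,
        # without them the self-taught case.
        return "inductive (multi-task)" if labeled_source else "inductive (self-taught)"
    if labeled_source:
        return "transductive"  # e.g. domain adaptation, covariate shift
    return "unsupervised"

print(tl_setting(True, True))    # inductive (multi-task)
print(tl_setting(False, True))   # transductive
print(tl_setting(False, False))  # unsupervised
```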
Approaches to Transfer Learning

Outline
- Traditional Machine Learning vs. Transfer Learning
- Why Transfer Learning?
- Settings of Transfer Learning
- Approaches to Transfer Learning
  - Inductive Transfer Learning
  - Transductive Transfer Learning
  - Unsupervised Transfer Learning

Inductive Transfer Learning - Instance-transfer Approaches
Assumption: the source-domain and target-domain data use exactly the same features and labels.
Motivation: although the source-domain data cannot be reused directly, some parts of the data can still be reused by re-weighting.
Main idea: discriminatively adjust the weights of the source-domain data for use in the target domain.
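The re-weighting idea can be sketched with a per-example weighted loss. This is a generic illustration on synthetic data with hand-picked weights, not the Wu-Dietterich or TrAdaBoost procedure from the following slides:

```python
import numpy as np

def weighted_logreg(X, y, w, lr=0.1, steps=500):
    """Logistic regression trained by gradient descent on a
    log-loss where each example i is weighted by w[i]."""
    theta = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ theta))
        grad = X.T @ (w * (p - y)) / w.sum()
        theta -= lr * grad
    return theta

rng = np.random.default_rng(0)
# Source domain: decision boundary slightly shifted from the target's.
Xs = rng.normal(0.5, 1.0, (200, 2)); ys = (Xs[:, 0] + Xs[:, 1] > 1.0).astype(float)
# Target domain: only a few labeled examples.
Xt = rng.normal(0.0, 1.0, (20, 2));  yt = (Xt[:, 0] + Xt[:, 1] > 0.0).astype(float)

X = np.vstack([Xs, Xt]); y = np.concatenate([ys, yt])
# Down-weight source examples relative to the scarce target labels.
w = np.concatenate([np.full(200, 0.2), np.full(20, 1.0)])
theta = weighted_logreg(X, y, w)
pred = ((X @ theta) > 0).astype(float)
print("target-domain accuracy:", (pred[200:] == yt).mean())
```

The weights here are fixed by hand; the methods on the next slides choose them discriminatively (via misclassification costs or boosting).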
Inductive Transfer Learning - Instance-transfer Approaches
Non-standard SVMs [Wu and Dietterich, ICML-04]
- Start from uniform weights; differentiate the cost of misclassifying target data and source data; correct the decision boundary by re-weighting.
- The objective combines a loss function on the target-domain data, a loss function on the source-domain data, and a regularization term.

Inductive Transfer Learning - Instance-transfer Approaches
TrAdaBoost [Dai et al., ICML-07]

Inductive Transfer Learning - Feature-representation-transfer Approaches
Supervised Feature Construction [Argyriou et al., NIPS-06, NIPS-07]
Assumption: if t tasks are related to each other, they may share some common features that can benefit all tasks.
Input: t tasks, each with its own training data.
Output: common features learned across the t tasks, and t models, one per task.
The objective is the average of the empirical error across the t tasks, plus a regularization term that makes the representation sparse, subject to orthogonality constraints.
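The slide's formula did not survive extraction; the published multi-task feature learning formulation (our reconstruction, not the slide's own rendering) matches the three labeled parts above:

```latex
\min_{A,\,U}\;
\underbrace{\sum_{s=1}^{t}\sum_{i=1}^{m} L\!\left(y_{si},\, \langle a_s,\, U^{\top} x_{si}\rangle\right)}_{\text{empirical error across } t \text{ tasks}}
\;+\;
\underbrace{\gamma\,\|A\|_{2,1}^{2}}_{\text{sparsity regularization}}
\qquad \text{s.t. } \underbrace{U^{\top}U = I}_{\text{orthogonality}}
```

The (2,1)-norm on the coefficient matrix A zeroes out entire feature rows, so the surviving features are shared across all t tasks.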
Inductive Transfer Learning - Feature-representation-transfer Approaches
Unsupervised Feature Construction [Raina et al., ICML-07]
Three steps:
1. Apply the sparse coding algorithm [Lee et al., NIPS-07] to learn a higher-level representation from unlabeled data in the source domain.
2. Transform the target data into new representations using the bases learned in the first step.
3. Apply traditional discriminative models to the new representations of the target data with their corresponding labels.

Step 1: Input: source-domain data and a coefficient. Output: new representations of the source-domain data and new bases.
Step 2: Input: target-domain data, the coefficient, and the bases. Output: new representations of the target-domain data.
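A minimal sketch of the three steps, with truncated SVD standing in for sparse coding (an assumption made for brevity; the actual method learns sparse, overcomplete bases) and all data synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)

# Step 1 stand-in: learn bases from plentiful unlabeled source data.
# (Self-taught learning uses sparse coding; SVD is a simple proxy here.)
X_src = rng.normal(size=(500, 30))
_, _, Vt = np.linalg.svd(X_src, full_matrices=False)
bases = Vt[:5]                       # top-5 bases, shape (5, 30)

# Step 2: re-represent the scarce labeled target data in those bases.
X_tgt = rng.normal(size=(40, 30))
y_tgt = (X_tgt @ bases[0] > 0).astype(int)   # toy labels, recoverable from the bases
Z_tgt = X_tgt @ bases.T              # new 5-dimensional representation

# Step 3: any standard classifier on the new representation;
# a nearest-centroid rule keeps the sketch short.
c0, c1 = Z_tgt[y_tgt == 0].mean(0), Z_tgt[y_tgt == 1].mean(0)
pred = (np.linalg.norm(Z_tgt - c1, axis=1)
        < np.linalg.norm(Z_tgt - c0, axis=1)).astype(int)
print("accuracy on the new representation:", (pred == y_tgt).mean())
```

The labels here were constructed to be predictable from the learned bases, which is exactly the assumption self-taught learning bets on: the unlabeled source data share useful structure with the target task.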
Inductive Transfer Learning - Model-transfer Approaches
Regularization-based Method [Evgeniou and Pontil, KDD-04]
Assumption: if t tasks are related to each other, they may share some parameters among the individual models.
Assume w_t = w_0 + v_t is the hyperplane for task t, where w_0 is the common part shared across tasks and v_t is the specific part for the individual task. Encode them into SVMs.
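The slide's equations were lost in extraction; under the standard published formulation (a reconstruction, not the slide's own rendering), the SVM encoding of the shared and task-specific parts is roughly:

```latex
w_t = w_0 + v_t, \qquad
\min_{w_0,\, v_t,\, \xi_{it}}\;
\sum_{t=1}^{T}\sum_{i=1}^{m} \xi_{it}
\;+\; \frac{\lambda_1}{T}\sum_{t=1}^{T}\|v_t\|^{2}
\;+\; \lambda_2\,\|w_0\|^{2}
\quad \text{s.t. } y_{it}\, w_t \cdot x_{it} \ge 1 - \xi_{it},\;\; \xi_{it} \ge 0
```

Penalizing the task-specific deviations v_t pulls all task models toward the common part w_0, which is how knowledge is transferred between tasks.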