1. Multiview Transfer Learning and Multitask Learning
- Authors
- Ziang Dong, Lidan Wu, Shiliang Sun, and Liang Mao
- Subjects
Iterative method, Computer science, Image Processing and Computer Vision, Multi-task learning, Machine learning, Linear discriminant analysis, View consistency, Margin (machine learning), AdaBoost, Artificial intelligence, Transfer learning, Cluster analysis
- Abstract
Transfer learning transfers learned knowledge from source domains to target domains, where the target domains have fewer training data. Multitask learning learns multiple tasks simultaneously and makes use of the relationships among these tasks. Both of these learning paradigms can be combined with multiview learning, which exploits the consistency information among diverse views. In this chapter, we introduce four multiview transfer learning methods and three multiview multitask learning methods. We review research on multiview transfer learning under the large-margin framework, discuss multiview discriminant transfer learning in detail, and show how to adapt AdaBoost to multiview transfer learning. The three multiview multitask learning methods concentrate on the structures shared between tasks and views. The most natural approach represents the task-view relationships with a bipartite graph and uses an iterative algorithm to optimize the resulting objective function. Another method constructs an additional regularization function to enforce view consistency. A third, the convex shared structure learning algorithm, provides structure parameters through which information is shared. As supplements, we briefly mention other methods, including multi-transfer, multitask multiview discriminant analysis, and multitask multiview clustering.
- Published
- 2019