Semisupervised multitask learning.
Journal Article
Context plays an important role when performing classification, and in this paper we examine context from two perspectives. First, the classification of items within a single task is placed within the context of distinct concurrent or previous classification tasks (multiple distinct data collections). This is referred to as multi-task learning (MTL), and is implemented here in a statistical manner, using a simplified form of the Dirichlet process. In addition, when performing many classification tasks one has simultaneous access to all unlabeled data that must be classified, and therefore there is an opportunity to place the classification of any one feature vector within the context of all unlabeled feature vectors; this is referred to as semi-supervised learning. In this paper we integrate MTL and semi-supervised learning into a single framework, thereby exploiting two forms of contextual information. Results are first presented on a "toy" problem to demonstrate the concept, and the algorithm is then applied to three real data sets.
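As a rough illustration only, and not the algorithm developed in the paper, the sketch below shows the two forms of context the abstract describes on a toy problem: labeled data from related tasks are pooled when the tasks look similar (a crude stand-in for the Dirichlet-process-style task sharing), and unlabeled data are folded in through one round of self-training (a crude stand-in for the semi-supervised component). The helper `make_task`, the mean-distance sharing test, and the 0.95 confidence threshold are all hypothetical choices made for this sketch.

```python
# Illustrative sketch, assuming simple stand-ins for the paper's MTL and
# semi-supervised machinery; not the authors' method.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_task(offset, n_labeled=10, n_unlabeled=200):
    # Two Gaussian classes; `offset` shifts the task so the tasks are
    # related but not identical.
    n = n_labeled + n_unlabeled
    X0 = rng.normal(loc=-1.0 + offset, scale=1.0, size=(n, 2))
    X1 = rng.normal(loc=+1.0 + offset, scale=1.0, size=(n, 2))
    X = np.vstack([X0, X1])
    y = np.array([0] * len(X0) + [1] * len(X1))
    idx = rng.permutation(len(X))
    X, y = X[idx], y[idx]
    return X[:n_labeled], y[:n_labeled], X[n_labeled:], y[n_labeled:]

tasks = [make_task(0.0), make_task(0.2)]  # two related tasks

# "Multi-task" context: if the tasks' labeled sets look alike (here judged
# by the distance between their means), let them share one classifier.
means = [Xl.mean(axis=0) for Xl, _, _, _ in tasks]
share = np.linalg.norm(means[0] - means[1]) < 1.0
X_lab = np.vstack([Xl for Xl, _, _, _ in tasks]) if share else tasks[0][0]
y_lab = np.concatenate([yl for _, yl, _, _ in tasks]) if share else tasks[0][1]

# "Semi-supervised" context: one round of self-training that adds
# confidently pseudo-labeled unlabeled points from all tasks.
clf = LogisticRegression().fit(X_lab, y_lab)
X_unl = np.vstack([Xu for _, _, Xu, _ in tasks])
conf = clf.predict_proba(X_unl).max(axis=1) > 0.95
clf = LogisticRegression().fit(
    np.vstack([X_lab, X_unl[conf]]),
    np.concatenate([y_lab, clf.predict(X_unl[conf])]),
)

for t, (_, _, Xu, yu) in enumerate(tasks):
    acc = (clf.predict(Xu) == yu).mean()
    print(f"task {t}: accuracy on held-out unlabeled data = {acc:.3f}")
```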
Cited Authors
- Liu, Q; Liao, X; Carin, HL; Stack, JR; Carin, L
Published Date
- June 2009
Published In
- IEEE Transactions on Pattern Analysis and Machine Intelligence
Volume / Issue
- 31 / 6
Start / End Page
- 1074 - 1086
PubMed ID
- 19372611
Electronic International Standard Serial Number (EISSN)
- 1939-3539
International Standard Serial Number (ISSN)
- 0162-8828
Digital Object Identifier (DOI)
- 10.1109/tpami.2008.296
Language
- eng