Multi-task learning for sequential data via iHMMs and the nested Dirichlet process

Published

Journal Article

A new hierarchical nonparametric Bayesian model is proposed for multi-task learning (MTL) with sequential data. Sequential data are typically modeled with a hidden Markov model (HMM), for which one must usually choose an appropriate model structure (number of states) before learning. Here, the sequential data from each task are modeled with an infinite hidden Markov model (iHMM), avoiding the problem of model selection. MTL across the iHMMs is implemented by imposing a nested Dirichlet process (nDP) prior on the base distributions of the iHMMs. The nDP-iHMM MTL method performs task-level clustering and data-level clustering simultaneously, thereby enhancing learning for the individual iHMMs and revealing between-task similarities. Learning and inference for the nDP-iHMM MTL model are based on a Gibbs sampler. The effectiveness of the framework is demonstrated on synthetic data as well as real music data.
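The abstract's key construction is a nested Dirichlet process: tasks are assigned to top-level mixture components, and each component is itself a Dirichlet process draw over atoms, so tasks in the same top-level cluster share an identical distribution. The sketch below illustrates this with truncated stick-breaking; it is a minimal illustrative example, not the authors' implementation, and the function names, truncation levels `K`/`L`, and the choice of a generic `base_sampler` are assumptions made for the sketch.

```python
import numpy as np

def stick_breaking(alpha, K, rng):
    """Truncated stick-breaking weights for a DP with concentration alpha."""
    betas = rng.beta(1.0, alpha, size=K)
    betas[-1] = 1.0  # absorb remaining mass so the K weights sum to 1
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas[:-1])])
    return betas * remaining

def sample_nested_dp(n_tasks, alpha, gamma, base_sampler, K=20, L=20, rng=None):
    """Draw task-level distributions from a truncated nested DP.

    Tasks assigned to the same top-level component share an identical
    distribution over atoms, which is what yields task-level clustering;
    the atoms within each component support data-level clustering.
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    top_weights = stick_breaking(alpha, K, rng)   # mixture over distributions
    # each top-level component is itself a DP draw: weights plus atoms
    comp_weights = [stick_breaking(gamma, L, rng) for _ in range(K)]
    comp_atoms = [base_sampler(L, rng) for _ in range(K)]
    task_cluster = rng.choice(K, size=n_tasks, p=top_weights)
    return task_cluster, comp_weights, comp_atoms
```

In the paper's setting the atoms would parameterize iHMM base distributions rather than the scalar draws used here, and inference would proceed by Gibbs sampling rather than forward sampling.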

Cited Authors

  • Ni, K; Carin, L; Dunson, D

Published Date

  • August 23, 2007

Published In

  • Acm International Conference Proceeding Series

Volume / Issue

  • 227 /

Start / End Page

  • 689 - 696

Digital Object Identifier (DOI)

  • 10.1145/1273496.1273583

Citation Source

  • Scopus