Compressive particle filtering for target tracking
This paper presents a novel compressive particle filter (henceforth CPF) for tracking one or more targets in video using a reduced set of observations. It is shown that, by applying compressive sensing ideas within a multiple-particle-filter framework, tracking performance can be preserved while achieving considerable dimensionality reduction and avoiding costly feature extraction procedures. Additionally, the target locations are estimated directly, without the need to reconstruct each image. This is possible because the linear measurements, under certain conditions, preserve crucial observability properties. The paper presents a state-space model and a tracking algorithm that incorporate these ideas. Performance is illustrated on both toy examples and real video, using two different measurement ensembles. © 2009 IEEE.
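To make the idea concrete, the following is a minimal sketch (not the paper's actual model) of a particle filter that works directly on linear compressive measurements of each frame, with no image reconstruction. The measurement matrix, square target template, Gaussian likelihood on the compressed residual, random-walk motion model, and all parameter values are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch of a compressive particle filter for single-target tracking.
# Assumptions (illustrative, not from the paper): a known square target
# template, a random Gaussian measurement ensemble, a Gaussian likelihood on
# the compressed residual, and a random-walk motion model.
import numpy as np

rng = np.random.default_rng(0)

H, W = 32, 32                # frame size, n = H*W pixels
n = H * W
m = 64                       # number of compressive measurements, m << n
Phi = rng.standard_normal((m, n)) / np.sqrt(m)   # measurement ensemble

def render(pos, size=5):
    """Synthetic frame with a bright square target at position `pos` (row, col)."""
    frame = np.zeros((H, W))
    r, c = np.clip(np.round(pos).astype(int), 0, [H - size, W - size])
    frame[r:r + size, c:c + size] = 1.0
    return frame.ravel()

def compress(x):
    """Linear compressive measurement y = Phi @ x; no image reconstruction needed."""
    return Phi @ x

# --- simulate a short sequence with a slowly moving target -----------------
T = 30
true_pos = np.array([5.0, 5.0])
observations = []
for t in range(T):
    true_pos = true_pos + np.array([0.6, 0.5])
    y = compress(render(true_pos)) + 0.01 * rng.standard_normal(m)  # noisy y_t
    observations.append((true_pos.copy(), y))

# --- particle filter operating directly on the compressed measurements -----
N = 200
particles = rng.uniform(0, [H, W], size=(N, 2))   # hypothesised target positions
sigma_motion, sigma_obs = 1.5, 0.2

for t, (truth, y) in enumerate(observations):
    # propagate: random-walk motion model
    particles += sigma_motion * rng.standard_normal(particles.shape)
    particles = np.clip(particles, 0, [H - 1, W - 1])

    # weight: Gaussian likelihood of the compressed residual for each particle
    log_w = np.empty(N)
    for i, p in enumerate(particles):
        residual = y - compress(render(p))
        log_w[i] = -0.5 * np.sum(residual ** 2) / sigma_obs ** 2
    log_w -= log_w.max()                 # stabilise before exponentiating
    weights = np.exp(log_w)
    weights /= weights.sum()

    # point estimate and resampling
    estimate = weights @ particles
    particles = particles[rng.choice(N, size=N, p=weights)]

    print(f"t={t:2d}  true={truth.round(1)}  estimate={estimate.round(1)}")
```

Because the measurements are linear, comparing compressed vectors acts as a proxy for comparing full frames, which is what lets the filter weight particles without ever reconstructing the image; the paper's conditions on the measurement ensemble are what justify this step, and the specifics above are only a toy stand-in.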