Abnormal event detection in unseen scenarios

Journal Article

Event detection in unseen scenarios is a challenging problem due to the high variability of scene type, viewing direction, nature of scene entities, and environmental conditions. Existing event detection approaches mostly rely on context-specific tuning and training. Consequently, these techniques fail to scale to large surveillance networks with hundreds of video feeds, where scenario-specific tuning and training is impossible. In this paper, we present a generic event detection approach in which the extracted low-level features represent the global characteristics of the target scene rather than any context-specific information. From the temporal evolution of these context-invariant features over a timeframe, a fixed number of temporal features are extracted based on the periodicity of significant transition points and their associated temporal orders. Finally, the top-ranked temporal features are used to train binary classifier-based event models. In this approach, supervised training and exhaustive feature extraction are required only once, while building the target event models. During real-time operation in unseen scenarios, event detection is performed with the trained event models by extracting only the required features. The proposed approach has been demonstrated for abnormal event detection in completely unseen public-place scenarios from benchmark datasets, without additional training or tuning. Furthermore, it has also outperformed a recent optical flow-based event detection technique. © 2012 IEEE.
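The pipeline summarized in the abstract (context-invariant global features, transition-based temporal features, and a binary event model) can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's implementation: the feature definitions, the transition threshold, and the threshold-style "classifier" are all assumptions standing in for the paper's actual descriptors and trained binary classifiers.

```python
# Hypothetical sketch of the described pipeline. All names, features, and
# thresholds here are illustrative assumptions, not taken from the paper.

def global_features(frame):
    # Context-invariant global descriptors of one frame (assumed here to be
    # mean intensity and intensity spread; the paper uses different features).
    mean = sum(frame) / len(frame)
    spread = max(frame) - min(frame)
    return [mean, spread]

def transition_features(feature_series, threshold=0.5):
    # Count significant transition points in each feature's temporal
    # evolution over the timeframe (assumed definition: a frame-to-frame
    # change larger than `threshold`).
    n_features = len(feature_series[0])
    counts = [0] * n_features
    for prev, cur in zip(feature_series, feature_series[1:]):
        for i in range(n_features):
            if abs(cur[i] - prev[i]) > threshold:
                counts[i] += 1
    return counts

def detect_abnormal(frames, threshold=0.5, max_transitions=3):
    # Toy binary "event model": flag the clip as abnormal when significant
    # transitions exceed a fixed bound (a stand-in for the paper's trained
    # binary classifier over top-ranked temporal features).
    series = [global_features(f) for f in frames]
    counts = transition_features(series, threshold)
    return max(counts) > max_transitions
```

A slowly varying clip yields few transitions and is labeled normal, while an erratically changing clip trips the bound; in the paper, this decision boundary is learned once during supervised training rather than fixed by hand.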

Cited Authors

  • Haque, M; Murshed, M

Published Date

  • October 4, 2012

Published In

  • Proceedings of the 2012 IEEE International Conference on Multimedia and Expo Workshops, ICMEW 2012

Start / End Page

  • 378 - 383

Digital Object Identifier (DOI)

  • 10.1109/ICMEW.2012.72

Citation Source

  • Scopus