Person re-identification from gait using an autocorrelation network
We propose a new biometric feature based on autocorrelation, using an end-to-end trained network to capture human gait from different viewpoints. Our method condenses an unbounded image stream into a fixed-size descriptor and capitalizes on the periodic nature of walking by leveraging sequence self-similarity. Autocorrelation is invariant to the phase of the gait cycle, can be computed efficiently online, and is well suited to capturing pose frequencies. We demonstrate empirically that, under identical settings, an autocorrelation network provides a more complete representation of gait than existing methods, yielding improved person re-identification performance.
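To illustrate the general idea (not the paper's exact architecture), the following sketch shows how temporal autocorrelation can condense a variable-length sequence of per-frame feature vectors into a fixed-size, gait-phase-invariant descriptor. The function name `autocorr_descriptor` and the parameter `max_lag` are hypothetical, chosen for this example only:

```python
import numpy as np

def autocorr_descriptor(features, max_lag=16):
    """Condense a (T, d) sequence of per-frame feature vectors into a
    fixed-size descriptor by averaging lagged outer products.

    Averaging over all time offsets makes the descriptor approximately
    invariant to where in the gait cycle the sequence starts, and its
    size depends only on max_lag and d, not on the sequence length T.
    """
    T, d = features.shape
    assert T > max_lag, "sequence must be longer than max_lag"
    desc = np.zeros((max_lag, d, d))
    for tau in range(max_lag):
        # Mean outer product between frames that are tau steps apart.
        a, b = features[: T - tau], features[tau:]
        desc[tau] = a.T @ b / (T - tau)
    return desc.reshape(-1)  # fixed size: max_lag * d * d

# Two walks of different lengths map to descriptors of the same size.
short_walk = np.random.RandomState(0).randn(40, 3)
long_walk = np.random.RandomState(1).randn(200, 3)
d_short = autocorr_descriptor(short_walk, max_lag=4)
d_long = autocorr_descriptor(long_walk, max_lag=4)
print(d_short.shape, d_long.shape)  # both (36,)
```

In an end-to-end system such a pooling step would sit between a per-frame encoder and the re-identification head, so the network can be trained on sequences of arbitrary length while the comparison operates on fixed-size descriptors.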