Integration of wearable and ambient sensors towards characterization of physical effort
Human performance monitoring in complex operational environments calls for sensing solutions that capture both human physiology and human interactions with the surrounding environment. Recent advances in multimodal sensing have enabled intelligent environments that analyze human activities with high granularity. One of the greatest challenges is unifying multiple discrete sensing systems through synchronization and integration of their multimodal data streams. This paper describes an intelligent environment that consolidates wearable skin-strain sensors for physiological monitoring; geophones and microphones that record ambient vibrations and sounds; and video cameras that visually observe human activities. We demonstrate proof-of-concept functionality by using the system to differentiate walking effort in human subjects. First, we align the wearable and ambient sensor time-history records. Then, data features are extracted from the three sensor modalities and correlated with walking speed. Finally, a feature-level analysis associates the extracted features with each subject's perceived walking exertion.
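The abstract does not specify how the time-history records are synchronized; a common generic approach is to estimate the relative lag between two streams from a shared event via normalized cross-correlation. The sketch below illustrates this idea only; the function name, signal shapes, and sampling rate are illustrative assumptions, not the paper's method.

```python
import numpy as np

def align_streams(ref, other, fs):
    """Estimate the lag (in seconds) of `other` relative to `ref` via
    normalized cross-correlation. Assumes both streams are uniformly
    sampled at rate `fs` and observe a common event (e.g., a footstep
    seen by both a wearable sensor and a floor geophone)."""
    ref = (ref - ref.mean()) / ref.std()
    other = (other - other.mean()) / other.std()
    corr = np.correlate(other, ref, mode="full")
    lag_samples = np.argmax(corr) - (len(ref) - 1)
    return lag_samples / fs

# Synthetic example: the same impulse-like event appears in two
# channels, with the second channel delayed by 0.5 s.
rng = np.random.default_rng(0)
fs = 100.0
t = np.arange(0, 10, 1 / fs)
wearable = np.exp(-((t - 3.0) ** 2) / 0.01) + 0.05 * rng.standard_normal(t.size)
ambient = np.exp(-((t - 3.5) ** 2) / 0.01) + 0.05 * rng.standard_normal(t.size)

lag = align_streams(wearable, ambient, fs)  # close to 0.5 s
```

Once the lag is estimated, one stream can be shifted onto the other's timeline before features are extracted, so that events measured by different modalities line up sample-for-sample.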