Autonomously detecting interaction with an affective robot to explore connection to developmental ability
This research employs an expressive robot to elicit affective responses in young children and explore correlations between autonomously detected play, affective response and developmental ability. In this study, we introduce a new affective interface that combines sound, color, movement and context to simulate the expression of emotions. Our approach exploits social contingencies to emphasize the importance of situational cues in the correct interpretation of affective state. We studied a group of young children at various ages and stages of cognitive development to: (1) evaluate the efficacy of using captured motion data to autonomously detect physical patterns of play while interacting with a robot, (2) examine relationships between physical play patterns and observed affective response, and (3) explore associations between developmental ability and play or affective response. This pilot study demonstrates that aggregate patterns of physical interaction with a robot are distinguishable through autonomous data collection. Further, statistical analyses demonstrate that developmental ability may be directly related to how a child interacts with and responds to an affective robot.