Neurophysiology of Visual-Motor Learning during a Simulated Marksmanship Task in Immersive Virtual Reality
Immersive virtual reality (VR) systems offer flexible control of an interactive environment, along with precise position and orientation tracking of realistic movements. Immersive VR can also be used in conjunction with neurophysiological monitoring techniques, such as electroencephalography (EEG), to record neural activity as users perform complex tasks. As such, the fusion of VR, kinematic tracking, and EEG offers a powerful testbed for naturalistic neuroscience research. In this study, we combine these elements to investigate the cognitive and neural mechanisms that underlie motor skill learning during a multi-day simulated marksmanship training regimen conducted with 20 participants. On each of 3 days, participants performed 8 blocks of 60 trials in which a simulated clay pigeon was launched from behind a trap house. Participants attempted to shoot the moving target with a firearm game controller, receiving immediate positional feedback and running scores after each shot. Over the 3 days of practice, shot accuracy and precision improved significantly and reaction times became significantly faster. Furthermore, results demonstrate that more negative EEG amplitudes over the visual cortices correlate with better shooting performance, as measured by accuracy, reaction times, and response times, indicating that early visual system plasticity underlies behavioral learning in this task. These findings point towards a naturalistic neuroscience approach that can be used to identify neural markers of marksmanship performance.
Clements, JM; Kopper, R; Zielinski, DJ; Rao, H; Sommer, MA; Kirsch, E; Mainsah, BO; Collins, LM; Appelbaum, LG
25th IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2018 Proceedings