Studying the Effects of Stereo, Head Tracking, and Field of Regard on a Small-Scale Spatial Judgment Task


Journal Article

Spatial judgments are important for many real-world tasks in engineering and scientific visualization. While existing research provides evidence that higher levels of display and interaction fidelity in virtual reality systems offer advantages for spatial understanding, few investigations have focused on small-scale spatial judgments or employed experimental tasks similar to those used in real-world applications. Following an earlier study that broadly analyzed various spatial understanding tasks, we present the results of a follow-up study focusing on small-scale spatial judgments. In this research, we independently controlled field of regard, stereoscopy, and head-tracked rendering to study their effects on performance of a task involving precise spatial inspections of complex 3D structures. Measuring time and errors, we asked participants to distinguish between structural gaps and intersections between components of 3D models designed to resemble real underground cave systems. The overall results suggest that the addition of higher-fidelity system features supports performance improvements in making small-scale spatial judgments. Through analyses of the effects of individual system components, the experiment shows that participants made significantly fewer errors with either an increased field of regard or the addition of head-tracked rendering. The results also indicate that participants performed significantly faster when the system provided the combination of stereo and head-tracked rendering.

Cited Authors

  • Ragan, ED; Kopper, R; Schuchardt, P; Bowman, DA

Published Date

  • May 2013

Published In

Volume / Issue

  • 19 / 5

Start / End Page

  • 886 - 896

PubMed ID

  • 22868674


Electronic International Standard Serial Number (EISSN)

  • 1941-0506

International Standard Serial Number (ISSN)

  • 1077-2626

Digital Object Identifier (DOI)

  • 10.1109/tvcg.2012.163


Language

  • eng