A Distributed Virtual Reality System for Spatial Updating. Concepts, Implementation, and Experiments

MPI Series in Biological Cybernetics, Vol. 2

Markus von der Heyde

ISBN 978-3-89722-781-1
126 pages, year of publication: 2001
price: 40.50 €
Over the course of evolution, humans as well as other animals learned to navigate through complex environments. Such navigation had two main goals: to find food and to find the way back to shelter. For most moving organisms it is important to know their location in the world and to maintain some internal representation of it. In higher species, multiple sensory systems most likely provide the information needed to solve this task. Consequently, to study human behavior in a complex environment it is important that the experimenter has full control over the stimuli presented to multiple senses. Furthermore, it is crucial to guarantee the following: A) The stimulus, and the information it conveys, has to be precisely controllable; B) The experimental conditions have to be repeatable; and C) The stimulus conditions have to be independent of the individual characteristics of the observer.

Virtual Environments have, to some degree, offered a solution to these demands. Recently, it has become increasingly feasible to conduct psychophysical experiments that stimulate more than one sensory modality at a time. In this thesis, Virtual Reality (VR) technology was used to design multi-sensory experiments that investigate some aspects of the complex multi-modal interactions underlying human behavior.

Contents: The first part of this PhD thesis describes a Virtual Reality laboratory which was built to allow the experimenter to stimulate four senses at the same time: vision, hearing, touch, and the vestibular sense of the inner ear. Special-purpose equipment is controlled by individual computers to guarantee optimal performance of the modality-specific simulations. These computers are connected in a network and operate as a distributed system using asynchronous data communication. The second part of the thesis presents two experiments which investigate the ability of humans to perform spatial updating. These experiments contribute new scientific results to the field and, in addition, serve as a proof of concept for the VR lab. More specifically, the experiments focus on the following main questions: A) Which information do humans use to orient themselves in the environment and to maintain an internal representation of their current location in space? B) Do the different senses code their percepts in a single spatial representation which is used across modalities, or is the representation modality-specific?
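The distributed architecture mentioned above can be illustrated with a minimal sketch (not the thesis implementation; host names, ports, and the message format are assumptions): each modality node periodically broadcasts its latest state over UDP and, on each pass of its own simulation loop, reads whatever packets have arrived without ever blocking on the network.

    import json
    import socket

    # Hypothetical peer nodes, one per sensory modality.
    PEERS = [("visual-host", 5005), ("audio-host", 5006)]
    LISTEN_PORT = 5004

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", LISTEN_PORT))
    sock.setblocking(False)  # the simulation loop never waits for the network

    def publish_state(state: dict) -> None:
        """Send this node's current state to all peers (fire and forget)."""
        packet = json.dumps(state).encode()
        for peer in PEERS:
            sock.sendto(packet, peer)

    def poll_updates() -> list:
        """Drain every packet that arrived since the last call; never block."""
        updates = []
        while True:
            try:
                packet, _addr = sock.recvfrom(4096)
            except BlockingIOError:
                return updates
            updates.append(json.loads(packet))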

Results and Conclusions: The experimental results allow the following conclusions: A) Even without vision or hearing, humans can verbally judge the distance traveled, the peak velocity, and to some degree even the maximum acceleration on relative scales. They can therefore maintain good spatial orientation based on proprioceptive and vestibular signals; B) Learning a sequence of orientation changes with multiple modalities (vision, proprioception, and vestibular input) enables humans to reconstruct their heading changes from memory. In situations with conflicting cues, the modality with the strongest percept had a major influence on the reconstruction, and most of the naive subjects did not notice any conflict between the modalities. Taken together, these findings suggest that a single spatial reference frame is used for spatial memory. One possible model of cue integration is a dynamically weighted sum of all modalities, which is used to form a coherent percept and memory of spatial location and orientation.
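As a rough illustration of such a dynamically weighted sum (a sketch under an assumed inverse-variance weighting scheme, not the specific model proposed in the thesis), each modality's heading estimate could be combined in proportion to its momentary reliability:

    def integrate_heading(estimates: dict, variances: dict) -> float:
        """Combine per-modality heading estimates (degrees) into one percept."""
        # Weight each modality by its reliability (here: inverse variance).
        weights = {m: 1.0 / variances[m] for m in estimates}
        total = sum(weights.values())
        return sum(weights[m] * estimates[m] for m in estimates) / total

    # Example: vision, proprioception, and vestibular input disagree slightly;
    # the more reliable (lower-variance) cue pulls the combined percept toward it.
    heading = integrate_heading(
        estimates={"vision": 92.0, "proprioception": 85.0, "vestibular": 88.0},
        variances={"vision": 4.0, "proprioception": 25.0, "vestibular": 16.0},
    )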

Keywords:
  • Virtual Reality
  • Distributed Systems
  • Spatial Cognition
  • Sensor Fusion
  • Psychophysics
