Modeling "spatial purport of perceptual experience": egocentric space perception in a semi-realistic 3D virtual environment
eScholarship
Open Access Publications from the University of California

Creative Commons 'BY' version 4.0 license
Abstract

Egocentric space perception is multimodal, closely tied to action and bodily movements, and has an inherent phenomenal dimension. One prominent account, provided by Rick Grush, postulates the posterior parietal cortex as a key neural area. A computational model based on the Kalman filter has been proposed to account for the operation of this brain region, underscoring the importance of bodily skills for perceiving spatial properties. The current study provides a first direct simulation of this model in a semi-realistic 3D virtual environment. The goal of the simulation was to develop an agent with a realistic ability for egocentric space perception based on a neural approximation of the Kalman filter. To achieve this goal, we use machine learning techniques, with a strong focus on unsupervised and reinforcement learning methods. The resulting agent is tested behaviorally on ecologically plausible tasks to evaluate its internal, learned representations. The poster presents simulation results and discusses the model.
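For readers unfamiliar with the underlying machinery, the predict/update cycle of a linear Kalman filter can be sketched as follows. This is a minimal illustration in standard Kalman filter notation, not the poster's actual neural approximation; the 1-D position-tracking example at the end is hypothetical.

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of a linear Kalman filter.
    x: state estimate, P: estimate covariance, z: noisy observation,
    F: state transition, H: observation model, Q/R: process/observation noise."""
    # Predict: propagate the state and its uncertainty forward in time.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update: correct the prediction with the new observation.
    y = z - H @ x_pred                   # innovation (prediction error)
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Hypothetical example: estimate a stationary 1-D egocentric position
# from a sequence of noisy observations.
x, P = np.array([0.0]), np.array([[1.0]])
F = H = np.array([[1.0]])
Q, R = np.array([[1e-4]]), np.array([[0.1]])
for z in [1.1, 0.9, 1.05, 0.95]:
    x, P = kalman_step(x, P, np.array([z]), F, H, Q, R)
```

After a few observations the estimate converges near the true position while the covariance shrinks, capturing the general idea of a prediction continuously corrected by sensory feedback.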
