Abstract
Almost all animals navigate their environment to find food, shelter, and mates. Spatial cognition of nonhuman primates in large-scale environments is notoriously difficult to study. Field research is ecologically valid, but controlling confounding variables can be difficult. Captive research enables experimental control, but space restrictions can limit generalizability. Virtual reality technology combines the best of both worlds by creating large-scale, controllable environments. We presented six chimpanzees with a semi-naturalistic virtual environment, using a custom touch-screen application. The chimpanzees exhibited signature behaviors reminiscent of real-life navigation: they learned to approach a landmark associated with the presence of fruit, improving efficiency over time; they located this landmark from novel starting locations; and they approached a different landmark when necessary. We conclude that virtual environments can allow for standardized testing with higher ecological validity than traditional tests in captivity, and harbor great potential to contribute to longstanding questions in primate navigation, e.g., the use of landmarks, Euclidean maps, or spatial frames of reference.
Original language | English
---|---
Number of pages | 16
Journal | Science Advances
Volume | 8
Issue number | 25
Publication status | Published - 24 Jun 2022
Fingerprint
Dive into the research topics of 'Chimpanzees (Pan troglodytes) navigate to find hidden fruit in a virtual environment'. Together they form a unique fingerprint.

Projects
Josep Call: Constructing Social Minds: Coordination, Communication and Cultural Transmission
Call, J. (PI)
1/01/15 → 31/12/20
Project: Standard
Datasets
APExplorer_3D: A virtual environment application for the study of primate cognition
Schweller, K. (Creator), Allritz, M. (Contributor), Call, J. (Contributor), McEwen, E. (Contributor), Janmaat, K. (Contributor), Menzel, C. (Contributor) & Dolins, F. (Contributor), Open Science Framework, 2022
DOI: 10.17605/osf.io/sx5pm, http://osf.io/4tjur
Dataset