Seeing the future: Natural image sequences produce “anticipatory” neuronal activity and bias perceptual report

David Ian Perrett, Dengke Xiao, Nicholas Edward Barraclough, Christian Keysers, Michael William Oram

Research output: Contribution to journal › Article › peer-review

52 Citations (Scopus)

Abstract

This paper relates human perception to the functioning of cells in the temporal cortex that are engaged in high-level pattern processing. We review historical developments concerning (a) the functional organization of cells processing faces and (b) the selectivity for faces in cell responses. We then focus on (c) the comparison of perception and cell responses to images of faces presented in sequences of unrelated images. Specifically, the paper concerns cell function and perception in circumstances where meaningful patterns occur momentarily in the context of a naturally or unnaturally changing visual environment. Experience of visual sequences allows anticipation, yet one sensory stimulus also “masks” perception and neural processing of subsequent stimuli. To understand this paradox, we compared cell responses in monkey temporal cortex to body images presented individually, in pairs, and in action sequences. Responses to one image suppressed responses to similar images for 500 ms. This suppression led to responses peaking 100 ms earlier to image sequences than to isolated images (e.g., during head rotation, face-selective activity peaks before the face confronts the observer). Thus forward masking has unrecognized benefits for perception because it can transform neuronal activity to make it predictive during natural change.
Original language: English
Pages (from-to): 2081-2104
Journal: The Quarterly Journal of Experimental Psychology
Volume: 62
Issue number: 11
Early online date: 23 Sept 2009
Publication status: Published - 2009

