Feature Integration Across Space, Time, and Orientation

Thomas U. Otto*, Haluk Öğmen, Michael H. Herzog

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

23 Citations (Scopus)

Abstract

The perception of a visual target can be strongly influenced by flanking stimuli. In static displays, performance on the target improves when the distance to the flanking elements increases, presumably because feature pooling and integration vanish with distance. Here, we studied feature integration with dynamic stimuli. We show that features of single elements presented within a continuous motion stream are integrated largely independently of spatial distance (and orientation). Hence, space-based models of feature integration cannot be extended to dynamic stimuli. We suggest that feature integration is guided by perceptual grouping operations that maintain the identity of perceptual objects over space and time.

Original language: English
Pages (from-to): 1670-1686
Number of pages: 17
Journal: Journal of Experimental Psychology: Human Perception and Performance
Volume: 35
Issue number: 6
DOIs
Publication status: Published - Dec 2009

Keywords

  • nonretinotopic processing
  • motion grouping
  • metacontrast masking
  • oblique effect
  • contrast polarity
  • primary visual cortex
  • human vision
  • apparent motion
  • spatiotemporal interpolation
  • lateral interactions
  • feature inheritance
  • backward masking
  • macaque monkey
  • vernier acuity
