The associative property holds for combination of auditory, visual, and tactile signals in multisensory decisions

Activity: Talk or presentation — Presentation


A prominent finding in multisensory research is that reaction times (RTs) to bimodal signals are faster than to their unimodal components, known as the redundant signals effect (RSE). An intriguing explanation of the effect comes from race models, which assume that responses to bimodal signals are triggered by the faster of two parallel decision units, as implemented by a logic OR-gate. This basic model architecture results in statistical facilitation, and the RSE can be predicted from the unisensory RT distributions by probability summation. A natural extension of the bimodal RSE, which tests the explanatory power of the framework, is that RTs to trimodal signals should be faster still. To measure the effect, I presented three unimodal signals (in vision, audition, and touch), all bimodal combinations, and a trimodal condition. The corresponding model extension simply assumes that responses are triggered by the fastest of three parallel decision units. Following the associative property in mathematics, an interesting proposition is that probability summation with any bimodal RT distribution and the missing unimodal RT distribution should equally predict the trimodal RT distribution. Furthermore, the expected RSE can in fact be computed for any combination of uni- and bimodal conditions, which results in a total of seven parameter-free predictions. Remarkably, the empirical effects follow these predictions very well overall. Hence, the associative property holds. Race models are consequently a strong and consistent candidate framework to explain the RSE, and they provide a powerful tool to investigate and understand perceptual decisions with multisensory signals.
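The race-model logic and the associative property can be illustrated numerically. The following is a minimal sketch, not the author's actual analysis: it assumes stochastically independent channels with hypothetical shifted-exponential RT distributions, computes the trimodal prediction by probability summation, F_VAT(t) = 1 − (1 − F_V(t))(1 − F_A(t))(1 − F_T(t)), and checks that combining an empirical bimodal CDF with the missing unimodal CDF yields the same trimodal prediction up to sampling error.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical unisensory RTs (ms): shifted-exponential decision times.
V = 200 + rng.exponential(80, n)   # visual
A = 180 + rng.exponential(70, n)   # auditory
T = 220 + rng.exponential(90, n)   # tactile

def cdf(samples, t):
    """Empirical CDF F(t) = P(RT <= t), evaluated at each point in t."""
    return np.mean(samples[:, None] <= t, axis=0)

t = np.linspace(150, 600, 10)

# Race model (OR-gate): the trimodal RT is the minimum of three racers.
F_tri = cdf(np.minimum(np.minimum(V, A), T), t)

# Probability summation from the three unisensory CDFs (independence assumed).
F_ps3 = 1 - (1 - cdf(V, t)) * (1 - cdf(A, t)) * (1 - cdf(T, t))

# Associative property: probability summation of the empirical bimodal
# VA distribution with the missing unimodal T distribution should give
# the same trimodal prediction.
F_VA = cdf(np.minimum(V, A), t)
F_assoc = 1 - (1 - F_VA) * (1 - cdf(T, t))

print("max |F_ps3  - F_tri|:", np.max(np.abs(F_ps3 - F_tri)))
print("max |F_assoc - F_tri|:", np.max(np.abs(F_assoc - F_tri)))
```

With independent samples, both deviations shrink toward zero as n grows; algebraically, grouping V with A first or A with T first yields the identical product formula, which is exactly the associativity exploited in the abstract.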
Period: Jul 2020
Event title: 53rd Annual Meeting of the Society for Mathematical Psychology
Event type: Conference