

Brain Study Maps How Vision and Sound Converge for Quicker Reactions

A white model brain with multicoloured wires coming out of its sides. Credit: iStock.

Read time: 2 minutes

A new study has provided insight into how the brain integrates visual and auditory signals to inform quick motor responses.


The findings show that while the brain initially processes changes in sight and sound separately, it eventually merges this information to help generate movement more efficiently.

Combined sensory input improves reaction times

The study, published in Nature Human Behaviour, builds on previous work that used electroencephalography (EEG) to observe the brain’s response to sensory changes. 


“We were uniquely positioned to tackle this,” said Simon Kelly, PhD, professor at University College Dublin and the study's senior author. “The more we know about the fundamental brain architecture underlying such elementary behaviors, the better we can interpret differences in the behaviors and signals associated with such tasks in clinical groups and design mechanistically informed diagnostics and treatments.”


This latest research involved participants responding to simultaneous visual and auditory cues, revealing how these distinct sensory inputs are processed in parallel before converging in the motor system.


Participants were asked to observe a dot animation while listening to a series of tones. They were instructed to press a button when they noticed any change – visual, auditory or both. EEG recordings showed that the brain generated separate signals for visual and auditory changes. However, when changes occurred in both modalities, the signals came together in the motor system, enabling faster reaction times.


“We found that the EEG accumulation signal reached very different amplitudes when auditory versus visual targets were detected, indicating that there are distinct auditory and visual accumulators,” Kelly said.


Computational models help explain the process

The researchers used computational models to compare different theories about how these accumulators operate during multisensory decision-making. One model proposed that the brain uses a 'race' strategy, where each sensory signal competes to trigger a response. The other model suggested that signals from the separate senses are combined before reaching the motor system.


Both models fit the data under normal conditions. However, when researchers introduced a slight delay in either the visual or auditory signal, only the integration model successfully predicted participants’ responses. This suggests that during multisensory experiences, the brain may begin by processing each sense individually, but eventually merges the information to guide action.
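The contrast between the two accounts can be illustrated with a toy simulation. The sketch below (hypothetical parameters, not the study's fitted model) builds two noisy evidence accumulators, one visual and one auditory; the race account takes whichever accumulator crosses the bound first, while the co-activation account pools the two traces before applying a single bound. Delaying one modality's onset, mirroring the authors' manipulation, shifts the two models' predicted reaction times differently.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (seconds / arbitrary evidence units) -- not the paper's fits.
DT, DRIFT, NOISE, BOUND, T_MAX = 0.001, 4.0, 1.0, 1.0, 3.0

def trace(onset):
    """Noisy evidence trace for one accumulator whose input starts at `onset`."""
    n = int(T_MAX / DT)
    steps = DRIFT * DT + NOISE * np.sqrt(DT) * rng.standard_normal(n)
    steps[: int(onset / DT)] = 0.0           # no evidence before the change
    return np.cumsum(steps)

def crossing(x):
    """Time at which trace `x` first reaches BOUND (np.inf if it never does)."""
    hits = np.nonzero(x >= BOUND)[0]
    return hits[0] * DT if hits.size else np.inf

def rt_pair(v_onset=0.0, a_onset=0.0):
    """One audiovisual trial: RTs under the race and co-activation accounts."""
    v, a = trace(v_onset), trace(a_onset)
    race = min(crossing(v), crossing(a))     # first accumulator to finish wins
    coact = crossing(v + a)                  # accumulator outputs pool at the motor stage
    return race, coact

n_trials = 1000
rts = np.array([rt_pair() for _ in range(n_trials)])
print("race mean RT:          %.3f s" % rts[:, 0].mean())
print("co-activation mean RT: %.3f s" % rts[:, 1].mean())

# Delaying the auditory input by 100 ms slows both models, but by different
# amounts -- the kind of divergence that let the researchers tell them apart.
rts_delay = np.array([rt_pair(a_onset=0.1) for _ in range(n_trials)])
print("race, delayed audio:          %.3f s" % rts_delay[:, 0].mean())
print("co-activation, delayed audio: %.3f s" % rts_delay[:, 1].mean())
```

In this toy setup co-activation is faster than the race on synchronous trials because the pooled drift is doubled, and the onset delay pushes its reaction times up in a characteristic way; the study's conclusion was that only the integration-style model reproduced participants' behaviour under such delays.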


“The research provides a concrete model of the neural architecture through which multisensory decisions are made,” Kelly said. “It clarifies that distinct decision processes gather information from different modalities, but their outputs converge onto a single motor process where they combine to meet a single criterion for action.”


Reference: Egan JM, Gomez-Ramirez M, Foxe JJ, O’Connell RG, Kelly SP. Distinct audio and visual accumulators co-activate motor preparation for multisensory detection. Nat Hum Behav. 2025. doi: 10.1038/s41562-025-02280-9


This article has been republished from the following materials. Note: material may have been edited for length and content. For further information, please contact the cited source. Our press release publishing policy can be accessed here.


This content includes text that has been generated with the assistance of AI. Technology Networks' AI policy can be found here.