
Recurrence in temporal multisensory processing

Swathi Anil and 2 co-authors
Bernstein Conference 2024
Goethe University, Frankfurt, Germany


Abstract

Animals continuously process information from multiple sensory modalities to assess their environment and guide their behavior. For example, a predator may rely on both visual and auditory cues to track its prey. Over time, various algorithms have been developed to describe multisensory processing, including linear and nonlinear fusion. However, these algorithms typically treat each time step independently and fail to account for temporal dependencies in these signals. To address this limitation, we introduce a novel set of multisensory tasks that systematically incorporate controlled temporal dependencies across streams of sensory signals, making them more naturalistic. Our findings show that traditional multisensory algorithms, which ignore temporal dependencies, perform sub-optimally on these tasks. Conversely, these algorithms approach near-optimal performance when adapted to integrate evidence across both sensory channels and time. Additionally, we demonstrate that recurrent artificial neural networks (RNNs) outperform algorithmic models, underscoring the importance of recurrent connections and temporal dependencies in multisensory processing. This study highlights the benefits of integrating multisensory information across both channels and time and presents innovative and naturalistic tasks for evaluating the significance of these processes in biological systems.
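The abstract contrasts fusion rules that treat each time step independently with rules that also accumulate evidence across both sensory channels and time. The sketch below is a minimal illustration of that contrast, not the tasks or models from the poster: it assumes a fixed binary stimulus, independent Gaussian noise in each channel, and inverse-variance (reliability-weighted) fusion, with all parameter values chosen purely for illustration.

```python
# Minimal illustrative sketch (not the authors' code): per-timestep
# multisensory fusion vs. fusion that also accumulates evidence over time.
# Stimulus structure, noise levels, and trial counts are assumed.
import numpy as np

rng = np.random.default_rng(0)
n_trials, T = 2000, 10          # assumed number of trials and timesteps
sigma_v, sigma_a = 1.5, 2.0     # assumed visual / auditory noise levels

# Ground-truth category (+1 or -1) per trial, constant across time.
s = rng.choice([-1.0, 1.0], size=n_trials)

# Noisy visual and auditory observations, shape (n_trials, T).
x_v = s[:, None] + sigma_v * rng.standard_normal((n_trials, T))
x_a = s[:, None] + sigma_a * rng.standard_normal((n_trials, T))

# Reliability-weighted (inverse-variance) fusion within each timestep.
w_v, w_a = 1.0 / sigma_v**2, 1.0 / sigma_a**2
fused = (w_v * x_v + w_a * x_a) / (w_v + w_a)

# Timestep-independent readout: decide from a single fused sample.
acc_single = np.mean(np.sign(fused[:, -1]) == s)

# Integration across channels *and* time: sum fused evidence over T steps.
acc_temporal = np.mean(np.sign(fused.sum(axis=1)) == s)

print(f"single-timestep fusion accuracy:      {acc_single:.3f}")
print(f"channel + time integration accuracy:  {acc_temporal:.3f}")
```

In this simplified setting, accumulating the fused evidence over timesteps outperforms reading out any single timestep because the effective signal-to-noise ratio grows with the number of samples; the poster's tasks additionally impose controlled temporal dependencies across the sensory streams, which this sketch does not attempt to reproduce.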
