Description
The brain combines information from multiple sensory modalities to build a consistent representation of the world. The principles by which multimodal stimuli are integrated in cortical hierarchies are well studied, but it is less clear whether and how unimodal inputs systematically shape the processing of signals carried by a different modality. In rodents, for instance, direct connections from primary auditory cortex reach visual cortex, but studies disagree on the impact of these projections on visual cortical processing: both enhancement and suppression of visually evoked responses by auditory inputs have been reported, as well as sharpening of orientation tuning and improvement in the coding of visual information. Little is known, however, about the functional impact of auditory signals on rodent visual perception. Here we trained a group of rats in a visual temporal frequency (TF) classification task in which the visual stimuli to be categorized were paired with simultaneous but task-irrelevant auditory stimuli, so as to prevent high-level multisensory integration and instead probe the spontaneous, direct impact of auditory signals on the perception of visual stimuli. The rats' classification of visual TF was systematically altered by the presence of sounds, in a way determined by sound intensity but not by its temporal modulation. A Bayesian ideal observer model showed that this phenomenon is consistent with an effective compression of the visual perceptual space induced by the auditory inputs, suggesting that inhibition is the key mediator of auditory-visual interactions at the level of neural representations.
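As an illustration of the modeling idea, the toy simulation below sketches how a compression of the perceptual space can alter classification in a two-category TF task. All specifics here are assumptions for illustration, not the authors' actual model: a hypothetical category boundary at 4 Hz, Gaussian sensory noise, and sound modeled as a gain factor that pulls percepts toward the boundary. With a flat prior and symmetric Gaussian noise, the Bayesian ideal observer's rule reduces to reporting "high" when the noisy percept exceeds the boundary.

```python
import numpy as np

rng = np.random.default_rng(0)

BOUNDARY = 4.0   # hypothetical category boundary (Hz)
SIGMA = 1.0      # hypothetical sensory noise std (Hz)

def p_classify_high(tf, gain=1.0, n_trials=10_000):
    """Fraction of trials an ideal observer labels stimulus `tf` as 'high'.

    The percept is a compressed copy of the stimulus (gain < 1 pulls
    percepts toward the boundary, mimicking sound-induced compression
    of the perceptual space) plus Gaussian sensory noise. With a flat
    prior, the Bayes-optimal rule is to report 'high' when the noisy
    percept exceeds the learned boundary.
    """
    percept = BOUNDARY + gain * (tf - BOUNDARY) + rng.normal(0.0, SIGMA, n_trials)
    return np.mean(percept > BOUNDARY)

# Psychometric curves without and with the hypothesized compression
tfs = np.array([1.0, 2.0, 3.0, 5.0, 6.0, 7.0])
no_sound = [p_classify_high(tf, gain=1.0) for tf in tfs]
with_sound = [p_classify_high(tf, gain=0.5) for tf in tfs]
# Compression flattens the psychometric curve: responses to extreme
# TFs move toward chance, altering classification without changing
# the temporal structure of the stimuli themselves.
```

In this sketch the sound's effect enters only through the gain term, which is one simple way to capture a compression of the perceptual axis; the actual model in the work may be parameterized differently.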