Symposia: The ontogenetic necessity to extract information from the auditory environment
Conveners
- Valentina Silvestri (Università degli Studi di Milano-Bicocca)
Voices are arguably the most relevant auditory stimuli for human social interactions, conveying linguistic content but also offering several paralinguistic cues about the speaker, including identity, gender, age, and emotional state. Thus, from voices, it is possible to extract both the linguistic content conveying propositional meaning and the prosodic aspects conveying the speaker's...
From the very beginning, infants are aware of their surrounding environment. Since the auditory system is one of the first sensory systems to develop, newborns can already absorb a wealth of information from their auditory environment. They exhibit both behavioral and neurophysiological responses to a variety of external sounds, such as pure tones, sounds at different acoustic frequencies, and...
Learning is the fundamental backbone of social life. In humans, learning is the essential ability to take advantage of what others have already experienced. Here, we aim to evaluate the presence of learning during pregnancy and its maintenance in postnatal life. Exploiting EEG neural entrainment, we aim to evaluate the presence of a brain preference for a learned...
From the first hours of life, human newborns are exposed to a multitude of stimuli and need to learn quickly how to integrate visual and auditory input. Cross-modal integration operates not only at the perceptual level, where low-level features of stimuli such as shape, texture, and temporal occurrence are recognized, but also at a more abstract level. For instance, research has shown...
Compelling evidence shows that audio-tactile multisensory integration (AT-MSI) is modulated by body proximity already at birth. In our view, early movement may represent the developmental context that allows multisensory stimuli to be encoded in a body-centered reference frame, by anchoring auditory and tactile inputs to the body through proprioception. Based on these premises, we addressed whether...
Brain function depends on recognizing patterns to predict future events (Auksztulewicz et al., 2018; Morillon & Schroeder, 2015). Early in auditory processing, the brain detects repetitions, guiding attention to specific intervals in the auditory stream (Auksztulewicz et al., 2018). Speech perception involves the dynamic sampling of acoustic information across different time scales...