Symposium: The ontogenetic necessity to extract information from the auditory environment

Conveners:
- Valentina Silvestri (Università degli Studi di Milano-Bicocca)
- Valentina Silvestri (Università degli Studi di Milano-Bicocca), 9/23/24, 5:30 PM, Talk in symposium
Voices are arguably the most relevant auditory stimuli for human social interactions, conveying linguistic content while also offering several paralinguistic cues about the speakers, including their identity, gender, age, and emotional state. Thus, from voices it is possible to extract both the linguistic content conveying propositional meaning and the prosodic aspects conveying the speaker's...
- Valentina Silvestri (Università degli Studi di Milano-Bicocca), 9/23/24, 5:30 PM, Overall symposium abstract
From the very beginning, infants are aware of their surrounding environment. Since the auditory system is one of the first sensory systems to develop, newborns can already absorb a wealth of information from their auditory environment. They exhibit both behavioral and neurophysiological responses to a variety of external sounds, such as pure tones, sounds at different acoustic frequencies, and...
- Nicolo Castellani (IMT Lucca), 9/23/24, 5:50 PM, Talk in symposium
Learning is the fundamental backbone of social creatures. In humans, learning is the essential ability to take advantage of what others have already experienced. Here, we aim to evaluate the presence of learning during pregnancy and its maintenance after birth, in postnatal life. Exploiting EEG neural entrainment, we evaluate the presence of a brain preference for a learned...
- Elena Eccher (CIMeC - University of Trento), 9/23/24, 6:10 PM, Talk in symposium
From the first hours of life, human newborns are exposed to a multitude of stimuli and need to learn quickly how to integrate visual and auditory input. Cross-modal integration operates not only at the perceptual level, where low-level features of stimuli such as shape, texture, and temporal occurrence are recognized, but also at a more abstract level. For instance, research has shown...
- Alice Rossi Sebastiano (MANIBUS Lab, Department of Psychology, University of Turin, Italy), 9/23/24, 6:30 PM, Talk in symposium
Compelling evidence shows that audio-tactile multisensory integration (AT-MSI) is modulated by body proximity already at birth. In our view, early movement may represent the developmental context that allows multisensory stimuli to be encoded in a body-centered reference frame, by anchoring auditory and tactile inputs to the body through proprioception. Based on these premises, we addressed whether...
- Silvia Polver, 9/23/24, 6:40 PM, Talk in symposium
Brain functions depend on recognizing patterns to predict future events (Auksztulewicz et al., 2018; Morillon & Schroeder, 2015). Early in auditory processing, the brain detects repetitions, guiding attention to specific intervals in the auditory stream (Auksztulewicz et al., 2018). Speech perception involves the dynamic sampling of acoustic information across different time scales...