Mr Jorie van Haren (Maastricht University), 7/20/22, 11:00 AM, Poster
To make sense of the intricate, noisy, and often incomplete soundscape of our dynamic world, human listeners continuously use contextual information to form predictions about future states while also adapting to past sensations. While extensive research supports the relevance of both prediction and trial-level adaptation to aid effective neural processing of sounds, differentiating between the...
Nan Qiu, 7/20/22, 11:00 AM, Poster
Visual search performance is facilitated when the singleton distractor occurs at a high-probability location, where the distractor occurred frequently in the past, compared to locations where it rarely occurred. Additionally, some studies found that search becomes slower when the target appears at the location of the preceding distractor (coincident condition). However, the underlying neural...
Dunia Giomo (SISSA), 7/20/22, 11:00 AM, Poster
Series of discrete, highly regular sensorimotor events are often experienced as temporal patterns. Studies on rhythm perception, sensorimotor learning, and predictive coding in the auditory domain have shown that humans learn, are highly sensitive to, and form expectations about the temporal regularities of the environment. A complete understanding of the basic mechanisms supporting these...
Alana Hodson (Carnegie Mellon University), Dr Charles Yunan Wu (Carnegie Mellon University), Prof. Barbara Shinn-Cunningham (Carnegie Mellon University), Lori Holt (Carnegie Mellon University), 7/20/22, 11:00 AM, Poster
Prior research demonstrates that the ‘perceptual weight’ of acoustic input in signaling speech categories shifts rapidly when statistical distributions of speech input deviate from expectations, as when you encounter a foreign accent. What drives this perceptual adaptation is debated. One possibility is that accented or otherwise distorted speech carries enough information to partially...
Xiangbin Teng, 7/20/22, 11:00 AM, Poster
Complex human behaviors involve perceiving continuous stimuli and planning actions at sequential time points, such as in perceiving/producing speech and music. To guide adaptive behavior, the brain needs to internally anticipate a sequence of prospective moments. How does the brain achieve this sequential temporal anticipation without relying on any external timing cues? To answer this...
Christoph Huber-Huber (CCNS, University of Salzburg, Austria), 7/20/22, 11:00 AM, Poster
Our eyes move about three times per second, which divides apparently continuous vision into rather discrete snapshots. These spatiotemporal dynamics inherent in active vision bring about statistical regularities that impact perceptual processing. One example is preview effects, which demonstrate that extrafoveal pre-saccadic information contributes to post-saccadic foveal processing. Preview...