Perspective determines the way you interpret pointing gestures
Talk · 2020/03/23, 04:00 PM - 05:30 PM (UTC)
Though ubiquitous in human communication, pointing is often misunderstood. Here, we examined how the observer's perspective affects pointer-observer misunderstandings. We hypothesize that observers rely on different visual cues when interpreting distal pointing gestures from different viewpoints: they extrapolate the pointing arm when seeing it from the side but use the position of the pointer's index finger in their visual field when assuming viewpoints close to the pointer. Hence, small changes in the observer's head position should have a negligible effect on interpretations in the former case but a relatively large effect in the latter. We tested this hypothesis in a virtual-reality and a real-world experiment, in which participants estimated the location on a screen at which a pointer was pointing. We manipulated the observer's viewpoint, view height, and the pointed-at region. As expected, small changes in the observer's view height affected judgements considerably when standing behind the pointer but not when seeing the pointer from the side. This pattern held in both the real-world and the VR setting. In conclusion, the data suggest that observers base their interpretations on the index finger's position in the visual field when sharing the pointer's perspective but extrapolate the pointing arm when seeing the pointer from the side.
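The arm-extrapolation strategy contrasted above can be illustrated geometrically as a ray-plane intersection: extend the line from the shoulder through the fingertip until it hits the screen. The following is a minimal numpy sketch; the function name, coordinates, and flat-screen assumption are illustrative and not taken from the study's actual model.

```python
import numpy as np

def extrapolate_arm(shoulder, fingertip, screen_point, screen_normal):
    """Intersect the ray from the shoulder through the fingertip with a
    planar screen. Returns the 3-D hit point, or None if the arm axis is
    parallel to the screen or points away from it."""
    shoulder = np.asarray(shoulder, float)
    direction = np.asarray(fingertip, float) - shoulder
    normal = np.asarray(screen_normal, float)
    denom = normal @ direction
    if abs(denom) < 1e-9:
        return None  # arm axis parallel to the screen plane
    t = normal @ (np.asarray(screen_point, float) - shoulder) / denom
    if t < 0:
        return None  # screen lies behind the pointer
    return shoulder + t * direction

# Screen plane at x = 5; arm raised to 1.5 m and tilted slightly upward
hit = extrapolate_arm([0, 1.5, 0], [1, 1.5, 0.1], [5, 0, 0], [1, 0, 0])
```

The finger-position strategy, by contrast, would locate the target via the fingertip's position in the observer's own visual field, which is why small shifts of the observer's head move the inferred target in that case.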
Alpha rhythms in the visual cortex impact what we see and how fast we see it
Alpha-frequency band rhythms (9-13 Hz) are thought to reflect inhibitory control over neural excitability in visual cortical areas. Consequently, previous research has established strong links between visual perception and alpha rhythms. For example, whether we see a visual stimulus or not has been found to depend on the power and phase of occipital alpha rhythms before stimulus onset. Here, I present magnetoencephalography (MEG) evidence suggesting that alpha rhythms impact both the speed and the content of our visual experiences. First, the frequency of alpha rhythms over occipital and inferior temporal cortex increases when task demands emphasize segregation rather than integration of visual inputs over time. These findings link alpha frequency to the temporal resolution of visual perception and show that it can be strategically adjusted to meet task goals. Second, network states between occipital and inferior temporal cortex that communicate at alpha frequencies predispose the future perception of the bi-stable Rubin's face-vase stimulus. These results suggest that pre-established, alpha-timed connectivity pathways bias not only the detection of visual stimuli but also their perceived contents. Overall, this work relates the time profile of alpha rhythms to visual speed and content. It suggests that dynamic visual inputs are processed through an alpha-timed (~100 ms cycle period) rhythmic-temporal architecture in the visual cortex.
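As a concrete illustration of the spectral quantity at issue, here is a minimal numpy sketch that estimates the peak frequency within the alpha band from a single simulated channel. Real MEG analyses use dedicated pipelines; the sampling rate, band limits, and synthetic signal here are assumptions made for illustration only.

```python
import numpy as np

def alpha_peak_frequency(signal, fs, band=(9.0, 13.0)):
    """Frequency (Hz) of maximal spectral power within the alpha band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return freqs[in_band][np.argmax(power[in_band])]

# Synthetic channel: a 10.5 Hz alpha oscillation plus white noise
fs = 250.0
t = np.arange(0, 4.0, 1.0 / fs)
rng = np.random.default_rng(0)
sig = np.sin(2 * np.pi * 10.5 * t) + 0.5 * rng.standard_normal(t.size)
```

A shift of this peak toward higher frequencies, as reported for segregation-demanding tasks, corresponds to a shorter alpha cycle period and hence a finer temporal resolution.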
Entraining visual cortex activity in grapheme-color synaesthesia
Grapheme-color synesthetes have color sensations when viewing specific letters or numbers (inducers). It is known that the visual cortex of synesthetes is highly excitable. Here, we investigate a possible link between increased visual cortex excitability and color misperceptions in synesthetes. Visual cortex oscillations were entrained by rapid serial presentations of inducers and non-inducers, and steady-state visual evoked potentials (SSVEPs) in the EEG were analyzed at the driving frequencies. Inducers produced larger occipital SSVEP amplitudes than non-inducers, originating from neural sources in the calcarine sulcus and middle occipital cortex. Moreover, SSVEP amplitude differences between grapheme conditions predicted the vividness of color sensations. No such effect was found in a sample of non-synesthetic control participants who saw matched sets of graphemes. Thus, in synesthetes, inducers entrained strong brain responses in lower visual cortex. This increased brain activity might be maintained during further processing and increase the likelihood of irregular grapheme-color bindings.
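The SSVEP analysis described above amounts to reading out spectral amplitude at the known stimulation frequency. A minimal numpy sketch, with a synthetic epoch standing in for EEG data; the driving frequency, amplitudes, and added line noise are invented for illustration:

```python
import numpy as np

def ssvep_amplitude(epoch, fs, drive_hz):
    """Amplitude of the spectral component at the stimulation frequency.
    The epoch should span an integer number of stimulation cycles so the
    driving frequency falls exactly on an FFT bin."""
    freqs = np.fft.rfftfreq(len(epoch), d=1.0 / fs)
    amps = 2.0 * np.abs(np.fft.rfft(epoch)) / len(epoch)
    return amps[np.argmin(np.abs(freqs - drive_hz))]

# Synthetic epoch: a 7.5 Hz driven response plus 50 Hz line noise
fs = 500.0
t = np.arange(0, 2.0, 1.0 / fs)
epoch = 3.0 * np.sin(2 * np.pi * 7.5 * t) + 0.2 * np.sin(2 * np.pi * 50.0 * t)
```

Comparing this read-out between inducer and non-inducer streams, per participant, is the kind of amplitude difference the abstract relates to the vividness of color sensations.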
The number of trials shapes the relationship between priming and visibility under interocular suppression
Research on the functional segregation of action- and perception-related processing streams in the visual system often studies preserved priming effects in motor responses, which occur independently of the perceptual discriminability of the prime stimuli. However, experimental design decisions could influence the resulting relationship between priming and visibility. Here, we reanalyzed a recent dataset on priming effects under Continuous Flash Suppression (CFS), a particularly potent variant of interocular suppression (Valuch & Mattler, 2019, Journal of Vision). Participants completed the same number of priming and visibility trials. Priming and visibility functions were computed for each participant from incrementally growing numbers of trials, counted from the start of the experiment (8, 16, 24, 32, 40, 48, or 56 trials per experimental condition). The analysis revealed that priming effects remained stable as more trials were added. A completely different pattern emerged for visibility: the estimated visibility of the prime stimuli increased substantially and monotonically as more trials were added. Using fewer trials to measure visibility than to measure priming effects (not an uncommon choice) could thus underestimate the actual visibility of the primes. We discuss implications for evaluating dissociations between priming and visibility and assess the generalizability of this observation to visual masking techniques beyond interocular suppression.
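The incremental analysis can be sketched schematically: compute the visibility estimate over the first n trials for each step size. The outcomes below are fabricated to illustrate how an estimate can grow with trial count; this is a sketch, not the reanalysis pipeline of Valuch & Mattler (2019).

```python
import numpy as np

def incremental_estimates(correct, steps=(8, 16, 24, 32, 40, 48, 56)):
    """Visibility discrimination accuracy over the first n trials, per step."""
    correct = np.asarray(correct, float)
    return {n: correct[:n].mean() for n in steps if n <= len(correct)}

# Fabricated session in which visibility judgements improve over time,
# e.g. because participants learn to see through the suppression
outcomes = [True] * 4 + [False] * 4 + [True] * 48
estimates = incremental_estimates(outcomes)  # accuracy rises with n
```

With outcomes like these, an analysis restricted to the first 8 trials would report chance-level visibility even though performance over the full session is well above chance.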
Time course and shared neurocognitive mechanisms of mental imagery and visual perception
A growing body of research suggests that seeing something with the mind’s eye—mental imagery—may not be all that different from seeing something literally with one’s eyes, since both engage similar brain areas. Yet, the time course of neurocognitive mechanisms that support imagery is still largely unknown. The current view holds that imagery does not share early mechanisms with perception, but starts directly with high-level, holistic representations. However, evidence of earlier shared mechanisms is difficult to obtain because imagery and perception tasks typically differ in visual input. To control for low-level differences, we tested imagery and perception of objects while manipulating the degree of associated object knowledge. Event-related brain potentials showed that imagery and perception were equally influenced by knowledge already at an early stage, reflected in the P1 component, revealing shared mechanisms during low-level visual processing. Later holistic processing observable in the N1 component was increased for successful compared to incomplete imagery. Further, activity over frontal brain areas suggested that stabilizing mental images demands increased monitoring from higher-level areas. It follows that imagery is not merely perception in reverse, but both are active and constructive processes that share mechanisms even in an early phase.