Does altered subcortical emotional salience processing and a 'jumping to conclusions' bias lead to psychotic symptoms in Parkinson’s patients?
Talk · 10:30 AM – 12:00 Noon (UTC), 2020/03/23
Current research does not provide a clear explanation for why some patients with Parkinson’s disease (PD) develop psychotic symptoms. In schizophrenia research, the ‘aberrant salience hypothesis’ of psychosis has been influential in explaining the development of psychotic symptoms, proposing that dopaminergic dysregulation leads to the inappropriate attribution of salience/attention to otherwise irrelevant stimuli, facilitating the formation of hallucinations and delusions. However, this theory has received limited attention in the context of PD psychosis. We investigated salience processing in 14 PD patients with psychotic symptoms, 23 PD patients without psychotic symptoms and 19 healthy controls. All patients received dopaminergic medication. We examined emotional salience using a visual oddball fMRI paradigm that has previously been used to investigate early stages of psychosis. Furthermore, a subgroup of all participants completed a behavioural ‘jumping to conclusions’ task. We found significant differences in brain responses to emotional salience between the patient groups: PD patients with psychotic symptoms showed enhanced brain responses in the striatum, the hippocampus and the amygdala compared to patients without psychotic symptoms. In PD patients with psychotic symptoms, the level of dopaminergic medication correlated significantly with the BOLD signal as well as with psychotic symptom scores. Furthermore, our data provide initial evidence of dysfunctional top-down processes, measured as a ‘jumping to conclusions’ bias. Our study suggests that enhanced signalling in the striatum, hippocampus and amygdala, together with deficient top-down regulation, is associated with the development of psychotic symptoms in PD, similar to the mechanism proposed in the ‘aberrant salience hypothesis’ of psychosis in schizophrenia.
Emotion and cognition influence multisensory integration
We receive information about our environment through multiple sensory modalities. For example, we evaluate the threat of approaching thunder and lightning by integrating auditory and visual information. How cognitive and emotional processes affect this integration is under debate. In five experiments, we assessed multisensory integration using the sound-induced flash illusion (SIFI), in which two auditory beeps presented simultaneously with one visual flash can induce the illusion of two flashes. We used orthogonal tasks to modulate cognitive and emotional processes and examine their influence on the illusion. In the first experiment, we found that increased cognitive load induced by an n-back task enhanced susceptibility to the SIFI. In the second experiment, we replicated this effect while recording EEG; the analysis of neural oscillations indicated that the interaction between cognitive load and perception is reflected in frontal theta and beta band power. In the third experiment, we used a visual cueing paradigm and showed that visual cues presented prior to the SIFI stimuli can enhance the illusion rate, possibly due to increased perceptual load. In the fourth and fifth experiments, we used emotional pictures and sounds to examine the influence of emotional processes on the SIFI. In both experiments, we showed that emotional stimuli presented prior to the onset of the SIFI can reduce the illusion rate, possibly due to increased arousal. Taken together, our experiments highlight the role of cognitive and emotional processes in perception and advance our understanding of the state-dependency of multisensory integration.
Christian Kaernbach, Institut für Psychologie, Christian-Albrechts-Universität zu Kiel
LATER Modelling of Emotional Antisaccades
In the present study, we apply a traditional antisaccade paradigm, in which participants have to inhibit a saccade to a salient white circle, as well as an emotional antisaccade paradigm, in which participants have to avoid a saccade to an emotional facial stimulus. A novelty of the present study is that – in contrast to previous studies using emotional antisaccades to measure inhibition performance – this is the first time the paradigm is tested in healthy subjects only. Data from both paradigms are subsequently subjected to the LATER model (linear approach to threshold with ergodic rate). Based on the LATER model by Noorani & Carpenter (2013), we aim to predict antisaccade performance from subjects’ performance in prosaccade trials. Moreover, performance in the emotional antisaccade paradigm will be predicted from performance in the standard antisaccade paradigm. In contrast to Noorani & Carpenter’s approach, pro- and antisaccades are presented not in a blocked but in an interleaved design. The LATER model has so far only been fitted to the blocked design, which adds further exploratory value to our new approach. We hypothesize that, since antisaccade performance can be predicted from prosaccade performance when subjects are presented with the blocked version of the task, the prediction should also work in an interleaved design as well as across paradigms using different kinds of stimuli.
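For readers unfamiliar with LATER, the model’s core assumption can be sketched in a few lines of code: a decision signal rises linearly from baseline towards a threshold at a rate drawn anew on each trial from a normal distribution, so that latency equals distance-to-threshold divided by rate, and reciprocal latencies are normally distributed. The simulation below is only an illustration of that assumption; the parameter values (mu, sigma, theta) are arbitrary and not fitted to any data from this study.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_later(mu, sigma, theta=1.0, n_trials=10_000, rng=rng):
    """Simulate saccadic latencies under the LATER assumption.

    A decision signal rises linearly at rate r ~ N(mu, sigma) on each
    trial; a saccade is triggered when it covers the distance `theta`
    to threshold, so latency T = theta / r. Units are illustrative.
    """
    r = rng.normal(mu, sigma, n_trials)
    r = r[r > 0]          # discard non-rising trials (no saccade made)
    return theta / r      # latency = distance to threshold / rise rate

# Arbitrary illustrative parameters: mean rise rate 5 threshold-units/s.
latencies = simulate_later(mu=5.0, sigma=1.0)

# The LATER signature: reciprocal latencies are normally distributed,
# centred on the mean rate of rise (the basis of reciprobit plots).
recip = 1.0 / latencies
print(recip.mean())       # close to mu = 5.0
```

Fitting the model to prosaccade trials and then predicting antisaccade performance, as in Noorani & Carpenter (2013), amounts to estimating mu and sigma from observed latency distributions of this form.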
Attention Capture by Gaze and Emotion – First in Time, First in Line?
Direct gaze and emotional expression both constitute powerful social signals that capture attention. Previous theories postulate attention capture by direct gaze in approach-oriented emotions (e.g., angry, happy) and averted-gaze benefits in avoidance-oriented emotions (e.g., fearful, disgusted). However, it remains unclear how humans integrate these social signals and, in particular, whether a temporal benefit of one cue leads to an attention-capture advantage for that social signal. Our studies therefore investigated the temporal modulation of gaze and emotional expression. Participants identified a target letter on one of four faces. We manipulated gaze direction (direct and averted gaze) and emotional expression (neutral, angry, fearful, happy, and disgusted) with (a) between-subject constant emotional expressions (temporal antecedence of the emotion condition; Study I) and (b) within-subject shifts of expression from neutral to emotional (concurrent gaze and emotion conditions; Study II). When emotion had a temporal benefit, direct gaze captured attention, an effect that was further modulated by emotion (Study I). When emotional expressions shifted (Study II), direct gaze seemed to capture attention in approach-oriented expressions, while averted gaze facilitated target identification in avoidance-oriented expressions. Overall, gaze and emotion information in faces can be integrated both sequentially and simultaneously. The integration of gaze and emotion in holistic faces seems to underlie adaptive social interaction in humans.