Empathic behavioral and physiological responses to dynamic stimuli in depression

Psychiatry Research 200 (2012) 294–305

Journal homepage: www.elsevier.com/locate/psychres
Daniel Schneider a,b,*, Christina Regenbogen a,b, Thilo Kellermann a,b, Andreas Finkelmeyer c, Nils Kohn a,b, Birgit Derntl a,d, Frank Schneider a,b,e, Ute Habel a,b

a Department of Psychiatry, Psychotherapy and Psychosomatics, Medical School, RWTH Aachen University, Germany
b JARA Translational Brain Medicine, Jülich, Germany
c Institute of Neuroscience, Newcastle University, Newcastle Upon Tyne, England, UK
d Institute of Clinical, Biological and Differential Psychology, University of Vienna, Austria
e School of Medicine, University of Pennsylvania, Philadelphia

Article history: Received 7 December 2011; received in revised form 26 March 2012; accepted 30 March 2012

Abstract

Major depressive disorder (MDD) is strongly linked to social withdrawal and interpersonal problems, which characterize the disorder and further aggravate symptoms. Investigating the nature of impaired emotional-social functioning as a basis of interpersonal functioning in MDD has been widely restricted to static stimuli and behavioral emotion recognition accuracy. The present study aimed at examining higher order emotional processes, namely empathic responses and their components, emotion recognition accuracy and affective responses, in 28 MDD patients and 28 healthy control participants. The dynamic stimulus material included 96 short video clips depicting actors expressing basic emotions by face, voice prosody, and sentence content. Galvanic skin conductance measurements revealed implicit processes in the multimethod assessment of empathy. Overall, patients displayed lower empathy, emotion accuracy, and affective response rates than controls. Autonomous arousal was higher in patients. A generalized emotion processing deficit is in line with the “emotional context insensitivity” (ECI) theory, which proposes decreased overall responsiveness to emotional stimuli. The dissociation between hypo-reactivity in explicit and hyper-reactivity in implicit measures of emotion processing can be related to the “limbic-cortical dysregulation” model of depression. Our findings support the dissociation of autonomic and subjective emotional responses, which may account for interpersonal as well as emotional deficits in depression. © 2012 Elsevier Ireland Ltd. All rights reserved.

Keywords: Depression; Emotion processing; Dynamic stimulus material; Verbal information; Emotional context insensitivity; Galvanic skin conductance

1. Introduction

Empathy, the ability to recognize, understand, and share the emotions of others, conveys a feeling of social coherence and constitutes a central part of human social interactions, which are essential for our adaptive functioning (Ekman, 1982; Plutchik, 1982; Plutchik and Kellermann, 1989; Ekman, 2003; Lewis et al., 2008). Although a variety of different definitions contribute to the complex and multidimensional construct of empathy (Derntl et al., 2009, 2010), most theoretical models include three core components: emotion recognition, affective response, and perspective taking (Preston and de Waal, 2002; Decety and Jackson, 2004; Leiberg and Anders, 2006; Singer, 2006; Batson, 2009;

* Corresponding author at: Department of Psychiatry, Psychotherapy, and Psychosomatics, Medical School, RWTH Aachen University, Pauwelsstrasse 30, 52074 Aachen, Germany. Tel.: +49 241 80 89693; fax: +49 241 8082 401. E-mail address: [email protected] (D. Schneider).

0165-1781/$ – see front matter © 2012 Elsevier Ireland Ltd. All rights reserved. http://dx.doi.org/10.1016/j.psychres.2012.03.054

Singer and Lamm, 2009), while implying the distinction between “self” and “other” (de Vignemont and Singer, 2006). The ability to empathize is suggested to be affected in unipolar major depressive disorder due to its central defining clinical features, depressed mood and anhedonia, which often lead to a limited affective range and subsequent social malfunctioning (Wells et al., 1989; Beck et al., 1996). Empirically, deficits have been characterized by an impaired ability to identify and discriminate emotions conveyed via facial expressions (Leppänen et al., 2004; Leppänen, 2006; Surguladze et al., 2004, 2005; Suslow et al., 2004; Bourke et al., 2010), as well as distorted emotion self-regulation capacities (Beauregard et al., 2006; Diener et al., 2009). These findings are further supported by self-reports following mood induction procedures conveying a blunted affect in depressed patients (Goodwin and Williams, 1982; Nelson and Stern, 1988; Falkenberg et al., 2012). The latter is one facet of emotion reactivity, which we see as the response aspect of emotion processing (e.g. subjective, behavioral, physiological, as well as cerebral; Keltner and Gross, 1999).


While these findings yield insights into one component of empathy, emotion recognition, there is also empirical evidence of impaired functioning in the cognitive aspect of empathy, ‘Theory of Mind’ (Cusi et al., 2012). Here, depressed patients have also been shown to be impaired in decoding and reasoning about mental states of others (Lee et al., 2005; Wolkenstein et al., 2011) or in combining contextual information and prior knowledge about an individual or situation to understand behavior (Dziobek et al., 2006; Derntl et al., 2010). However, insights into the very nature of emotional empathy deficits in depression are very sparse (Brüne and Brüne-Cohrs, 2006). This may be due to the fact that the core emotional and cognitive components of empathy have been studied separately and not in their entirety. Further, only a few studies have addressed the processing of emotional verbal (Rubinow and Post, 1992; Elliott et al., 2002; Kan et al., 2004) or prosodic (Emerson et al., 1999) information. However, these communication cues have also been shown to be important signals when conveying emotionality in human interactions (de Gelder and Vroomen, 2000; Stevenson et al., 2011). Several theoretical concepts have been developed to account for the emotional deficits in depression. One line of research suggested emotion-specific abnormalities with a so-called ‘negativity bias’ (for a review see Bourke et al., 2010) accounting for ‘negative potentiation’ and ‘positive attenuation’ phenomena. This explains why depressed patients recognize sad facial expressions significantly better and show a larger attention bias towards them (‘negative potentiation’) (Gur et al., 1992; Matthews and Antes, 1992; Bouhuys et al., 1996; Hale, 1998; Elliot et al., 2002; Gotlib et al., 2004; Leppänen, 2006; Raes et al., 2006) but at the same time show reduced emotional reactions to happy facial expressions (‘positive attenuation’) (Sloan et al., 1997).
Other studies showed that acute patients judged facial expressions more negatively than remitted patients and controls (e.g. Levkovitz et al., 2003). Contrary to this, the ‘emotional context insensitivity’ (ECI) theory suggests a generalized deficit in major depression in recognizing emotions irrespective of their valence (Teasdale, 1983; Feinberg et al., 1986; Rubinow and Post, 1992; Persad and Polivy, 1993; Asthana et al., 1998; Rottenberg et al., 2005). The theory explains why patients become disengaged from the environment and exhibit less attention and reactivity to both positive and negative stimuli (Nesse, 2000; Rottenberg, 2005; Rottenberg and Gross, 2007; Bylsma et al., 2008). The evolutionary idea behind it is that continued action in a situation where action has not been effective (e.g. famine) may waste valuable resources or lead to more negative outcomes when a previously tried strategy to reach a goal is not working. Depression can thus be described as a defensive motivational state that advocates personal and environmental disengagement, resulting in an overall flattened affect (Nesse, 2000). While these theories coexist, controversies on a behavioral as well as a physiological level cannot be accounted for by either of them. For instance, physiological responses towards emotional stimuli have been reported to be blunted (Dawson et al., 1977, 1985; Tsai et al., 2003; Foti et al., 2010), similar (Rottenberg et al., 2005), and even elevated (Guinjoan et al., 1995; Gross and Levenson, 1997; Falkenberg et al., 2012) in depressed patients compared to healthy controls. Furthermore, emotion processing is a multi-faceted construct, comprising automatic or implicit as well as consciously controlled or explicit components.
Here, another dissociation became apparent in reports showing flattened emotional behavior on the one hand and elevated autonomic responses on the other (Brown et al., 1978; Guinjoan et al., 1995; Gross and Levenson, 1997; Suslow et al., 2010; Falkenberg et al., 2012; for a review see Cusi et al., 2012). A theoretical model which may explain these inconsistencies and shed light on this apparent dissociation is the ‘limbic-cortical dysregulation’ model of depression (Mayberg, 1997). Accordingly, depressed patients tend to show hypo-activity in prefrontal regions and hyper-activity in limbic regions (Drevets, 1999, 2001; Mayberg et al., 1999; Siegle et al., 2007). Both of these regions are associated with a broader neural network involved in emotion processing (Henriques and Davidson, 1991; Damasio, 1996; Ochsner et al., 2002; Phan et al., 2002). In particular, the prefrontal cortex plays a role in emotion processing involving emotional decision making, reappraisal, and self-regulation (for a review see Phan et al., 2002). Additionally, neural activity in limbic brain regions has been linked to implicit and autonomous physiological arousal (Fredrikson et al., 1998; LeDoux, 1998; Mayberg et al., 1999; Williams et al., 2001; Siegle et al., 2007; Laine et al., 2009). Consequently, the model might account for a generalized deficit in emotion reactivity on a behavioral level, as described by the ECI, and at the same time explain elevated autonomous arousal.

The present study follows an innovative experimental approach to studying empathy in depression and does justice to the multi-faceted construct of emotion processing while investigating aspects that can shed some light on the theoretical inconsistencies. We measure behavioral empathy and its components (emotion recognition, affective responses) in depressed patients. Apart from our own work (Derntl et al., 2010; Regenbogen et al., 2012a,b), studies that assess the different empathy components simultaneously are lacking. We used a validated set of naturalistic dynamic emotional video clips instead of static material, which offers greater authenticity and naturalness, increases ecological validity, and is potentially more effective in eliciting empathic responses (see also Weyers et al., 2006; Trautmann et al., 2009).
Hence, we were able to examine the combined influence and emotion-specific impact of nonverbal (facial expressions), para-verbal (prosody), and verbal (speech content) information on empathy and its two components. Furthermore, we controlled for the influence of speech comprehension on empathy by including an experimental condition in which the actor spoke in a foreign language. Neutral video clips served as a control. Lastly, we assessed empathic responses not only on a behavioral level but also included galvanic skin conductance (GSC) measurements to reveal both explicit and implicit components of empathy in a multimethod approach. We expected a generalized deficit of emotion processing in depressed patients. Patients were hypothesized to display attenuated behavioral empathy caused by low emotion recognition rates and a coexisting decrease in affective responses matching the target emotion, the two components of empathy. In line with the ‘limbic-cortical dysregulation’ model and preceding evidence from our group (Falkenberg et al., 2012), depressed patients were expected to exhibit a higher number of skin conductance responses during the perception of emotional stimuli as compared to healthy controls, which might represent a postulated physiological effect of increased limbic activation. Further, video clips with accessible verbal information were expected to result in higher behavioral empathy than those with incomprehensible speech, irrespective of participant group. This was expected against the background of multimodal processing facilitating emotion processing and empathy (Seubert et al., 2010; Regenbogen et al., 2012b).

2. Methods

2.1. Participants

Twenty-eight patients with unipolar major depressive disorder (16 males; age mean = 39.64, S.D. = 10.13 years, range 21–54 years) were included. All patients were recruited from in- and out-patient treatment facilities of the Department of



Psychiatry, Psychotherapy, and Psychosomatics, RWTH Aachen University. They were diagnosed by means of the Structured Clinical Interview for DSM-IV Disorders (SCID-I; Wittchen et al., 1997) and excluded in case of any psychiatric comorbidity or acute suicidal tendency. Depressive symptom severity was assessed using the Hamilton Depression Scale (HAMD-17; Hamilton, 1960) (mean = 8.61, S.D. = 3.03, range 4–15) and the Beck Depression Inventory (BDI-II; Beck, Steer & Brown, 1996) (mean = 24.78, S.D. = 15.44, range 4–57). Scales revealed a moderate to severe overall depressive symptomatology. All patients received antidepressant medication (SSRI N = 7; SNRI N = 12; NARI N = 2; TCA N = 2; TeCA N = 5, Table 1). Twenty-eight healthy age- and education-matched control participants (15 males; age mean = 34.68, S.D. = 8.76 years, range 25–52 years; age: t = 1.96, P = 0.06; education: t = 1.58, P = 0.12, independent t-tests between groups) were included. They were recruited via local advertisements and reported no history of psychiatric illness. Exclusion criteria for all participants involved a history of any neurological or cardiovascular illness, head injury, metabolic, endocrinological or cardiac disease, current substance abuse, mental retardation, left-handedness (Oldfield, 1971), or difficulties in identifying and describing emotions according to the 20-item Toronto Alexithymia Scale (TAS-20, Bagby et al., 1986), with scores above the threshold value of 61 leading to exclusion. Prior to testing, participants were informed about the experimental procedure and data acquisition. Informed written consent was obtained in accordance with the requirements of the local ethics committee and the Declaration of Helsinki. All participants were given a monetary compensation of 20 €.

Table 1
Participants' demographic data, questionnaire results and neuropsychological assessment. MDD = patients with major depressive disorder, HC = healthy controls. For the VERT, data of one HC and six MDD patients were not available due to recording problems, which reduces degrees of freedom (d.f.) to 48. SSRIs: selective serotonin reuptake inhibitors; SNRIs: serotonin–norepinephrine reuptake inhibitors; NARI: noradrenaline reuptake inhibitors; TCAs: tricyclic antidepressants; TeCAs: tetracyclic antidepressants. For an explanation of other abbreviations for questionnaires and neuropsychological test batteries see text.

                              MDD (M ± S.D.)   HC (M ± S.D.)    T (54)    P

Demographic data
  Age (years)                 39.64 ± 10.13    34.68 ± 8.76     -1.96     0.055
  Education (years)           12.04 ± 1.07     12.46 ± 0.96      1.58     0.121
  Illness duration (years)     6.82 ± 9.04     –                 –        –

Questionnaires
  SPF (total)                 29.75 ± 6.11     33.82 ± 6.14      2.49     0.016*
    FS                        11.86 ± 3.34     13.21 ± 2.95      1.61     0.113
    PT                        14.32 ± 3.06     15.39 ± 2.81      1.37     0.177
    EC                        15.32 ± 2.61     14.82 ± 2.70     -0.71     0.484
    PD                        11.75 ± 3.15     10.04 ± 2.76     -2.17     0.034*
  E-scale (total)              3.00 ± 0.71      3.07 ± 0.72      0.37     0.715
    CS                         2.60 ± 0.92      2.84 ± 1.04      0.92     0.359
    ES                         2.91 ± 0.79      3.09 ± 0.90      0.78     0.440
    EC                         3.55 ± 0.83      3.34 ± 0.78     -0.97     0.336
    CC                         2.94 ± 0.88      3.01 ± 0.78      0.32     0.750
  [questionnaire block; row labels lost in extraction]
                              55.71 ± 9.45     37.68 ± 7.00     -8.11    <0.001*
                              15.96 ± 4.31     10.64 ± 3.05     -5.34    <0.001*
                              19.89 ± 5.86     10.82 ± 2.70     -7.44    <0.001*
                              19.86 ± 4.30     16.21 ± 4.33     -3.16     0.003*

Neuropsychology
  WST (words total)           29.89 ± 6.41     33.29 ± 3.34      2.48     0.016*
  TMT-A (s)                   26.56 ± 14.74    19.43 ± 6.30     -2.35     0.022*
  TMT-B (s)                   44.47 ± 20.13    40.94 ± 16.52    -0.72     0.476
  RWT (words total)           57.18 ± 12.72    67.61 ± 12.74     3.07     0.003*
  VERT (% correct)            79.23 ± 10.42    80.86 ± 10.77     0.54     0.589

Psychopathology
  HAMD-17                      8.61 ± 3.03     –                 –        –
  BDI-II                      24.78 ± 15.44    –                 –        –

Medication
  SSRIs                       n = 7
  SNRIs                       n = 12
  NARI                        n = 2
  TCAs                        n = 2
  TeCAs                       n = 5

* P < 0.05.

2.2. Stimulus material

The stimuli consisted of 96 short video clips (mean duration = 11.8 s, S.D. = 1) that depicted a male or female actor in close-up, telling self-related short narratives. Stories were standardized according to syntactic set-up (past tense, first-person perspective), word count (mean = 28.26, S.D. = 3.10), and syllable count (mean = 45.75, S.D. = 4.09). They conveyed one of four emotion categories (happiness, sadness, fear, and disgust; 20 video clips each) or no emotion (neutral; 16 video clips). For the analysis of emotion categories, the total set of 96 video clips was applied, while for the analysis of different communication conditions, a subset of 48 video clips was analyzed. Here, stimuli were used in the following conditions: comprehensible-emotional, ‘cE’, comprised the actor telling an emotional story with corresponding emotional facial expression and prosody (16 clips). Incomprehensible-emotional, ‘iE’, was presented in a foreign language incomprehensible to our German native subjects (Polish, Russian, Croatian, Yugoslavian, or Serbian) but contained congruent emotional facial expression and prosody (16 clips). The third condition (comprehensible-neutral, ‘cN’) served as a control and contained a neutral narrative with neutral facial expression and prosody (16 clips).

2.3. Data acquisition

Participants were tested in a sound-proof room while comfortably seated in front of a notebook (IBM ThinkPad; 15.6-in. screen) equipped with headphones (HPM 1000, Behringer). Two flat reversible silver/silver chloride (Ag–AgCl) electrodes of 10 mm diameter were attached to their non-dominant left hand to measure galvanic skin conductance (GSC).
A Varioport 3-channel bioamplifier (Becker MEDITEC, Germany) served as the recording tool for the GSC data (16 Hz sampling rate), which was analyzed to extract phasic components from tonic activity based on continuous decomposition analysis (Benedek and Kaernbach, 2010) as implemented in the Ledalab software (Leipzig, Germany). Synchronization with the behavioral measures was done using Presentation software.

2.4. Procedure

During the preparation phase (approx. 5–10 min prior to the experiment), electrodes were attached to the participant's left hand, palms-up to avoid motion, in order to enable the tonic GSC component to reach baseline level. Three demonstration trials preceded the experiment's start. All participants were instructed to engage in the presented emotional communication situation as vividly as possible by regarding the presented actor as a personally familiar person, and to spontaneously give four subjective ratings after each video clip. The first two ratings were to indicate the emotion category of the experienced emotion (‘happy’, ‘sad’, ‘fear’, ‘disgust’, or ‘neutral’) as well as the associated emotional intensity (1–6). Subjects were then to rate the emotion and emotional intensity expressed by the actor. A white fixation cross set against a black screen was shown between the trials to allow GSC measures to return to baseline.

2.5. Neurocognitive assessment and empathy questionnaires

All participants underwent neuropsychological testing after the experiment (Table 1). The battery included tests on crystallized verbal intelligence (Wortschatztest: WST, Schmidt and Metzler, 1992), executive functions (Trail Making Test, versions A and B, TMT-A/-B, Reitan, 1992), and word fluency (RWT, Aschenbrenner et al., 2000).
In addition, two empathy questionnaires were introduced: the German version of the Interpersonal Reactivity Index, SPF (IRI, Davis, 1980), measures multiple dimensions of emotional (fantasy, emotional concern, personal distress) and cognitive components of empathy (perspective taking) on a five-point scale. The E-Scale (Leibetseder et al., 2007) assesses a variety of components related to empathic processing (emotional/cognitive sensitivity, emotional/cognitive concern). Also, the Toronto Alexithymia Scale (TAS-20, Bagby et al., 1986) was included for the exploration of deficits in identifying and describing emotions. To test basic emotional face recognition performance in still photographs, the short version of the ‘Vienna Emotion Recognition Task’ (VERT-K) was included.

2.6. Statistical analysis

Within each experimental condition or emotion category, the relative frequencies of correctly recognized emotions (trials in which the recognized emotion matched the target emotion, ‘other-matching-target’ (OMT)) and correct affective responses (trials in which the own experienced emotion matched the target emotion, ‘self-matching-target’ (SMT)) were analyzed. Empathy was defined as trials in which both the own emotion and the other's emotion matched the target emotion (‘self-and-other-matching-target’, SOMT). We further analyzed mean intensity levels of the other's emotion (Int_OMT) as well as of the own emotion (Int_SMT). For the latter, we also analyzed those own intensity levels for which the participant had given an empathic response (Int_SOMT). For the error analysis, the absolute error value for each emotion (e.g. ‘rated happy when target emotion was sad’) was divided by the total number of cases in which no congruence was achieved (SOMT − 1). The latter included cases in which the subject either falsely indicated the actor's emotion, their own emotion, or both. Data were analyzed with separate full factorial generalized linear models (generalized estimating equations, GEEs; IBM SPSS; accounting for non-parametric distribution and/or violation of sphericity), set up for each of the variables OMT, SMT, SOMT, Int_OMT, Int_SMT, and Int_SOMT. Within-subject factors included ‘Communication Condition’ (three levels: cE, cN, iE) or ‘Emotion Category’ (five levels: happy, sad, fearful, disgusted, neutral [the latter excluded for intensity levels]). Group differences between patients and controls were assessed with a between-subject factor (‘Group’). All post-hoc comparisons were Bonferroni corrected. To test differences between patients and controls on normally distributed variables, independent samples t-tests were applied. Pearson's correlation r (one-tailed, Bonferroni-corrected) was calculated to test relationships between explicit and implicit measures, and potential associations with demographic or neuropsychological data. For the analysis of skin conductance data, the number of galvanic skin responses (nGSR) above 0.02 μS was summed during stimulus presentation, which expresses the general arousal level over time. Due to the relatively long stimuli, two time windows were chosen. The first (TW1: 1–3 s) was expected to include immediate responses to stimulus presentation (e.g. orienting response, appeal of the actor). In the second (TW2: 3–13 s), the emotional information developed and reached its apex, which is why TW2 was the focused time window.
Physiological data of 11 patients and 10 controls were excluded, due to technical errors (patients n = 1; controls n = 2) associated with the vulnerability of SC measurements, and due to hypo- or hyper-responsive participants (patients n = 10; controls n = 8) whose stimulus-driven GSRs exceeded three standard deviations from the global mean across emotion categories. Regarding GSR measurements, this reduced the sample size to n = 17 (patients) and n = 18 (controls), respectively.
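The behavioral scoring described in Section 2.6 and the thresholded response count for the skin conductance data can be sketched as follows. This is a minimal illustration rather than the authors' analysis code: the trial-record fields, function names, and peak-list format are hypothetical, while the 0.02 μS amplitude criterion and the time windows (TW1: 1–3 s, TW2: 3–13 s) follow the text.

```python
# Minimal sketch of the OMT/SMT/SOMT scoring and the nGSR count.
# Field names and the peak-list format are illustrative assumptions.

def score_trials(trials):
    """Return OMT, SMT, and SOMT rates (in %) for a list of trial dicts.

    Each trial dict has 'target' (the clip's target emotion),
    'other_rating' (emotion attributed to the actor), and
    'self_rating' (the participant's own experienced emotion).
    """
    n = len(trials)
    omt = sum(t["other_rating"] == t["target"] for t in trials)
    smt = sum(t["self_rating"] == t["target"] for t in trials)
    # Empathic (SOMT) trials: both own and other's emotion match the target.
    somt = sum(t["other_rating"] == t["target"] and
               t["self_rating"] == t["target"] for t in trials)
    return 100 * omt / n, 100 * smt / n, 100 * somt / n


def count_gsr(phasic_peaks, window, threshold=0.02):
    """Count phasic skin conductance responses above `threshold` (in μS)
    whose onset falls inside `window` = (start_s, end_s) relative to
    stimulus onset, e.g. TW2 = (3, 13). `phasic_peaks` is a list of
    (onset_seconds, amplitude) pairs, e.g. as exported from a
    phasic/tonic decomposition of the raw signal.
    """
    start, end = window
    return sum(1 for onset, amp in phasic_peaks
               if start <= onset < end and amp > threshold)
```

For example, a participant who correctly labels the actor's emotion on three of four trials has an OMT rate of 75%; a trial counts toward SOMT only when the own affective state matches the target as well.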

3. Results

3.1. Behavioral differences between patients and controls in different communication conditions

3.1.1. Emotion recognition and corresponding intensity levels

The average emotion recognition (OMT) rate across all communication conditions was 84.67% (S.D. = 14.95%) for patients and 89.66% (S.D. = 12.96%) for controls (Table 2). The GEE analysis revealed significant main effects of ‘Group’ and ‘Communication Condition’ but no significant interaction effect. Pairwise comparisons indicated that the OMT rate was highest in condition ‘cE’ compared to ‘cN’ (t = 2.20, P = 0.018) and ‘iE’ (t = 13.97, P < 0.001). In addition, OMT rates were higher in condition ‘cN’ than in ‘iE’ (t = 7.07, P < 0.001). For the intensity levels of the recognized emotion (Int_OMT), the GEE revealed no significant main effect of ‘Group’ but a significant main effect of ‘Communication Condition’ and a significant interaction effect. Controls (t = 6.46) but not patients (t = 2.38) rated the recognized emotional intensity higher in condition ‘cE’ than in ‘iE’ (P < 0.001 and P = 0.092, respectively). The difference between these two conditions in patients was only significant if no correction was applied (t = 2.38, P < 0.025).

3.1.2. Affective responses and corresponding intensity levels

The GEE for affective response (SMT) rates revealed that across all conditions, the affective responses matched the target emotion in 66.74% (S.D. = 30.27%) of video clips for patients and in 74.7% (S.D. = 25.07%) for controls. The data showed two significant main effects of ‘Group’ and ‘Communication Condition’ but no interaction effect. Similar to the pattern present in OMT rates, patients matched their own affective state less frequently with the target emotion in all experimental conditions compared to controls. Again, SMT rates were significantly higher in condition ‘cE’ than in ‘iE’ (t = 8.84; P < 0.001). However, SMT rates were highest in condition ‘cN’ as compared to ‘cE’ (t = 5.64, P < 0.001) and ‘iE’ (t = 11.03, P < 0.001). The GEE analysis of intensity ratings of the own emotion (Int_SMT) revealed no significant main effects of ‘Group’ or ‘Communication Condition’, nor an interaction effect.

Table 2
Behavioral rating results between groups and in three experimental conditions; behavioral responses comprised emotion recognition accuracy, affective responses, empathy, and the corresponding intensity levels of these variables. Values are M ± S.D.; test statistics are Wald χ² with degrees of freedom (d.f.) and P from the GEE analyses.

Condition                    Total            MDD              HC

Emotion recognition (OMT)
  Total                      87.17 ± 14.17    84.67 ± 14.95    89.66 ± 12.96
  cE                         95.54 ± 7.60     92.86 ± 9.58     98.21 ± 3.34
  iE                         74.22 ± 13.00    71.21 ± 12.42    77.23 ± 13.09
  cN                         91.74 ± 10.93    89.95 ± 12.42    93.53 ± 9.08
  Group: Wald χ² = 12.97, d.f. = 1, P < 0.001***; Condition: Wald χ² = 157.42, d.f. = 2, P < 0.001***; Condition × group: Wald χ² = 5.79, d.f. = 2, P = 0.055

Affective response (SMT)
  Total                      70.72 ± 27.99    66.74 ± 30.27    74.70 ± 25.07
  cE                         70.42 ± 25.82    66.29 ± 28.28    74.55 ± 22.88
  iE                         49.00 ± 25.11    44.20 ± 27.69    53.79 ± 21.67
  cN                         92.75 ± 10.37    89.73 ± 13.20    95.76 ± 5.12
  Group: Wald χ² = 8.15, d.f. = 1, P = 0.004**; Condition: Wald χ² = 165.96, d.f. = 2, P < 0.001***; Condition × group: Wald χ² = 0.63, d.f. = 2, P = 0.731

Empathy (SOMT)
  Total                      68.56 ± 27.47    64.21 ± 29.72    72.92 ± 24.44
  cE                         69.08 ± 25.90    64.06 ± 28.24    74.11 ± 22.74
  iE                         48.21 ± 24.79    43.08 ± 26.81    53.35 ± 21.88
  cN                         88.39 ± 13.56    85.49 ± 16.32    91.29 ± 9.52
  Group: Wald χ² = 6.81, d.f. = 1, P = 0.009**; Condition: Wald χ² = 120.93, d.f. = 2, P < 0.001***; Condition × group: Wald χ² = 0.06, d.f. = 2, P = 0.972

Intensity recognized emotion (Int_OMT)
  Total                      3.98 ± 0.71      3.86 ± 0.69      4.10 ± 0.95
  cE                         4.16 ± 0.75      3.97 ± 0.73      4.35 ± 0.72
  iE                         3.80 ± 0.66      3.75 ± 0.65      3.84 ± 1.15
  Group: Wald χ² = 1.76, d.f. = 1, P = 0.184; Condition: Wald χ² = 37.34, d.f. = 1, P < 0.001***; Condition × group: Wald χ² = 5.84, d.f. = 1, P = 0.016*

Intensity affective response (Int_SMT)
  Total                      3.34 ± 0.84      3.42 ± 0.76      3.26 ± 0.92
  cE                         3.41 ± 0.77      3.46 ± 0.68      3.36 ± 0.86
  iE                         3.27 ± 0.90      3.38 ± 0.84      3.16 ± 0.97
  Group: Wald χ² = 0.62, d.f. = 1, P = 0.433; Condition: Wald χ² = 2.43, d.f. = 1, P = 0.119; Condition × group: Wald χ² = 0.43, d.f. = 1, P = 0.514

Intensity given SOMT (Int_SOMT)
  Total                      3.04 ± 0.98      3.01 ± 1.01      3.08 ± 0.95
  cE                         3.20 ± 1.00      3.08 ± 1.08      3.32 ± 0.91
  iE                         2.88 ± 0.96      2.93 ± 0.93      2.83 ± 0.99
  Group: Wald χ² = 0.113, d.f. = 1, P = 0.736; Condition: Wald χ² = 42.45, d.f. = 1, P < 0.001***; Condition × group: Wald χ² = 0.02, d.f. = 1, P = 0.895

* P < 0.05. ** P < 0.01. *** P < 0.001.


Fig. 1. Rating results. Empathy (SOMT) and its two components, emotion recognition (OMT) and affective response (SMT), in healthy controls (HC, white) and depressed patients (MDD, gray). Differences in rating measures between groups and experimental conditions were tested in separate generalized linear models.

3.1.3. Empathy and corresponding intensity levels

The average SOMT rate across all conditions was 64.21% (S.D. = 29.72%) for patients and 72.92% (S.D. = 24.44%) for controls. The GEE analysis revealed a significant main effect of ‘Group’ and a significant main effect of ‘Communication Condition’ but no significant interaction effect. Compared to controls, patients displayed lower SOMT rates. Again, SOMT rates were higher in condition ‘cE’ compared to ‘iE’ (t = 8.47, P < 0.001) and highest in condition ‘cN’¹ compared to ‘cE’ (t = 4.63) and ‘iE’ (t = 9.48) (both P < 0.001). The GEE analysis of empathic intensity ratings (Int_SOMT) yielded no significant main effect of ‘Group’ but a significant main effect of ‘Communication Condition’. Here, the affective response to video clips which were empathically responded to was rated as more intense in condition ‘cE’ compared to ‘iE’ (t = 5.56; P < 0.001). There was no significant interaction effect. For an overview of rating results concerning communication conditions, see Fig. 1. In sum, empathy was lower in patients compared to controls. Patients showed decreased emotion recognition accuracy and own affective responses. Moreover, controls rated comprehensible emotional video clips as more intense than incomprehensible ones.

¹ Empathy, of course, is an emotional construct, so the high rates for neutral video clips should be regarded with care. High ‘empathy’ rates here were seen as validation of good stimulus quality (easy to recognize and therefore unequivocally responded to with a neutral affective state).

3.2. Behavioral differences between patients and controls regarding emotion categories

3.2.1. Emotion recognition and corresponding intensity levels

Concerning emotion-specific effects, the GEE analysis showed significant main effects of ‘Group’ and ‘Emotion Category’, confirming that controls correctly recognized more emotional clips than patients. There was no significant interaction effect. Pairwise comparisons indicated that happy video clips had higher OMT rates compared to fearful (t = 5.37) and disgusted (t = 4.92) video clips (all Ps < 0.001). Moreover, OMT rates significantly differed between sad and fearful (t = 4.07, P < 0.01) as well as sad and disgusted (t = 3.02, P = 0.035) video clips. Neutral video clips evoked the highest OMT rates as compared to happy (t = 6.51), sad (t = 4.03), fearful (t = 4.07), and disgusted (t = 3.02) video clips (all Ps < 0.001). OMT rates between happy and sad (t = 1.22, P > 0.999) as well as fearful and disgusted (t = 1.12, P > 0.999) video clips, however, did not significantly differ from each other. For the intensity values of the other's emotion (Int_OMT), the GEE analysis showed no significant main effect of ‘Group’ but a significant main effect of ‘Emotion Category’ and a significant interaction effect. Controls gave higher intensity ratings to happy clips than to sad (t = 5.08) and fearful (t = 5.24) clips (all Ps < 0.001). Moreover, disgusted video clips were given higher intensity ratings than sad (t = 3.82, P < 0.01) and fearful (t = 3.82, P < 0.001) videos. There was no difference between happy and disgusted (t = 2.98, P = 0.068) or between sad and fearful clips (t = 1.81, P > 0.999). In contrast, intensity levels did not differ between emotions in patients. The comparison between more

D. Schneider et al. / Psychiatry Research 200 (2012) 294–305

intensely rated happy video clips (compared to sad clips) was only significant if no correction was applied (t ¼2.59, P¼0.015). 3.2.2. Affective response and corresponding intensity levels Regarding the affective response to different emotion categories, the GEE analysis showed a similar response pattern in both groups (Table 3) for SMT rates with a significant main effect of ‘Group’ and ‘Emotion Category’ but no significant interaction effect. Again, controls showed higher affective response rates than patients did. In addition, happy video clips were most often responded to with a congruent affective response than this was the case during the presentation of fearful (t¼4.72, Po0.001) clips. The difference between happy and sad (t¼2.08, P¼0.237) or disgusted clips (t¼2.71, P¼0.057) was not significant. Only neutral video clips elicited higher SMT rates than happy (t¼3.59), sad (t¼4.83), and disgusted (t¼5.92) video clips (Po0.001). The GEE analysis of the affective response intensities (Int_SMT) showed a significant main effect of ‘Emotion Category’ but no significant main effect of ‘Group’ or an interaction effect. More specifically, disgusted video clips were experienced more intensely compared to sad (t ¼4.63, Po0.001) and fearful


(t ¼2.84, P¼0.022) but not to happy (t ¼1.3, P40.999.) video clips. In contrast, there was no difference between happy and sad (t ¼2.37, P ¼0.078), happy and fearful (t ¼1.54, P¼ 0.612), or sad and fearful (t ¼  0.7, P 40.999).
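Many of the pairwise post-hoc P-values above are reported as P > 0.999, the signature of a familywise correction that caps adjusted values at 1. As a rough illustration (the study's exact correction procedure is described in its Methods section, not reproduced here), a Bonferroni-style adjustment over the ten emotion-category pairs can be sketched as follows; the raw p-values are invented for illustration:

```python
# Bonferroni-style adjustment for a family of pairwise post-hoc comparisons:
# multiply each raw p-value by the number of comparisons and cap at 1.
# With 5 emotion categories there are 10 pairwise comparisons.
from itertools import combinations

def bonferroni(p_values: dict) -> dict:
    """Adjust each raw p-value for the size of the comparison family."""
    m = len(p_values)
    return {pair: min(p * m, 1.0) for pair, p in p_values.items()}

# Illustrative (not the study's) raw p-values for all emotion-category pairs
emotions = ["happy", "sad", "fear", "disgust", "neutral"]
raw = dict(zip(combinations(emotions, 2),
               [0.0001, 0.2, 0.0005, 0.001, 0.04,
                0.3, 0.6, 0.02, 0.15, 0.0002]))
adjusted = bonferroni(raw)
print(adjusted[("happy", "sad")])   # adjusted p for happy vs. sad
```

Raw p-values at or above 1/m are pushed to the cap, which is why non-significant contrasts appear uniformly as P > 0.999 in the text.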

3.2.3. Empathy and corresponding intensity levels

Regarding differences in SOMT rates between emotion categories, GEE results indicated significant main effects of 'Group' and 'Emotion Category' but no significant interaction effect. Patients displayed overall lower SOMT rates than controls. Post-hoc comparisons showed that happy video clips yielded higher SOMT rates than fearful (t = 5.18, P < 0.001) and disgusted (t = 3.55, P < 0.01) but not sad (t = 2.42, P = 0.107) video clips. Again, neutral video clips had higher SOMT rates than happy (t = 3.76), sad (t = 5.04), fearful (t = 7.22), and disgusted (t = 6.3) video clips (all Ps < 0.001). In addition, sad video clips induced higher SOMT rates than fearful (t = 4.01, P < 0.001) but not disgusted (t = 0.71, P > 0.999) video clips. Likewise, disgusted video clips evoked higher SOMT rates than fearful video clips (t = 2.99, P = 0.022).
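For reference, the Wald χ² statistics reported throughout this section (and in Table 3) take a simple form for a single GEE coefficient: the squared ratio of the estimate to its robust standard error, referred to a χ² distribution with df = 1 (the df = 4 effects pool several such contrasts into one multivariate test). A minimal sketch with invented numbers, not estimates from the study:

```python
# Wald chi-square for a single coefficient: W = (beta / SE)^2, df = 1.
# beta and se below are hypothetical, not estimates from the study.
def wald_chi2(beta: float, se: float) -> float:
    """Squared z-ratio of a coefficient estimate to its standard error."""
    return (beta / se) ** 2

print(round(wald_chi2(-0.41, 0.10), 2))  # 16.81
```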

Table 3
Behavioral rating results between groups and in different emotion categories; behavioral responses comprised emotion recognition accuracy, affective responses, behavioral empathy, and the corresponding intensity levels of these variables. Values are M ± S.D. (All = whole sample, MDD = patients, HC = healthy controls); GEE, N = 56.

Emotion recognition (OMT)
            All              MDD              HC
Total       81.49 ± 13.52    77.92 ± 14.61    85.06 ± 11.29
Happy       84.20 ± 10.82    81.07 ± 11.81    87.32 ± 8.87
Sad         81.70 ± 12.03    79.29 ± 13.24    84.11 ± 10.37
Fear        73.84 ± 13.42    68.75 ± 13.17    78.93 ± 11.81
Disgust     75.98 ± 12.66    70.54 ± 12.27    81.43 ± 10.70
Neutral     91.74 ± 10.93    89.96 ± 12.42    93.53 ± 9.08
GEE: Group df = 1, Wald χ² = 16.77, P < 0.001***; Emotion df = 4, Wald χ² = 71.73, P < 0.001***; Emotion × Group df = 4, Wald χ² = 2.23, P = 0.694

Affective response (SMT)
            All              MDD              HC
Total       61.10 ± 29.00    56.59 ± 31.01    65.62 ± 26.17
Happy       60.54 ± 25.97    52.14 ± 26.99    68.93 ± 22.33
Sad         54.02 ± 26.81    52.86 ± 29.80    55.18 ± 23.94
Fear        45.09 ± 28.34    41.61 ± 29.78    48.57 ± 26.90
Disgust     53.13 ± 23.77    46.61 ± 27.62    59.64 ± 17.32
Neutral     92.75 ± 10.37    89.73 ± 13.20    95.76 ± 5.12
GEE: Group df = 1, Wald χ² = 6.53, P = 0.011*; Emotion df = 4, Wald χ² = 156.23, P < 0.001***; Emotion × Group df = 4, Wald χ² = 9.18, P = 0.057

Empathy (SOMT)
            All              MDD              HC
Total       56.82 ± 27.93    51.31 ± 29.52    62.33 ± 25.17
Happy       57.05 ± 24.10    48.40 ± 24.83    65.71 ± 20.26
Sad         49.73 ± 25.05    46.43 ± 26.59    53.04 ± 23.43
Fear        40.80 ± 25.76    36.07 ± 26.05    45.54 ± 25.03
Disgust     48.13 ± 22.31    40.18 ± 24.55    56.07 ± 16.74
Neutral     88.39 ± 13.56    85.49 ± 16.32    91.21 ± 9.52
GEE: Group df = 1, Wald χ² = 9.19, P < 0.01**; Emotion df = 4, Wald χ² = 118.81, P < 0.001***; Emotion × Group df = 4, Wald χ² = 6.35, P = 0.175

Intensity of recognized emotion (Int_OMT)
            All              MDD              HC
Total       3.72 ± 0.72      3.63 ± 0.66      3.81 ± 0.76
Happy       3.97 ± 0.75      3.84 ± 0.75      4.11 ± 0.73
Sad         3.60 ± 0.74      3.51 ± 0.74      3.68 ± 0.75
Fear        3.59 ± 0.69      3.60 ± 0.57      3.57 ± 0.79
Disgust     3.72 ± 0.69      3.57 ± 0.56      3.88 ± 0.77
GEE: Group df = 1, Wald χ² = 1.15, P = 0.284; Emotion df = 3, Wald χ² = 28.62, P < 0.001***; Emotion × Group df = 3, Wald χ² = 9.52, P = 0.023*

Intensity of affective response (Int_SMT)
            All              MDD              HC
Total       3.34 ± 0.82      3.44 ± 0.76      3.24 ± 0.86
Happy       3.39 ± 0.83      3.37 ± 0.83      3.40 ± 0.85
Sad         3.21 ± 0.81      3.34 ± 0.71      3.08 ± 0.89
Fear        3.26 ± 0.83      3.44 ± 0.72      3.07 ± 0.89
Disgust     3.51 ± 0.80      3.62 ± 0.76      3.41 ± 0.83
GEE: Group df = 1, Wald χ² = 1.14, P = 0.288; Emotion df = 3, Wald χ² = 28.01, P < 0.001***; Emotion × Group df = 3, Wald χ² = 7.02, P = 0.071

Intensity given SOMT (Int_SOMT)
            All              MDD              HC
Total       2.99 ± 0.99      2.91 ± 1.07      3.07 ± 0.91
Happy       3.11 ± 0.97      2.89 ± 1.02      3.34 ± 0.89
Sad         2.80 ± 0.91      2.81 ± 0.95      2.79 ± 0.89
Fear        2.75 ± 1.00      2.76 ± 1.09      2.73 ± 0.92
Disgust     3.29 ± 1.08      3.18 ± 1.21      3.40 ± 0.94
GEE: Group df = 1, Wald χ² = 0.004, P = 0.947; Emotion df = 3, Wald χ² = 36.97, P < 0.001***; Emotion × Group df = 3, Wald χ² = 8.38, P = 0.039*

* P < 0.05; ** P < 0.01; *** P < 0.001.



Concerning the affective response intensities for empathically responded video clips (Int_SOMT), the GEE analysis showed no significant main effect of 'Group' but a significant main effect of 'Emotion Category' and a significant interaction effect. For controls, happy video clips induced higher intensity ratings than sad (t = 4.3, P < 0.01) and fearful (t = 4.98, P < 0.001) but not disgusted (t = −0.48, P > 0.999) video clips. Moreover, disgusted video clips were experienced as more intense than sad (t = 5.37) and fearful (t = 4.83) video clips (both Ps < 0.001). There was no significant difference in intensity ratings between sad and fearful video clips (t = 0.99, P > 0.999). In contrast, patients' intensity ratings of their own affective response did not differ significantly between emotion categories (happy vs. sad: t = 0.52, P > 0.999; happy vs. fear: t = 0.76, P > 0.999; happy vs. disgust: t = −1.29, P > 0.999; sad vs. fear: t = 0.54, P > 0.999; sad vs. disgust: t = −2.05, P = 0.27; fear vs. disgust: t = −2.45, P = 0.13); disgusted video clips received significantly higher Int_SOMT values than fearful video clips (t = 2.45, P = 0.022) only if no correction was applied. In sum, controls experienced empathically responded happy and disgusted video clips as more intense than sad and fearful ones, and they also generally rated happy and disgusted video clips as more intense than sad and fearful ones. Across all video clips, experimental empathy correlated only weakly with external measures of empathy. In the patient sample, Pearson's r ranged from r = 0.351 (SOMT and SPF total score, P = 0.067) to r = 0.235 (SOMT and E-Scale total score, P = 0.229); in the healthy subgroup, the correlation was slightly lower for SOMT and the SPF total score (r = 0.211, P = 0.282) and slightly higher for SOMT and the E-Scale total score (r = 0.244, P = 0.210). Neither symptom severity nor illness duration correlated significantly with behavioral empathy or its components.
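The empathy measure and the error analysis that follows (Section 3.3) rest on trial-level congruence scoring: OMT marks a recognized emotion that matches the target, SMT a congruent own affective response, and SOMT requires both; all other trials fall into subclass A (both wrong), B (recognition wrong only), or C (affective response wrong only). A minimal sketch of this bookkeeping (trial values are illustrative, not from the study's data):

```python
# Trial-level scoring of empathy and its components, as implied by the
# error analysis: SOMT requires both correct recognition and a congruent
# affective response; the remaining trials split into subclasses A, B, C.

def classify_trial(target: str, recognized: str, felt: str) -> str:
    omt = recognized == target   # correct emotion recognition
    smt = felt == target         # congruent own affective response
    if omt and smt:
        return "SOMT"            # empathic response: both components match
    if not omt and not smt:
        return "A"               # wrong recognition and wrong emotional response
    if not omt:
        return "B"               # wrong recognition only
    return "C"                   # wrong emotional response only

# Illustrative trials: (target, recognized, felt)
trials = [
    ("happy", "happy", "happy"),      # empathic response
    ("sad", "neutral", "neutral"),    # both mismatch the target
    ("fear", "disgust", "fear"),      # recognition wrong, feeling congruent
    ("happy", "happy", "neutral"),    # recognized but not shared
]
counts = {}
for target, recognized, felt in trials:
    label = classify_trial(target, recognized, felt)
    counts[label] = counts.get(label, 0) + 1
print(counts)  # {'SOMT': 1, 'A': 1, 'B': 1, 'C': 1}
```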

3.3. Error analysis

The analysis of participants' erroneous response patterns is based on all cases lacking correct congruence between recognized and felt emotion (SOMT−1). This includes choosing an emotion category other than the target emotion during other-ratings, self-ratings, or both (Table 4). Subclass A comprises all cases in which the SOMT−1 rate was based on both wrong emotion recognition (nOMT) and a wrong emotional response (nSMT). This occurred in 37.29% (S.D. = 17.6%) of cases in patients and 34.64% (S.D. = 17.55%) in controls. For example, when the target emotion was neutral, controls indicated both the recognized emotion and their affective response to be sad in 33.33% of these cases; patients did so in 84% and 80% of cases, respectively. Subclass B comprises all cases in which the SOMT−1 rate was based only on nOMT. Patients misclassified an emotion in 13.87% (S.D. = 15.5%) and controls in 9.8% (S.D. = 12.35%) of these cases. For example, when the target emotion was sad, controls rated the recognized emotion to be fearful in 16.67%, disgusted in 16.67%, and neutral in 66.67% of erroneous cases; patients rated it fearful in 19.44%, disgusted in 2.78%, and neutral in 77.78% of erroneous cases. Subclass C comprises all cases in which the SOMT−1 rate is based solely on nSMT. This occurred in 48.84% (S.D. = 23.45%) of these cases in patients and in 55.56% (S.D. = 20.13%) in controls. For example, when the target emotion was happy, nSMT was based on a neutral affective response in 97.52% of cases for controls and 96.17% for patients. Group differences were not significant in subclasses A, B, and C (t = 0.57, P = 0.565; t = 1.09, P = 0.282; and t = 1.15, P = 0.255, respectively). We refrained from further statistical tests as the percentages in

Table 4
Error analysis, displaying mean percentage values (values in parentheses = MDD patients) of three types of misclassification (A–C). Each confusion matrix shows which emotion participants chose (rows) for each target emotion (columns).

A. Wrong emotion recognition and wrong emotional response — 34.64 ± 17.55% (37.29 ± 17.60%)

Recognized emotion    Happiness         Sadness           Fear              Disgust           Neutral
Happiness             –                 2.60 (0.00)       3.96 (2.10)       15.48 (8.66)      22.22 (0.00)
Sadness               5.66 (14.63)      –                 16.83 (15.38)     29.76 (28.35)     33.33 (84.00)
Fear                  3.77 (3.66)       31.17 (24.05)     –                 19.05 (12.60)     44.44 (16.00)
Disgust               0.00 (2.44)       15.58 (7.59)      21.78 (9.09)      –                 0.00 (0.00)
Neutral               90.57 (79.27)     50.65 (68.35)     57.43 (73.43)     35.71 (–)         –

Affective response    Happiness         Sadness           Fear              Disgust           Neutral
Happiness             –                 2.60 (2.53)       5.94 (4.90)       14.29 (12.60)     22.22 (4.00)
Sadness               0.00 (9.76)       –                 5.94 (6.29)       17.86 (14.96)     33.33 (80.00)
Fear                  0.00 (2.44)       6.49 (15.19)      –                 7.14 (6.30)       44.44 (16.00)
Disgust               0.00 (1.22)       2.60 (1.27)       6.93 (5.59)       –                 0.00 (0.00)
Neutral               100.00 (86.59)    88.31 (81.01)     81.19 (83.22)     60.71 (66.14)     –

B. Wrong emotion recognition only — 9.80 ± 12.35% (13.87 ± 15.50%)

Recognized emotion    Happiness         Sadness           Fear              Disgust           Neutral
Happiness             –                 0.00 (0.00)       0.00 (0.00)       0.00 (5.56)       15.00 (26.32)
Sadness               5.56 (9.52)       –                 5.88 (9.68)       0.00 (2.78)       30.00 (42.11)
Fear                  0.00 (4.76)       16.67 (19.44)     –                 25.00 (2.78)      45.00 (26.32)
Disgust               0.00 (4.76)       16.67 (2.78)      0.00 (0.00)       –                 10.00 (5.26)
Neutral               94.44 (80.95)     66.67 (77.78)     94.12 (90.32)     75.00 (88.89)     –

C. Wrong emotional response only — 55.56 ± 20.13% (48.84 ± 23.45%)

Affective response    Happiness         Sadness           Fear              Disgust           Neutral
Happiness             –                 2.87 (5.98)       5.88 (6.01)       20.42 (18.82)     60.00 (45.00)
Sadness               0.83 (1.09)       –                 1.60 (2.19)       0.00 (1.18)       0.00 (30.00)
Fear                  0.83 (1.09)       5.17 (2.17)       –                 2.11 (1.18)       40.00 (20.00)
Disgust               0.83 (1.64)       1.15 (1.09)       1.07 (1.09)       –                 0.00 (5.00)
Neutral               97.52 (96.17)     90.80 (90.76)     91.44 (90.71)     77.46 (78.82)     –


each cell entry are based on a very restricted number of stimuli, which considerably reduces statistical power. Taken together, indicating a 'neutral' response to an emotional stimulus was the most common reason for misclassifying a target emotion, or for failing to match the own affective response to the target, in all subclasses and in both groups. Moreover, patients displayed a wider range of erroneous answers than controls.

3.4. Psychophysiological responses to the different emotion categories

In the first time window (TW1; 1–3 s), nGSR showed no significant effects of 'Group' (patients vs. controls) or 'Emotion Category'. In the second time window (TW2; 3–13 s), a GEE analysis of nGSR revealed significant main effects of 'Group' and 'Emotion Category' as well as a significant interaction effect (Table 5). Patients displayed more GSRs than controls in all emotion categories


(Fig. 2). For controls, post-hoc comparisons showed that happy video clips evoked more GSRs than sad (t = 4.07, P = 0.001) but not fearful (t = 2.39, P = 0.634) or disgusted (t = 2.36, P = 0.69) video clips. The number of GSRs also did not differ between sad and fearful (t = −1.52, P > 0.999), sad and disgusted (t = −2.27, P = 0.876), or fearful and disgusted (t = −0.66, P > 0.999) video clips. Neutral video clips evoked fewer GSRs than happy (t = −7.51), sad (t = −4.41), fearful (t = −5.85), and disgusted (t = −6.02) ones (all Ps < 0.001). For patients, the number of GSRs did not differ between emotion categories (happy vs. sad: t = 2.52, P = 0.422; happy vs. fear: t = 1.23, P > 0.999; happy vs. disgust: t = 1.56, P > 0.999; sad vs. fear: t = −1.71, P > 0.999; sad vs. disgust: t = −0.33, P > 0.999; fear vs. disgust: t = 0.77, P > 0.999). However, neutral video clips again evoked fewer GSRs than happy (t = −8.31), sad (t = −8.81), fearful (t = −7.46), and disgusted (t = −7.7) ones (all Ps < 0.001). The number of GSRs did not differ between groups in response to neutral clips (t = 1.73, P = 0.094).
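Counting GSRs within a time window, as in the nGSR measure above, can be sketched with a simple trough-to-peak criterion on the phasic signal. The 0.01 µS minimum amplitude and the use of a decomposed phasic signal follow common electrodermal practice (cf. Benedek and Kaernbach, 2010); the study's exact detection parameters are given in its Methods section, so the values below are assumptions:

```python
import numpy as np

def count_scrs(phasic, sample_rate, window=(3.0, 13.0), min_amp=0.01):
    """Count trough-to-peak rises of at least min_amp (µS) inside window (s)."""
    lo, hi = int(window[0] * sample_rate), int(window[1] * sample_rate)
    seg = phasic[lo:hi]
    n, trough, armed = 0, seg[0], True
    for prev, cur in zip(seg, seg[1:]):
        if cur < prev:                      # falling flank: track the trough, re-arm
            trough = min(trough, cur)
            armed = True
        elif armed and cur - trough >= min_amp:
            n += 1                          # one SCR counted for this rise
            armed = False                   # wait for the next trough before re-counting
            trough = cur
    return n

# Synthetic demo at 10 Hz: two 0.05 µS bumps inside the 3-13 s window
fs = 10.0
sig = np.zeros(200)
bump = np.concatenate([np.linspace(0, 0.05, 6), np.linspace(0.05, 0, 6)[1:]])
sig[50:61] += bump
sig[100:111] += bump
print(count_scrs(sig, fs))  # 2
```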

Table 5
Results of galvanic skin conductance parameters between groups and in different emotion categories. Responses to all video clips and to those which were empathically responded to are displayed separately. Values are M ± S.D. (All = whole sample, MDD = patients, HC = healthy controls). TW2 = time window 2; nGSR = number of galvanic skin responses.

nGSR (TW2)
            All             MDD             HC
Total       13.42 ± 9.20    17.19 ± 9.26    11.72 ± 6.73
Happy       18.14 ± 8.53    20.71 ± 9.82    15.72 ± 6.48
Sad         14.31 ± 8.22    18.00 ± 8.28    10.83 ± 6.65
Fear        15.91 ± 8.72    19.59 ± 9.59    12.44 ± 6.25
Disgust     15.66 ± 8.24    18.41 ± 8.86    13.06 ± 6.87
Neutral     7.86 ± 4.72     9.24 ± 5.12     6.56 ± 4.05
GEE: Group df = 1, Wald χ² = 6.03, P = 0.014*; Emotion df = 4, Wald χ² = 174.73, P < 0.001***; Emotion × Group df = 4, Wald χ² = 12.77, P = 0.012*

nGSR given SOMT (TW2)
            All             MDD             HC
Total       6.79 ± 6.16     8.14 ± 6.87     7.16 ± 5.27
Happy       10.68 ± 6.17    10.50 ± 6.61    10.83 ± 5.94
Sad         6.82 ± 6.92     8.38 ± 8.52     5.44 ± 4.95
Fear        6.47 ± 6.91     6.67 ± 8.08     6.29 ± 5.96
Disgust     7.45 ± 4.95     7.40 ± 6.00     7.50 ± 4.06
Neutral     6.63 ± 4.35     7.65 ± 4.87     5.67 ± 3.68
GEE: Group df = 1, Wald χ² = 4.17, P = 0.041*; Emotion df = 4, Wald χ² = 40.91, P < 0.001***; Emotion × Group df = 4, Wald χ² = 11.31, P = 0.023*

* P < 0.05; ** P < 0.01; *** P < 0.001.


Fig. 2. Physiological results. The number of galvanic skin responses during stimulus presentation (nGSR) is displayed on the left, and the number of galvanic skin responses during empathically responded clips only (nGSR_SOMT) on the right, for healthy controls (HC, white) and depressed patients (MDD, gray). Differences between groups and experimental conditions were tested in separate generalized linear models.



When examining GSRs for empathically responded clips (nGSR_SOMT), the GEE analysis showed significant main effects of 'Group' and 'Emotion Category' as well as a significant interaction effect. Pairwise comparisons of the interaction effect indicated that, in controls, happy video clips evoked more GSRs than sad (t = 4.58, P < 0.001) and disgusted (t = 3.58, P < 0.01) video clips. The differences in nGSR_SOMT between happy and fearful (t = 2.78, P > 0.999), between sad and fearful (t = −0.6, P > 0.999) or disgusted (t = −2.77, P = 0.911), and between fearful and disgusted (t = −1.49, P > 0.999) video clips were not significant. In contrast, patients displayed no differences in the number of GSRs between happy and sad (t = 1.55, P > 0.999), happy and fearful (t = 3.26, P > 0.999), happy and disgusted (t = 3.46, P = 0.051), sad and fearful (t = 2.83, P > 0.999), sad and disgusted (t = 1.01, P > 0.999), or fearful and disgusted (t = −0.55, P > 0.999) video clips. There was also no group difference in nGSR_SOMT in response to neutral clips (t = −1.36, P = 0.182). The overall baseline level was higher in patients than in controls (t = 3.73, P = 0.001). For an overview of nGSR and nGSR_SOMT results, see Fig. 2. Correlation analyses showed a relationship between behavioral and physiological measures of emotional processing. More specifically, SOMT rates correlated positively with the number of GSRs evoked by empathy-inducing video clips in patients (r = 0.752, P < 0.001) but not in controls (r = 0.401, P = 0.099). When participants indicated the intensity of their affective response under the constraint of SOMT, higher intensity ratings were accompanied by generally more GSRs in controls (r = 0.600, P = 0.008) and non-significantly fewer GSRs in patients (r = −0.247, P = 0.339). No correlations were found for the other behavioral measures. In sum, emotional video clips that were empathically responded to evoked more GSRs in patients than in controls.
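The brain-behavior correlations above are plain Pearson coefficients between per-subject scores, computed as sketched below (the somt and gsr vectors are invented for illustration, not the study's data):

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Illustrative per-subject SOMT rates and GSR counts (not the study's data)
somt = [55.0, 60.0, 48.0, 72.0, 41.0]
gsr = [7.0, 9.0, 6.0, 11.0, 5.0]
print(round(pearson_r(somt, gsr), 3))
```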
Likewise, patients generally responded to emotional video clips with more GSRs overall, while in controls happy video clips evoked more GSRs than sad ones.

3.5. Neuropsychological data

Neuropsychological assessment showed that patients scored lower than controls on tests of crystallized verbal intelligence (WST) and executive functions (TMT-A) (Table 1). Results of the empathy measures showed that patients had a lower overall SPF score and higher ratings on the 'Personal Distress' scale than controls; the groups did not differ on the other empathy components. However, patients scored higher than controls on all three subscales of the TAS-20 ('DDF' = Difficulty Describing Feelings, 'DIF' = Difficulty Identifying Feelings, and 'EOT' = Externally Oriented Thinking), indicating possible alexithymia (Bagby et al., 1994). With regard to emotion recognition from photographs, results of the 'VERT-K' indicated that both groups recognized the emotions of static faces equally well (independent t-test: t = 0.54; P = 0.589).

4. Discussion

The present study investigated empathic processing in major depressive disorder (MDD) with several important differences from previous studies: the use of naturalistic dynamic stimuli; the simultaneous assessment of different empathy components, namely emotion recognition and emotion experience; a multimethod approach measuring emotional responses on a behavioral as well as a physiological level; and the possibility to investigate the influence of accessible verbal information on these variables. Both explicit behavioral and implicit physiological measures suggested a general deficit of emotion processing in

patients compared to healthy controls. The assumptions of the ECI theory were confirmed on the behavioral level (Rottenberg, 2005), while physiological responses were more in accordance with the 'limbic-cortical dysregulation' model. Our results point to specific impairments in all components of empathic processing in major depression.

4.1. All components of empathy are disturbed in depression

Compared to controls, patients showed a general deficit in the empathic processing of dynamic stimuli. That is, they performed worse in recognizing a dynamically presented target emotion and in matching their own affective response to the target emotion, and consequently displayed lower empathy rates. The demonstrated emotion recognition deficits did not manifest themselves for static emotional displays ('VERT-K'), which suggests that they arise from the higher complexity of the stimulus material, in which facial expression, prosody, and speech content have to be integrated to infer the emotion category. Across both groups, an error analysis showed that the major factor driving the decreased empathy ratings was a mismatch between participants' subjective emotional experience and the target emotion, not incorrect emotion recognition. This underlines the importance of including affective response ratings when investigating empathy and confirms previous results of our group (Derntl et al., 2009; Regenbogen et al., 2012a,b). That patients had more difficulty gaining access to their emotional experience (lower affective response rates) was also supported by higher alexithymia scores on all subscales of the TAS-20. The concept of empathy as operationalized in our study thus depended on correct emotion recognition and a congruent affective response at the same time. Importantly, no differences between patients and controls emerged for the recognition of (and the affective response towards) neutral video clips.
This finding supports the notion of a specific deficit in emotion processing rather than a problem in processing facial expression, prosody, or speech content per se, as suggested by theories of a general perceptual deficit in depression (Asthana et al., 1998). While our behavioral results support the ECI theory of depression (Rottenberg, 2005), our physiological findings of elevated autonomic arousal contradict it. This dissociation has been shown previously (Brown et al., 1978; Guinjoan et al., 1995; Gross and Levenson, 1997; Falkenberg et al., 2012). As GSR activity reflects sympathetic tone, it is often used as an indirect measure of attentional processes, cognitive effort, and emotional arousal (Critchley et al., 2000). Dynamic stimuli require the integration of multiple communication channels (Regenbogen et al., 2012b) as well as a combined recruitment of attentional, cognitive, and emotional processes to a much larger extent than static displays of emotion. Patients have been shown to have deficits in the integration of these domains (for a review see Ottowitz et al., 2002), and their elevated autonomic activity may derive from these deficits. While this shows that stimulus characteristics may influence autonomic arousal to a noticeable extent, on a broader level the variance in stimulus, subject, and study-design characteristics may help to explain the heterogeneity of GSR findings (Venables and Mitchell, 1996). That emotional video clips evoked stronger autonomic arousal than neutral clips in both patients and controls shows that the elevated GSRs were specific to emotional content. Importantly, the physiological response to neutral video clips did not differ between groups, which points to increased reactivity among depressed subjects elicited specifically by affective stimuli. However, patients had a higher baseline level of autonomic arousal, possibly

D. Schneider et al. / Psychiatry Research 200 (2012) 294–305

attributed to hyper-arousal in limbic regions, which has to be taken into account when interpreting these findings. Taken together, both behavioral and physiological results support the notion of low coherence across different domains of emotional functioning and can be interpreted in the light of previous neuroimaging findings (e.g. Drevets, 1999, 2001). More specifically, the 'limbic-cortical dysregulation' model of depression can explain the behavioral and psychophysiological effects as a result of disparate cortical and limbic network changes associated with the disorder. That is, the co-occurrence of hypo-responsiveness in prefrontal brain regions such as the prefrontal cortex and the cingulate gyrus and hyper-responsiveness of limbic structures such as the hypothalamus, hippocampus, and amygdala (Mayberg, 1997) might relate, respectively, to patients' poor performance on the behavioral level, via a lack of attribution of emotional significance by prefrontal areas, and to their increased autonomic arousal on the physiological level. Both the ECI theory and the 'limbic-cortical dysregulation' model predict behavioral deficits in emotional reactivity but offer different explanations. The former postulates depressed mood states as internal signals that bias an organism against action; the latter describes neuroscientific findings on the nature of impaired emotion processing. In other words, the ECI theory does not directly address emotion processing abilities per se but rather offers an evolutionary account of deficient motivational and reward processes, whereas the 'limbic-cortical dysregulation' model relates more directly to the proximate bases of mood and emotional processing. The present study suggests that deficits in empathic processing are best interpreted in the light of the latter. However, we would like to stress that these are only possible explanations for our results, based on the existing models.
Since this was a behavioral study, we cannot sufficiently answer these questions. Future studies should further investigate the exact nature of emotional processing in depression, including affective and cognitive components of empathy, by combining behavioral, physiological, and neuroimaging tools to elucidate the implications and possible commonalities of both theories.

4.2. The importance of verbal information in emotional processing

Both explicit and implicit experimental measures suggest the importance of accessible verbal information for different aspects of empathic processing in patients and controls. That is, both groups showed higher behavioral empathy rates (and rates of its components) when the presented actor was understandable, i.e., spoke in the native language of the participants ('cE'), compared to a foreign language ('iE'). This finding is in line with the idea that emotional information conveyed via multiple modalities simultaneously facilitates emotion transfer and emotion contagion (Massaro and Egan, 1996; Dolan et al., 2001; de Gelder et al., 2003; Ethofer et al., 2006) and challenges communication theories which may underestimate the role of verbal signals in emotion processing (Argyle et al., 1970, 1971; Mehrabian, 1981).

4.3. Limitations

The significant data loss in the GSC measurements reduced the sample size considerably, so the psychophysiological results and their correlations with the behavioral measures should be interpreted with caution. Also, we provided isomorphic emotion categories when inquiring about participants' own emotional state, which prevented them from responding, for example, to a sad video clip with pity; limiting the answer spectrum in this way might constrain the interpretation of experimental empathy. In addition, the interpretation of multimodal effects conveyed in dynamic video sequences on rating scores, intensity levels, and physiological responses becomes more difficult to control with the growing complexity of stimulus characteristics. Our experimental and external (questionnaire-based) measures of empathy reflect different contextual approaches, which might explain the lack of correlation between these constructs and demonstrates the limitations of self-reports in assessing emotion processing capacities. Self-reports may provide only limited access to the exact nature and extent of empathic responses and are subject to response tendencies; although informative, they are not sufficient on their own to characterize empathic abilities. Indeed, previous studies have shown that a lack of correlation between self-report and experimental measures of empathy highlights the importance of behavioral tasks when characterizing empathic abilities (Dimaggio et al., 2008; Derntl et al., 2009, 2010). Finally, while antidepressant medication generally limits the explanatory power of experimental effects, the lack of correlations of antidepressant drug treatment and symptom severity with the outcome measures suggests that empathic deficits might be a trait rather than a state characteristic of depression (Kan et al., 2004).

5. Conclusion

The present study extends pre-existing theories of a general emotion processing deficit in depression to the domain of impaired empathic functioning, examined in an authentic and socially relevant setting. Our behavioral findings support response-domain-specific deficits in accordance with the ECI theory, while the 'limbic-cortical dysregulation' model of depression accommodates both the explicit and the implicit measures of emotion processing. Accessible speech content serves as an important cue for understanding and sharing the emotion of another person, as well as for feeling empathy.

Acknowledgments

This work was supported by the Interdisciplinary Center for Clinical Research of the Medical Faculty of the RWTH Aachen University (IZKF, N2-6 and N4-4), the International Research Training Group (IRTG 1328), DFG-KFO 112, and HA 3203/7-1 (all funded by the German Research Foundation (DFG)), as well as the Initiative and Networking Fund of the Helmholtz Association (Helmholtz Alliance for Mental Health in an Aging Society, HelMA, HA-125). We are very grateful to Jonas Albers, Anna Bartsch, Jörn Kraemer, Carla Schirk, and Timur K. Toygar for their assistance, and to all our participants.

References

Argyle, M., Alkema, F., Gilmour, R., 1971. The communication of friendly and hostile attitudes by verbal and non-verbal signals. European Journal of Social Psychology 1, 385–402.
Argyle, M., Salter, V., Nicholson, H., Williams, M., Burgess, P., 1970. The communication of inferior and superior attitudes by verbal and non-verbal signals. British Journal of Social Psychology 1, 247–258.
Aschenbrenner, S., Tucha, O., Lange, K.W., 2000. Regensburger word fluency test (Regensburger Wortflüssigkeits-Test, RWT). Hogrefe, Göttingen.
Asthana, H.S., Mandal, M.K., Khurana, H., Haque-Nizamie, S., 1998. Visuospatial and affect recognition deficit in depression. Journal of Affective Disorders 48, 57–62.
Bagby, R.M., Taylor, G.J., Ryan, D., 1986. Toronto alexithymia scale: relationship with personality and psychophysiology measures. Psychotherapy and Psychosomatics 45, 207–215.



Bagby, R.M., Parker, J.D.A., Taylor, G.J., 1994. The twenty-item Toronto Alexithymia Scale-I. Item selection and cross-validation of the factor structure. Journal of Psychosomatic Research 38, 23–32. Batson, C.D., 2009. These things called empathy. In: Decety, J., Ickes, W. (Eds.), The Social Neuroscience of Empathy. MIT Press, Cambridge, Ma, pp. 3–16. Beauregard, M., Paquette, V., Le´vesque, J., 2006. Dysfunction in the neural circuitry of emotional self-regulation in major depressive disorder. NeuroReport 17, 843–846. Beck, A.T., Steer, R.A., Brown, G.K., 1996. Manual for the Beck Depression Inventory-II. Psychological Corporation, San Antonio, TX. Benedeck, M., Kaernbach, C., 2010. A continuous measure of phasic electrodermal activity. Journal of Neuroscience Methods 190, 80–91. Bouhuys, A.L., Geerts, E., Mersch, P.P.A., 1996. Relationship between perception of facial emotions and anxiety in clinical depression: does anxiety-related perception predict persistence of depression? Journal of Affective Disorders 43, 213–223. Bourke, C., Douglas, K., Porter, R., 2010. Processing facial emotion expression in major depression: a review. Australian and New Zealand Journal of Psychiatry 44, 681–696. Brown, S.L., Schwartz, G.E., Sweeney, D.R., 1978. Dissociation of self-reported and observed pleasure in depression. Psychosomatic Medicine 40, 536–548. ¨ ¨ Brune, M., Brune-Cohrs, U., 2006. Theory of mind—evolution, ontogeny, brain mechanisms and psychopathology. Neuroscience and Biobehavioral Reviews 30, 437–455. Bylsma, L.M., Morris, B.H., Rottenberg, J., 2008. A meta-analysis of emotional reactivity in major depressive disorder. Clinical Psychology Review 28, 676–691. Crichley, H.D., Elliott, R., Mathias, C.J., Dolan, R.J., 2000. Neural activity relating to generation and representation of galvanic skin conductance responses: a functional magnetic resonance imaging study. The Journal of Neuroscience 20, 3033–3040. 
Cusi, A.M., Nazarov, A., Holshausen, K., MacQueen, G.M., McKinnon, M.C., 2012. Systematic review of the neural basis of social cognition in patients with mood disorders. Journal of Psychiatry and Neuroscience 1, 1–16.
Damasio, A.R., 1996. The somatic marker hypothesis and the possible functions of the prefrontal cortex. Philosophical Transactions of the Royal Society of London B 351, 1413–1420.
Davis, M.H., 1980. A multidimensional approach to individual differences in empathy. Catalog of Selected Documents in Psychology 10, 85.
Dawson, M.E., Schell, A.M., Catania, J.J., 1977. Autonomic correlates of depression and clinical improvement following electroconvulsive shock therapy. Psychophysiology 14, 569–578.
Dawson, M.E., Schell, A.M., Braaten, J.R., Catania, J.J., 1985. Diagnostic utility of autonomic measures for major depressive disorders. Psychiatry Research 15, 261–270.
de Gelder, B., Vroomen, J., 2000. The perception of emotions by ear and by eye. Cognition and Emotion 14, 289–311.
de Gelder, B., Vroomen, J., Annen, L., Masthof, E., Hodiamont, P., 2003. Audio-visual integration in schizophrenia. Schizophrenia Research 59, 211–218.
de Vignemont, F., Singer, T., 2006. The empathic brain: how, when and why? Trends in Cognitive Sciences 10, 435–441.
Decety, J., Jackson, P.L., 2004. The functional architecture of human empathy. Behavioral and Cognitive Neuroscience Reviews 3, 71–100.
Derntl, B., Finkelmeyer, A., Toygar, T.K., Hülsmann, A., Schneider, F., Falkenberg, D.I., 2009. Generalized deficit in all core components of empathy in schizophrenia. Schizophrenia Research 108, 197–206.
Derntl, B., Finkelmeyer, A., Eickhoff, S., Kellermann, T., Falkenberg, D.I., Schneider, F., Habel, U., 2010. Multidimensional assessment of empathic abilities: neural correlates and gender differences. Psychoneuroendocrinology 35, 67–82.
Diener, C., Kuehner, C., Brusniak, W., Struve, M., Flor, H., 2009. Effects of stressor controllability on psychophysiological, cognitive and behavioural responses in patients with major depression and dysthymia. Psychological Medicine 39, 77–86.
Dolan, R.J., Morris, J.S., de Gelder, B., 2001. Crossmodal binding of fear in voice and face. Proceedings of the National Academy of Sciences 98, 10006–10010.
Drevets, W.C., 1999. Prefrontal cortical-amygdalar metabolism in major depression. Annals of the New York Academy of Sciences 877, 614–637.
Drevets, W.C., 2001. Neuroimaging and neuropathological studies of depression: implications for the cognitive-emotional features of mood disorders. Current Opinion in Neurobiology 11, 240–249.
Dziobek, I., Fleck, S., Kalbe, E., Rogers, K., Hassenstab, J., Brand, M., Kessler, J., Woike, J.K., Wolf, O.T., Convit, A., 2006. Introducing MASC: a movie for the assessment of social cognition. Journal of Autism and Developmental Disorders 36, 623–636.
Ekman, P., 1982. Emotion in the human face. Cambridge University Press, New York.
Ekman, P., 2003. Emotions inside out: 130 years after Darwin's The Expression of the Emotions in Man and Animals. New York Academy of Sciences, New York.
Elliott, R., Rubinsztein, J.S., Sahakian, B.J., Dolan, R.J., 2002. The neural basis of mood-congruent processing biases in depression. Archives of General Psychiatry 59, 597–604.
Emerson, C.S., Harrison, D.W.H., Everhart, D.E., 1999. Investigation of affective prosodic ability in school-aged boys with and without depression. Neuropsychiatry, Neuropsychology, and Behavioral Neurology 12, 102–109.
Ethofer, T., Pourtois, G., Wildgruber, D., 2006. Investigating audiovisual integration of emotional signals in the human brain. Progress in Brain Research 156, 345–361.

Falkenberg, I., Kohn, N., Schoepker, R., Habel, U., 2012. Mood induction in depressive patients: a comparative multidimensional approach. PLoS ONE 7, e30016.
Feinberg, T.E., Rifkin, A., Schaffer, C., Walker, E., 1986. Facial discrimination and emotional recognition in schizophrenia and affective disorders. Archives of General Psychiatry 43, 276–279.
Foti, D., Olvet, D.M., Klein, D.N., Hajcak, G., 2010. Reduced electrocortical response to threatening faces in major depressive disorder. Depression and Anxiety 27, 813–820.
Fredrikson, M., Furmark, T., Olsson, M.T., Fischer, H., Andersson, J., Långström, B., 1998. Functional neuroanatomical correlates of electrodermal activity: a positron emission tomographic study. Psychophysiology 35, 179–185.
Goodwin, A.M., Williams, J.M.G., 1982. Mood-induction research—its implications for clinical depression 20, 373–382.
Gotlib, I.H., Krasnoperova, E., Neubauer Yue, D., Joormann, J., 2004. Attentional biases for negative interpersonal stimuli in clinical depression. Journal of Abnormal Psychology 113, 127–135.
Gross, J.J., Levenson, R.W., 1997. Hiding feelings: the acute effects of inhibiting negative and positive emotion. Journal of Abnormal Psychology 106, 95–103.
Guinjoan, S.M., Bernabó, J.L., Cardinali, D.P., 1995. Cardiovascular tests of autonomic function and sympathetic skin responses in patients with major depression. Journal of Neurology, Neurosurgery, and Psychiatry 58, 299–302.
Gur, R.C., Erwin, R.J., Gur, R.E., Zwil, A.S., Heimberg, C., Kraemer, H.C., 1992. Facial emotion discrimination, II: behavioral findings in depression. Psychiatry Research 41, 241–251.
Hale, W.W., 1998. Judgement of facial expressions and depression persistence. Psychiatry Research 80, 265–274.
Hamilton, M., 1960. A rating scale for depression. Journal of Neurology, Neurosurgery, and Psychiatry 23, 56–61.
Henriques, J.B., Davidson, R.J., 1991. Left frontal hypoactivation in depression. Journal of Abnormal Psychology 100, 535–545.
Kan, Y., Mimura, M., Kamijima, K., 2004. Recognition of emotion from moving facial and prosodic stimuli in depressed patients. Journal of Neurology, Neurosurgery and Psychiatry 75, 1667–1671.
Keltner, D., Gross, J.J., 1999. Functional accounts of emotions. Cognition and Emotion 13, 467–480.
Laine, C.M., Spitler, K.M., Mosher, C.P., Gothard, K.M., 2009. Behavioral triggers of skin conductance responses and their neural correlates in the primate amygdala. Journal of Neurophysiology 101, 1749–1754.
LeDoux, J., 1998. Neural circuits underlying anxiety and fear. Biological Psychiatry 44, 1229–1238.
Lee, L., Harkness, K.L., Sabbagh, M.A., Jacobsen, J.A., 2005. Mental state decoding abilities in clinical depression. Journal of Affective Disorders 86, 247–258.
Leiberg, S., Anders, S., 2006. The multiple facets of empathy: a survey of theory and evidence. Progress in Brain Research 156, 419–440.
Leibetseder, M., Laireiter, A.-R., Köller, T., 2007. Structural analysis of the E-scale. Personality and Individual Differences 3, 547–561.
Leppänen, J.M., Milders, M., Bell, J.S., Terriere, E., Hietanen, J.K., 2004. Depression biases the recognition of emotionally neutral faces. Psychiatry Research 128, 123–133.
Leppänen, J.M., 2006. Emotional information processing in mood disorders: a review of behavioral and neuroimaging findings. Current Opinion in Psychiatry 19, 34–39.
Levkovitz, Y., Lamy, D., Ternochiano, P., Treves, I., Fennig, S., 2003. Perception of dyadic relationship and emotional states in patients with affective disorder. Journal of Affective Disorders 75, 19–28.
Lewis, M., Haviland-Jones, J.M., Feldman Barrett, L., 2008. Handbook of emotions, third ed. Guilford Press, New York.
Massaro, D.W., Egan, P.B., 1996. Perceiving affect from the voice and the face. Psychonomic Bulletin and Review 3, 215–221.
Matthews, G.R., Antes, J.R., 1992. Visual attention and depression: cognitive biases in the eye fixations of the dysphoric and the nondepressed. Cognitive Therapy and Research 16, 359–371.
Mayberg, H.S., 1997. Limbic-cortical dysregulation: a proposed model of depression. Journal of Neuropsychiatry and Clinical Neurosciences 9, 471–481.
Mayberg, H.S., Liotti, M., Brannan, S.K., McGinnis, S., Mahurin, R.K., Jerabek, P.A., Silvia, J.A., Tekell, J.L., Martin, C.C., Lancaster, J.L., Fox, P.T., 1999. Reciprocal limbic-cortical function and negative mood: converging PET findings in depression and normal sadness. American Journal of Psychiatry 156, 675–682.
Mehrabian, A., 1981. Silent messages: implicit communication of emotions and attitudes. Wadsworth, Belmont, CA.
Nelson, L.D., Stern, S.L., 1988. Mood induction in a clinically depressed population. Journal of Psychopathology and Behavioral Assessment 10, 277–285.
Nesse, R.M., 2000. Is depression an adaptation? Archives of General Psychiatry 57, 14–20.
Ochsner, K.N., Bunge, S.A., Gross, J.J., Gabrieli, J.D., 2002. Rethinking feelings: an fMRI study of the cognitive regulation of emotion. Journal of Cognitive Neuroscience 14, 1215–1229.
Oldfield, R.C., 1971. The assessment and analysis of handedness: the Edinburgh inventory. Neuropsychologia 9, 97–113.
Ottowitz, W.E., Dougherty, D.D., Savage, C.R., 2002. The neural network basis of abnormalities of attention and executive function in major depressive disorder: implication of the medical disease model to psychiatric disorders. Harvard Review of Psychiatry 10, 86–99.
Persad, S.M., Polivy, J., 1993. Differences between depressed and nondepressed individuals in the recognition of and response to facial emotional cues. Journal of Abnormal Psychology 102, 358–368.


Phan, K.L., Wager, T., Taylor, S.F., Liberzon, I., 2002. Functional neuroanatomy of emotion: a meta-analysis of emotion activation studies in PET and fMRI. NeuroImage 16, 331–348.
Plutchik, R., 1982. A psychoevolutionary theory of emotions. Social Science Information 21, 529–553.
Plutchik, R., Kellermann, H., 1989. Emotion: Theory, Research, and Experience, vol. 1–4. Academic Press, New York.
Preston, S.D., de Waal, F.B.M., 2002. Empathy: its ultimate and proximate bases. Behavioral and Brain Sciences 25, 1–72.
Raes, F., Hermans, D., Williams, J.M.G., 2006. Negative bias in the perception of others' facial emotional expressions in major depression: the role of depressive rumination. Journal of Nervous and Mental Disease 194, 796–799.
Regenbogen, C., Schneider, D.A., Finkelmeyer, A., Kohn, N., Derntl, B., Kellermann, T., Gur, R.E., Schneider, F., Habel, U., 2012a. The differential contribution of facial expressions, prosody, and speech content to empathy. Cognition and Emotion 1, 1–20.
Regenbogen, C., Schneider, D.A., Gur, R.E., Schneider, F., Habel, U., Kellermann, T., 2012b. Multimodal human communication—targeting facial expressions, speech content and prosody. NeuroImage 60, 2346–2356.
Reitan, R.M., 1992. Trail Making Test: manual for administration and scoring. Reitan Neuropsychology Laboratory, South Tucson, Arizona, USA.
Rottenberg, J., Gross, J.J., Gotlib, I.H., 2005. Emotion context insensitivity in major depressive disorder. Journal of Abnormal Psychology 4, 627–639.
Rottenberg, J., 2005. Mood and emotion in major depression. Current Directions in Psychological Science 14, 167–170.
Rottenberg, J., Gross, J.J., 2007. Emotion and emotion regulation: a map for psychotherapy researchers. Clinical Psychology: Science and Practice 14, 323–328.
Rubinow, D.R., Post, R.M., 1992. Impaired recognition of affect in facial expression in depressed patients. Biological Psychiatry 31, 947–953.
Schmidt, K.H., Metzler, P., 1992. Wortschatztest (WST). Beltz, Weinheim.
Seubert, J., Kellermann, T., Loughead, J., Boers, F., Brensinger, C., Schneider, F., Habel, U., 2010. Processing of disgusted faces is facilitated by odor primes: a functional MRI study. NeuroImage 53, 746–756.
Siegle, G.J., Thompson, W., Carter, C.S., Steinhauer, S.R., Thase, M.E., 2007. Increased amygdala and decreased dorsolateral prefrontal BOLD responses in unipolar depression: related and independent features. Biological Psychiatry 61, 198–209.
Singer, T., 2006. The neuronal basis and ontogeny of empathy and mind reading: review of literature and implications for future research. Neuroscience and Biobehavioral Reviews 30, 855–863.
Singer, T., Lamm, C., 2009. The social neuroscience of empathy. Annals of the New York Academy of Sciences 1156, 81–96.
Sloan, D.M., Strauss, M.E., Quirk, S.W., Sajatovic, M., 1997. Subjective and expressive emotional response in depression. Journal of Affective Disorders 46, 135–141.


Stevenson, R.A., VanDerKlok, R.M., Pisoni, D.B., James, T.W., 2011. Discrete neural substrates underlie complementary audiovisual speech integration processes. NeuroImage 55, 1339–1345.
Surguladze, S., Senior, C., Young, A.W., Brébion, G., Travis, M.J., Phillips, M.L., 2004. Recognition accuracy and response bias to happy and sad facial expressions in patients with major depression. Neuropsychology 18, 212–218.
Surguladze, S., Brammer, M.J., Keedwell, P.K., Giampietro, V., Young, A.W., Travis, M.J., Williams, S.C.R., Phillips, M.L., 2005. A differential pattern of neural response toward sad versus happy facial expressions in major depressive disorder. Biological Psychiatry 57, 201–209.
Suslow, T., Dannlowski, U., Lalee-Mentzel, J., Donges, U.S., Arolt, V., Kersting, A., 2004. Spatial processing of facial emotion in patients with unipolar depression: a longitudinal study. Journal of Affective Disorders 83, 59–63.
Suslow, T., Konrad, C., Kugel, H., Rumstadt, D., Zwitserlood, P., Schöning, S., Ohrmann, P., Bauer, J., Pyka, M., Kersting, A., Arolt, V., Heindel, W., Dannlowski, U., 2010. Automatic mood-congruent amygdala responses to masked facial expressions in major depression. Biological Psychiatry 67, 155–160.
Teasdale, J.D., 1983. Negative thinking in depression: cause, effect, or reciprocal relationship? Advances in Behaviour Research and Therapy 5, 3–25.
Trautmann, S.A., Fehr, T., Herrmann, M., 2009. Emotions in motion: dynamic compared to static facial expressions of disgust and happiness reveal more widespread emotion-specific activations. Brain Research 1284, 100–115.
Tsai, J.L., Pole, N., Levenson, R.W., Muñoz, R.F., 2003. The effects of depression on the emotional responses of Spanish-speaking Latinas. Cultural Diversity and Ethnic Minority Psychology 9, 49–63.
Venables, P.H., Mitchell, D.A., 1996. The effects of age, sex and time of testing on skin conductance activity. Biological Psychology 43, 87–101.
Wells, K.B., Stewart, A., Hays, R.D., Burnam, M.A., Rogers, W., Daniels, M., Berry, S., Greenfield, S., Ware, J., 1989. The functioning and well-being of depressed patients: results from the medical outcomes study. The Journal of the American Medical Association 262, 914–919.
Weyers, P., Mühlberger, A., Hefele, C., Pauli, P., 2006. Electromyographic responses to static and dynamic avatar emotional facial expressions. Psychophysiology 43, 450–453.
Williams, L.M., Phillips, M.L., Brammer, M.J., Skerrett, D., Lagopoulos, J., Rennie, C., Bahramali, H., Olivieri, G., David, A.S., Peduto, A., Gordon, E., 2001. Arousal dissociates amygdala and hippocampal fear responses: evidence from simultaneous fMRI and skin conductance recording. NeuroImage 14, 1070–1079.
Wittchen, H.U., Zaudig, M., Fydrich, T., 1997. Strukturiertes Klinisches Interview für DSM-IV: SKID. Hogrefe, Göttingen.
Wolkenstein, L., Schönenberg, M., Schirm, E., Hautzinger, M., 2011. I can see what you feel, but I can't deal with it: impaired theory of mind in depression. Journal of Affective Disorders 132, 104–111.