Regular Article
Published Online: 1 February 2003

Regional Cerebral Blood Flow Changes During Visually Induced Subjective Sadness in Healthy Elderly Persons

Publication: The Journal of Neuropsychiatry and Clinical Neurosciences

Abstract

This study examined regional cerebral blood flow (rCBF) changes associated with visually induced sad affect in healthy elderly persons. Subjects viewed sadness-laden, happiness-laden, and emotionally neutral image sets while rCBF was recorded using [15O] water PET. The sad image set included human faces and scenery/objects (“scenes”). To control for secondary sensory processing, the neutral and happy comparison sets each included exclusively either human faces or scenes. During the sad condition, the ventral prefrontal and temporal cortices were more active than during the happy and neutral scenes conditions, and the thalamus was more active than during the happy and neutral faces conditions. Ventral prefrontal cortex and thalamus were thus associated with processing of sad visual stimuli, whether compared with neutral or happy stimuli. The specific findings associated with sad affect were contingent on the content of the comparison stimuli (scenes or human faces), not on their affect (i.e., whether the comparison was neutral or happy).
Functional imaging studies are beginning to define the role of human limbic and paralimbic structures in emotion processing.1–12 Because most studies have focused on younger individuals, little is known about the emotion-related neural circuitry of older persons. The first study examining emotion-evoked regional cerebral blood flow (rCBF) changes in elderly persons13 found that visually induced happiness, fear, and disgust were associated with increased rCBF in emotion-specific limbic structures. Sadness, however, was not examined. The main aim of the present study therefore was to examine regional brain activity during sad affect compared with neutral and happy affect induced by visual stimuli. The identification of the functional neuroanatomy associated with processing of sadness-inducing stimuli in healthy elderly subjects will serve as a basis for examining, at the systems level, the neurobiological changes occurring during aging14 that may influence the phenomenology and pathophysiology of depression and depressive disorders in elderly individuals.
Because this is one of the first investigations of functional imaging of emotion processing in elderly persons, its purpose remained largely exploratory rather than hypothesis testing. We considered, however, that in younger adults the ventral medial prefrontal cortex has been consistently reported to be a region of the neural circuit associated with sad affect. Implication of the ventral medial prefrontal cortex (including the subgenual cingulate cortex) in emotion regulation spans from physiological sad emotion to clinical depression.8,15,16 The importance of the subgenual prefrontal cortex in clinical depression is confirmed by the role it plays in modulating serotonergic, noradrenergic, and dopaminergic neurotransmitter systems targeted by antidepressant drugs.17

METHODS

Subjects

Subjects were 17 healthy elderly volunteers (13 right-handed; 8 females and 9 males) recruited from the community. Average age was 65 years (SD=7.3; range 57–79), and mean education level was 14.5 years (SD=3.4; range 9–19). Subjects had no history of psychiatric or neurological disorder or alcohol/substance abuse, no current use of psychotropic medications, and no gross brain abnormalities on magnetic resonance (MR) scans. Psychiatric morbidity was ruled out by using the Structured Clinical Interview for DSM-IV (SCID)18 and the Present State Examination.19 Medical and psychiatric records were collected as well. Mean full-scale IQ was 116.7 (SD=16.0; range 93–150), verbal IQ was 110.4 (SD=12.0, range 93–130), and performance IQ was 120.7 (SD=17.6, range 83–150). Scores on the Benton Facial Recognition test long form (mean=46.5, SD=3.5, range 41–51) and on the Visual Form Discrimination test (mean=29.4, SD=2.4, range 26–32) were within the high end of the normative curve.20 All subjects gave written informed consent to protocols approved by the University of Iowa Human Subjects Institutional Review Board.

Activation Stimuli

This report is based on data collected as part of a larger project undertaken to examine the functional neuroanatomy associated with several emotional states in healthy individuals and in subjects with brain damage. This study focuses on sadness in elderly volunteers in comparison with emotionally neutral and happy conditions.
Emotion in the project was induced through visual stimuli containing either nonfamiliar human faces or objects and scenery. Subjects were told that they would be watching pictures with emotional content and that they should let the pictures influence their emotional state. Subjects were also made aware that they would be asked to rate their feelings (i.e., happiness, amusement, sadness, fear, disgust, anger, and surprise) and arousal intensity by using verbal analog scales ranging from 0 to 10 (0=absence of feeling, 10=very intense feeling).
Stimuli were chosen from a large database of standardized emotionally evocative color pictures producing highly reliable affective and psychophysiological responses.21 Full description of individual images is available from the authors on request. The emotionally neutral and happy stimuli included exclusively either human faces or objects, scenery, animals, or landscapes (“scenes”). Thus, in the neutral and happy sets of stimuli made up of “scenes,” no human faces were included. Sad affect, however, is a relatively difficult emotional state to induce in elderly persons by using stimuli including exclusively human faces (Paradiso and Robinson, unpublished data). Viewing of sad faces may induce compassion and empathy, but less so sadness. Hence we felt that images eliciting sadness should include a combination of scenes depicting filth, squalor, desperation, malnourishment, disease, and death with stimuli portraying the facial features of suffering individuals.
The sad condition was contrasted with the two happiness-laden conditions (only faces or only scenes), which showed appetitive satisfaction, beauty, and success, and with the two emotionally neutral conditions (only faces or only scenes).
Five sets of 18 complex images were selected on the basis of their cumulative normative valence score (scale: 0=very negative, 9=very positive) and of the response of a group of raters in Iowa (not included in the study). The mean valence was computed as the sum of all individual pictures' normative valence scores divided by the number of pictures in the set. The mean valences were as follows: sad sequence (faces combined with scenes), 2.57 (SD=0.66); neutral faces, 5.48 (SD=0.80); neutral scenes, 5.53 (SD=0.78); happy faces, 7.15 (SD=0.71); and happy scenes, 7.32 (SD=0.47). Mean arousal scores were computed as the sum of all individual pictures' normative arousal scores divided by the number of pictures in the set. The mean arousal scores were as follows: sad sequence, 4.98 (SD=0.90); neutral faces, 3.89 (SD=0.73); neutral scenes, 3.70 (SD=0.92); happy faces, 4.45 (SD=0.65); and happy scenes, 4.68 (SD=1.09).
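For illustration, the set-level statistics above are simple averages of the normative ratings of the 18 pictures in each set. The following is a minimal sketch in Python; the per-image scores are hypothetical placeholders, not the actual IAPS values used in the study.

```python
# Minimal sketch of the set-level valence computation described above.
# The per-image normative scores below are hypothetical, not the IAPS values.

def set_mean(scores):
    """Mean of the individual pictures' normative scores in one 18-image set."""
    return sum(scores) / len(scores)

# Hypothetical normative valence scores (0 = very negative, 9 = very positive)
# for one 18-picture set.
sad_set_valence = [2.1, 3.0, 2.4, 2.8, 1.9, 2.6, 3.1, 2.2, 2.7,
                   2.5, 2.9, 2.3, 2.0, 3.2, 2.6, 2.4, 2.8, 2.5]

print(f"mean valence = {set_mean(sad_set_valence):.2f}")
```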
Subjects had not seen the images prior to the experiment day. Each set of stimuli was shown once and in random order on an 11×8-inch computer monitor positioned 14 inches from the subject. Images were displayed individually for 6 seconds (108 seconds total per image set). Images were 7.75 inches wide and 7.5 inches high and subtended visual angles of 29° vertically and 28° horizontally. Display of images began 10 seconds prior to the arrival of the oxygen-15 ([15O]) water bolus in the brain, assessed individually for each subject.22
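The presentation schedule can be summarized as follows. This is an illustrative sketch only (not the actual presentation software), and the bolus arrival time used here is hypothetical; in the study it was assessed individually for each subject.

```python
# Illustrative timing sketch: onset times of the 18 images in one set, relative
# to the individually assessed arrival time of the [15O] water bolus in the brain.

N_IMAGES = 18
IMAGE_DURATION_S = 6.0          # each image shown for 6 seconds
LEAD_TIME_S = 10.0              # display begins 10 s before bolus arrival

def image_onsets(bolus_arrival_s):
    """Onset time of each image (seconds on the scanner clock)."""
    start = bolus_arrival_s - LEAD_TIME_S
    return [start + i * IMAGE_DURATION_S for i in range(N_IMAGES)]

onsets = image_onsets(bolus_arrival_s=35.0)   # hypothetical arrival time
print(f"first onset: {onsets[0]} s, last offset: {onsets[-1] + IMAGE_DURATION_S} s "
      f"({N_IMAGES * IMAGE_DURATION_S:.0f} s of stimulation)")
```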

PET Data Acquisition

We measured rCBF by using the bolus [15O] water method23 with a GE-4096 PLUS scanner. Fifteen slices (6.5 mm center to center) with an intrinsic in-plane resolution of 6.5 mm full width at half maximum and a 10-cm axial field of view were acquired. Images were reconstructed by using a Butterworth filter (cutoff frequency=0.35 Nyquist). Cerebral blood flow was determined by using [15O] water (50 mCi per injection) and methods previously described.24 For each injection, arterial blood was sampled from time=0 (injection) to 100 seconds. Imaging was initiated at injection and consisted of 20 frames at 5 seconds per frame for a total of 100 seconds. The parametric (i.e., blood flow) image was created by using a 40-second summed image (the initial 40 seconds immediately after bolus transit) and the measured arterial input function. A preliminary injection was employed to establish stimulus timing.22 Ratings were collected while subjects were lying in the scanner 60 to 90 seconds after the end of each individual PET scan/emotion stimulation.
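As a rough sketch of the frame handling described above, assuming the dynamic series is stored as a NumPy array of 20 frames of 5 seconds each; conversion of the summed counts to quantitative blood flow using the measured arterial input function (ref 23) is not reproduced here.

```python
# Hedged sketch: sum the 40 s of dynamic PET data immediately after bolus transit.
# The dynamic series is assumed to be a NumPy array of shape (20, z, y, x).
import numpy as np

FRAME_DURATION_S = 5.0

def summed_image(dynamic_frames, bolus_transit_frame):
    """Sum the 40 s of data (8 frames) immediately after bolus transit."""
    n_frames = int(40.0 / FRAME_DURATION_S)
    start = bolus_transit_frame
    return dynamic_frames[start:start + n_frames].sum(axis=0)

# Example with synthetic data: 20 frames of a 15-slice, 128x128 acquisition.
frames = np.random.poisson(lam=100.0, size=(20, 15, 128, 128)).astype(float)
static = summed_image(frames, bolus_transit_frame=4)   # hypothetical transit frame
print(static.shape)
```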

MR Image Acquisition and Processing

MR images consisted of contiguous coronal slices (1.5 mm thick) acquired on a 1.5-tesla GE Signa scanner. Technical parameters of the MRI acquisition were as follows: spoiled gradient-recalled acquisition sequence, flip angle=40 degrees, TE=5 ms, TR=24 ms, number of excitations=2. MR images were analyzed with locally developed software (BRAINS).25 All brains were realigned parallel to the anterior commissure/posterior commissure (AC-PC) line and the interhemispheric fissure to ensure comparability of head position across subjects. Alignment also placed the images in standard Talairach space.26 Images from multiple subjects were co-registered and resliced in three orthogonal planes to produce a three-dimensional data set that was used for visualization and analysis.
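A minimal sketch of the realignment step is shown below, assuming hypothetical AC/PC coordinates and a hypothetical interhemispheric-fissure orientation; the actual processing was performed with BRAINS, not with this code.

```python
# Hedged sketch (not the BRAINS implementation): find the rigid rotation that
# makes the AC-PC line parallel to the anterior-posterior axis and the
# interhemispheric fissure parallel to the midsagittal plane.
import numpy as np
from scipy.spatial.transform import Rotation

ac = np.array([0.0, 0.0, 0.0])                    # anterior commissure (mm, hypothetical)
pc = np.array([1.5, -26.0, -2.0])                 # posterior commissure (mm, hypothetical)
fissure_normal = np.array([0.999, 0.03, -0.02])   # midsagittal plane normal (hypothetical)

acpc_dir = (pc - ac) / np.linalg.norm(pc - ac)
fissure_normal = fissure_normal / np.linalg.norm(fissure_normal)

observed = np.vstack([acpc_dir, fissure_normal])
targets = np.array([[0.0, -1.0, 0.0],   # AC->PC should point straight posteriorly
                    [1.0, 0.0, 0.0]])   # fissure normal should point straight laterally

rotation, _ = Rotation.align_vectors(targets, observed)  # rotation.apply(observed) ~ targets
print(np.round(rotation.apply(acpc_dir), 3))             # ~ [0, -1, 0] after realignment
```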

PET Image Processing

Co-registration of each individual's PET and MR images utilized a two-stage process: an initial coarse fit based on surface matching of the MR and PET images, followed by a variance minimization program using the surface-fit data as input.27 Brain landmarks identified on MRI were used to place each co-registered image into standardized coordinate space. An 18-mm Hanning filter was applied to the PET images.25
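For illustration, smoothing with a separable Hanning kernel might look like the sketch below. The voxel dimensions are assumptions, since the reconstructed voxel size is not reported above, and the published filtering was performed within the laboratory's own pipeline rather than with this code.

```python
# Hedged sketch of applying an 18-mm Hanning smoothing filter to a PET volume.
import numpy as np
from scipy.ndimage import convolve1d

def hanning_kernel(width_mm, voxel_mm):
    """1-D Hanning window spanning approximately width_mm, normalized to unit sum."""
    n = max(3, int(round(width_mm / voxel_mm)) | 1)   # odd number of taps
    w = np.hanning(n + 2)[1:-1]                       # drop the zero end points
    return w / w.sum()

def smooth_volume(vol, width_mm=18.0, voxel_mm=(2.0, 2.0, 2.0)):
    """Apply the separable Hanning filter along each axis of a 3-D volume."""
    out = vol.astype(float)
    for axis, vox in enumerate(voxel_mm):
        out = convolve1d(out, hanning_kernel(width_mm, vox), axis=axis, mode="nearest")
    return out

vol = np.random.rand(60, 72, 60)          # synthetic volume for illustration
smoothed = smooth_volume(vol)
print(smoothed.shape)
```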

Statistical Analysis

Statistical analysis of the blood flow images utilized a modification of the Montreal method.28 Within-subject subtractions were performed between the sad condition and each of the neutral (faces and scenes) and happy (faces and scenes) conditions. The neutral faces and neutral scenes conditions were also contrasted with each other. This was followed by across-subject averaging of the subtraction images and computation of voxel-by-voxel t-tests of blood flow differences.
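A simplified sketch of this subtraction analysis (within-subject difference images, across-subject averaging, and a voxel-by-voxel t statistic) is given below. The published analysis additionally pools variance across voxels and corrects for multiple comparisons following Worsley et al.;28 the input volumes here are assumed to be co-registered, smoothed rCBF images stacked as NumPy arrays.

```python
# Simplified sketch of the subtraction analysis, assuming rCBF volumes of shape
# (n_subjects, z, y, x). Variance pooling and resel-based correction (ref 28)
# are not reproduced here.
import numpy as np

def voxelwise_t(condition_a, condition_b):
    """t-map for condition_a minus condition_b (paired, per voxel)."""
    diff = condition_a - condition_b                  # within-subject subtraction
    n = diff.shape[0]
    mean = diff.mean(axis=0)                          # across-subject average
    sd = diff.std(axis=0, ddof=1)
    return mean / (sd / np.sqrt(n))

# Synthetic example: 17 subjects, small volume.
rng = np.random.default_rng(0)
sad = rng.normal(50.0, 5.0, size=(17, 15, 32, 32))
neutral_scenes = rng.normal(50.0, 5.0, size=(17, 15, 32, 32))
t_map = voxelwise_t(sad, neutral_scenes)
print(t_map.shape)
```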
Subjective ratings were analyzed by using means and standard deviations and overall and paired F-tests. In the blood flow analyses, conditions are measured relative to one another, and regions of relatively increased flow are referred to as activations.
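For a paired contrast such as those reported in the Results (df=1,16 with 17 subjects), the F statistic equals the square of the paired t statistic. The sketch below uses hypothetical ratings, not the study data.

```python
# Hedged sketch of a paired comparison of subjective ratings: an F-test with
# df = 1, 16 on paired data equals the square of a paired t-test.
# The ratings below are hypothetical, not the data reported in Table 1.
from scipy import stats

sad_ratings     = [8, 7, 9, 6, 8, 7, 9, 8, 7, 6, 8, 9, 7, 8, 6, 7, 8]
neutral_ratings = [1, 0, 2, 1, 0, 1, 2, 1, 0, 1, 1, 0, 2, 1, 0, 1, 1]

t, p = stats.ttest_rel(sad_ratings, neutral_ratings)
print(f"F(1,16) = {t**2:.1f}, p = {p:.2g}")
```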

RESULTS

Behavioral Observations

Subjects' ratings of their feelings following stimulus presentation are shown in Table 1. Mean sadness ratings for the sad condition compared with both neutral conditions and both happy conditions were consistent with the intended emotion (F=39.8, df=4,13, P<0.0001). Sadness ratings reported by subjects after the sad condition were different from ratings reported after the neutral scenes (F=109.4, df=1,16, P<0.0001), neutral faces (F=118.7, df=1,16, P<0.0001), happy faces (F=168.2, df=1,16, P<0.0001), and happy scenes conditions (F=168.2, df=1,16, P<0.0001). Overall arousal intensities during the sad, neutral faces, and neutral scenes conditions were not statistically different (F=3.19, df=2,15, P>0.07). However, intensity of arousal after exposure to sad stimuli was greater in pairwise comparison with the neutral faces (F=5.36, df=1,16, P<0.05) and neutral scenes conditions (F=5.73, df=1,16, P<0.05). Arousal intensities during the sad, happy faces, and happy scenes conditions were not statistically different (F=1.9, df=2,15, P>0.18). Pairwise arousal comparisons between the sad and happy faces conditions and between the sad and happy scenes conditions were also not statistically different (sad vs. happy faces, F=4.05, df=1,16, P>0.06; sad vs. happy scenes, F=0.14, df=1,16, P>0.7).

Explanation of Tables 2–4

In Tables 2, 3, and 4, brain regions and Brodmann areas (in parentheses) with relatively increased rCBF after the subtraction are shown. The region names and Brodmann areas are based on the inspection of the co-registered MR and PET images, as well as the Talairach Atlas x, y, and z coordinates. The x, y, and z coordinates represent spatial coordinates with respect to a point located in a horizontal plane through the anterior and posterior commissures (z=0), at the midline of this brain slice (x=0), and at the anterior commissure (y=0). x is the distance in millimeters to the left (negative) and to the right (positive) of the midline. y is the distance in millimeters anterior (positive) or posterior (negative) to the anterior commissure. z is the distance in millimeters above (positive) or below (negative) a horizontal plane through the anterior and posterior commissures.
Tables also show the tmax (highest t-value identified in the peak) and the volume (in cc) of the peak exceeding the t=3.61 (df=3,872) threshold. This threshold, which has been consistently used by our center, corresponds to an uncorrected significance level of <0.0005 per voxel. Regions of significant activation were identified on the t-map images and corrected for the large number of t-tests performed, the lack of independence between voxels, and the resolution of the processed images.28 There were about 300,000 gray matter voxels in our images, representing approximately 242 resolution elements.28 After filtering, the 3-D image resolution is 2.5 cc. The degrees of freedom were extremely large for the t-tests (3,872 = number of resels × [number of subjects − 1]). Unless otherwise specified, only areas that exceeded 50 contiguous voxels were included in the tables, in order to omit isolated outlying values.
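A simplified sketch of this peak-reporting rule (threshold at t=3.61, retain only contiguous clusters of at least 50 voxels) is shown below; the resel-based correction of Worsley et al.28 is not reproduced, and the synthetic t-map is for illustration only.

```python
# Hedged sketch: threshold a t-map at t = 3.61 and keep contiguous clusters of
# at least 50 voxels, reporting cluster size and tmax for each surviving peak.
import numpy as np
from scipy.ndimage import label

def significant_clusters(t_map, t_threshold=3.61, min_voxels=50):
    """Return a list of (cluster_size, t_max) for suprathreshold clusters."""
    mask = t_map > t_threshold
    labels, n_clusters = label(mask)                  # face-connected (6-connectivity) by default
    clusters = []
    for k in range(1, n_clusters + 1):
        in_cluster = labels == k
        size = int(in_cluster.sum())
        if size >= min_voxels:
            clusters.append((size, float(t_map[in_cluster].max())))
    return clusters

# Usage with a synthetic t-map:
rng = np.random.default_rng(1)
t_map = rng.normal(0.0, 1.0, size=(15, 64, 64))
print(significant_clusters(t_map))                    # likely empty for pure noise
```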

Sad Condition vs. Neutral Scenes Condition

Compared with the neutral scenes condition, induced sadness produced rCBF increases in a sector of the ventral prefrontal cortex (Figure 1A,B and Table 2). The fusiform gyrus (anterior on the right and more posterior on the left), the right inferior temporal gyrus, the left middle and superior temporal cortices and the cortex of the superior temporal sulcus were also more active. Increased rCBF during processing of emotionally neutral scenes was found in the left hippocampus and parahippocampal gyrus and in the primary and secondary visual cortex. The visual cortex included the calcarine fissure, the cuneus and the superior occipital gyrus, and the posterior aspect of the lingual gyrus (Table 2).

Sad Condition vs. Neutral Faces Condition

This comparison showed increased rCBF during induced sadness in cerebellar nuclei and primary visual cortex. A small sector of the left posterior dorsal thalamus was also found to be relatively more active. At the customary t-value used in our laboratory (t=3.61), this area of increased blood flow comprised 19 voxels. Because thalamic activation was found in our previous study of emotion induction in elderly persons during another negative emotion (i.e., disgust), we performed an analysis at a more liberal t-value of 3.00. This analysis yielded an area of increased thalamic activity 171 voxels in extent (Figure 1C,D). Increased rCBF during processing of the emotionally neutral facial stimuli (i.e., sad vs. neutral faces) was found in the left amygdala, in the right and left hippocampus, and in the secondary visual cortex (Table 2). The activated extrastriate cortex included the right lingual gyrus and the right middle occipital gyrus.
Some of the above findings might have been the result of differential arousal. In order to control for overall arousal and to verify the specificity of the findings to sadness rather than nonspecific emotionality, the sad condition was compared with the happiness-laden stimuli. Results are described below.

Sad Condition vs. Happy Scenes and Happy Faces Conditions

Compared with the happy scenes condition, sadness was associated with rCBF increases in the ventral medial prefrontal and left posterior ventral prefrontal cortex, the right inferior temporal cortex, and the left superior temporal cortex. Compared with the happy faces condition, sadness was associated with rCBF increases in the thalamus, the right ventral medial temporal cortex (lingual gyrus), and the left superior temporal cortex (Table 3).

Neutral Faces vs. Neutral Scenes Comparison

This comparison between the neutral faces and neutral scenes conditions was performed to tease apart regions related to processing of the human face. Observing nonfamiliar human faces activated the anterior portion of the right temporal fusiform gyrus and a more posterior portion in the left fusiform gyrus, in a pattern essentially identical to that found in the comparison of the sad condition with the two scenes conditions (Figure 2). The left amygdalar complex showed increased flow as well (Figure 2A). A large area of relatively increased blood flow in primary and secondary visual cortex was associated with observing emotionally neutral objects and scenes. This large area of activation was very similar to that observed in the neutral scenes condition contrasted with the sad condition and included the calcarine fissure, the cuneus and the superior occipital gyrus, and the lingual gyrus. The area of increased activity in the lingual gyrus extended anteriorly and bilaterally (Table 4).

DISCUSSION

This study examined the functional neuroanatomy associated with a state of sadness induced through visual stimuli in healthy elderly individuals. There were two major findings in this study. First, a visually induced state of sadness was associated with increased activity in the ventral medial prefrontal cortex and in the thalamus. These activations were not dependent on reported overall arousal and were specific for the sad affect. Second, limbic nodes associated with induced sadness differed depending on the content (faces or “scenes”) of the comparison condition regardless of whether this content was neutral or happy. Structures of the ventral medial prefrontal cortex were associated with sad affect when the sad condition was contrasted with both neutral and happiness-laden scenes conditions. On the other hand, sad affect was associated with increased thalamic activity when contrasted with both happy and neutral faces. These data show that variations in the results of functional imaging studies of emotion may be moderated by the specific content of the control stimuli, raising the question of the reliability of imagery-based emotion induction studies. As an alternative, recalling past episodes of sad events is a powerful way to induce sad affect. However, a control condition balanced for imagery of human faces and objects may be difficult to achieve using methods based on personal memories.
It is clear that in the present study, the regions related to sad emotion differed depending on whether the control condition contained faces or objects/scenes. What is the neurobiological basis for this phenomenon? Brain systems associated with processing of emotions share brain regions with systems processing human faces.29 When contrasted with neutral and happiness-laden conditions containing either exclusively scenes or exclusively human faces, the neural activity associated with processing the faces or scenes components of the sad condition was subtracted out. For instance, medial portions of the ventral prefrontal cortex were engaged during sad affect in comparison with scenes conditions but not in comparison with faces conditions. To further examine this issue, we analyzed the neutral faces versus neutral scenes comparison at a significance threshold of t=2.20 and found increased activity in the ventral prefrontal cortex during processing of neutral faces (Figure 2A,B). However, only the comparison of the sad condition with the neutral and happiness-laden scenes conditions yielded a large enough change in activity in the ventral medial prefrontal cortex to be detected at our laboratory's customary conservative t-value of 3.61. The activity generated in the ventral prefrontal cortex during processing of either neutral or happy faces30 explains the failure to show differential ventral prefrontal activity during the sad condition when compared with the faces stimuli.
Hence, the ventral prefrontal cortex may respond to both emotionally neutral faces and sadness-charged faces, but the larger increase in activity may be achieved as a result of processing predominantly face-delivered sad stimuli. The ventral portion of the human prefrontal cortex may function as a convergence zone for stimuli carrying social significance31 (e.g., conspecific faces30) and for emotional introspection.3,32 Studies of patients with ventral prefrontal damage and socially inappropriate behaviors are consistent with this observation. Individuals with ventral prefrontal lesions show impaired recognition of faces expressing emotion and an inability to evaluate subjective emotional states.33,34
The neuronal circuitry delineated in the present study is consistent with the temporal-thalamic-ventral prefrontal cortex connectivity postulated to be the anatomical basis of stimulus-reinforcement association and extinction in the visual domain.33 The ventral prefrontal region shares connections with the pivotal limbic nodes of the thalamus. Functional imaging studies have shown that the thalamus is one of the limbic regions associated with sad mood induced by visual stimuli and recall of a personal event.5,35 Similar results were obtained studying emotional responses to negative visual stimuli in the elderly using PET.13 The present study is consistent with a role for the thalamus in the experiential aspects of emotion.6 Further studies should aim at understanding the role of individual thalamic nuclei in different emotion generation modalities and valence.
Several regions in striate and extrastriate visual cortex were associated with sad affect in the present study. The inferior temporal cortex and the superior temporal sulcus send direct and indirect (via thalamus) efferents to the ventral prefrontal cortex.33 These fibers carry visual information. Recent studies have shown that extrastriate cortex is more extensively activated by emotional36 than by neutral facial expressions,9 suggesting its participation in emotion processing.3,37 In the present study, differential extrastriate cortex activity was found in the sad/happy contrasts, making a relationship to arousal unlikely.
As a footnote to that observation, the role of arousal in the present study and in functional neuroimaging studies of emotion activation is an important one to discuss. Whereas in this study subjects' reported arousal during the happy scenes condition and during sadness almost overlapped, during the happy faces condition arousal was one unit larger (on a 0–10 scale) than during sadness. This difference did not reach statistical significance, and one should question whether it has any physiological importance. From a broader perspective, having arousal values that are fully equal across emotions may be less crucial if we consider that functional neuroimaging of emotion usually operates under the assumption that arousal is an independent factor varying linearly with valence. The corollaries of this assumption of linearity are that arousal can be fully controlled for through experimental design and that arousal components attached to different emotions (e.g., anger and disgust), albeit of the same values, have overlapping neural functional correlates. This assumption and its corollaries may create a false perception of precision.
Consistent with the results of other negative-emotion induction studies,13 sadness in elderly individuals did not show increased rCBF in the amygdala. Changes in amygdala activity are usually reported during externally induced negative emotions in younger people.14 Single-cell recordings29 and human neuroimaging studies38 have demonstrated increased amygdalar activity in association with face processing. Hence it is plausible that here the differential activity in the amygdala is related to stimulus content (i.e., human faces) rather than affect. Hippocampal and parahippocampal activity found in association with both neutral conditions compared with the emotional condition is perhaps explained by the encoding of new perceptual information (in the case of the parahippocampal gyrus related to scenes) at the time of PET imaging.39 Midline cerebellar activity during emotion in elderly subjects was present in both our previous13 and present studies, which required no movement for task completion. Increased cerebellar activity has been reported in emotion studies involving the visual domain.4,40 Unraveling the specific role of the cerebellum in emotion requires further research.
Whereas the intended emotion of sadness was reliably induced, it should be noted that some subjects also reported other emotions in response to the sad stimuli. As can be observed in Table 1, fear and disgust were reported and are among the emotions that may have a bearing on the interpretation of the data. This phenomenon may be due to the nature of our stimuli. However, we gather from our clinical and research experience that human emotions are very often not univocal. If emotional states are carefully explored and assessed, one is likely to detect the complexity of human affects. Precision may be just an artifact introduced by the observer at the expense of the complexity of human experience.
In summary, sad affect elicited through complex visual stimuli was associated with both ventral prefrontal cortex and thalamic activity, with the region activated depending on the content rather than the affect of the comparison condition. This finding suggests that the ventral medial prefrontal cortex is an important convergence zone for responses to faces and to sad affect. Although the thalamus was not part of our original a priori hypothesis, many studies published since our study was designed have found thalamic activation during negative affect in samples of younger subjects.

ACKNOWLEDGMENTS

The authors thank Alberto Abreu, Teresa Kopel, Stephanie Rosazza, and Gene Zeien for their assistance with this research. This work was supported in part by National Institute of Mental Health (NIMH) Research Scientist Award MH00163 to Dr. Robinson and by NIMH Grant MH52879.
FIGURE 1. Sagittal (top) and coronal views. Green crosshairs show the location of the slice. Coronal views show location as if facing patient. PET data, showing regions that are significantly activated (i.e., all contiguous voxels exceeding the predefined threshold for statistical significance, t=3.61) are superimposed on a composite MR image derived by averaging the MR scans from the subjects. The value of t is shown on the color bar. Panels A and B display the subgenual prefrontal cortex activation (yellow/red) in the sad condition versus neutral scenes comparison. Panels C and D show left thalamic (yellow/red, green crosshair) and cerebellar activation (C) in the sad condition versus neutral faces comparison.
FIGURE 2. Transaxial (top) and sagittal views. Green crosshairs show the location of the slice. Images follow radiological convention. PET data, showing regions that are significantly activated, are superimposed on a composite MR image derived by averaging the MR scans from all the subjects. The value of t is shown on the color bar. The right aspect of Figure 2 (C,D) displays the ventral medial prefrontal cortex activation (green crosshair points to Talairach coordinates x=2, y=22, z=–12) in the sad condition compared with the neutral scenes condition, showing all contiguous voxels that exceed the predefined threshold for statistical significance (t=3.61; compare with the sagittal and coronal planes in Figure 1A,B). The left aspect of Figure 2 (A,B) demonstrates the ventral medial prefrontal cortex activation during processing of neutral faces compared with neutral scenes, detected by using a statistical significance threshold of t=2.20 (total voxels=327) at Talairach coordinates x=2, y=22, z=–12. Left amygdala activity is also shown.
TABLE 1. Emotion ratings
TABLE 2. Areas of increased activity during induced sadness and processing of neutral faces or neutral scenes
TABLE 3. Areas of increased activity during induced sadness compared with happiness
TABLE 4. Areas of increased activity during processing of neutral human faces and neutral scenes

References

1. Irwin W, Davidson RJ, Lowe MJ, et al: Human amygdala activation detected with echo-planar functional magnetic resonance imaging. Neuroreport 1996; 7:1765-1769
2. Morris JS, Frith CD, Perrett DI, et al: A differential neural response in the human amygdala to fearful and happy facial expressions. Nature 1996; 383:812-815
3. Lane RD, Fink GR, Chau PM-L, et al: Neural activation during selective attention to subjective emotional responses. Neuroreport 1997; 8:3969-3972
4. Paradiso S, Johnson DL, Andreasen NC, et al: Cerebral blood flow changes associated with attribution of emotional valence to pleasant, unpleasant and neutral visual stimuli in a PET study of normal subjects. Am J Psychiatry 1999; 156:1618-1629
5. George MS, Ketter TA, Parekh PI, et al: Brain activity during transient sadness and happiness in healthy women. Am J Psychiatry 1995; 152:341-351
6. Reiman EM, Lane RD, Ahern GL, et al: Neuroanatomical correlates of externally and internally generated human emotion. Am J Psychiatry 1997; 154:918-925
7. Teasdale JD, Howard RJ, Cox SG, et al: Functional MRI study of the cognitive generation of affect. Am J Psychiatry 1999; 156:209-215
8. Liotti M, Mayberg HS, Brannan SK, et al: Differential limbic-cortical correlates of sadness and anxiety in healthy subjects: implications for affective disorders. Biol Psychiatry 2000; 48:30-42
9. Dolan RJ, Fletcher P, Morris J, et al: Neural activation during covert processing of positive emotional facial expressions. Neuroimage 1996; 4:194-200
10. Lang PJ, Bradley MM, Fitzsimmons JR, et al: Emotional arousal and activation of the visual cortex: an fMRI analysis. Psychophysiology 1998; 35:199-210
11. Breiter HC, Etcoff NL, Whalen PJ, et al: Response and habituation of the human amygdala during visual processing of facial expression. Neuron 1996; 17:875-877
12. Zald DH, Pardo JV: Emotion, olfaction, and the human amygdala: amygdala activation during aversive olfactory stimulation. Proc Natl Acad Sci USA 1997; 94:4119-4124
13. Paradiso S, Robinson RG, Andreasen NC, et al: Emotional activation of limbic circuitry in elderly normal subjects in a PET study. Am J Psychiatry 1997; 154:384-389
14. Powers RE: Neurobiology of aging, in Textbook of Geriatric Neuropsychiatry. Edited by Coffey CE, Cummings JL. Washington, DC, American Psychiatric Press, 1994, pp 35-70
15. Drevets WC, Price JL, Simpson JR, et al: Subgenual prefrontal cortex abnormalities in mood disorders. Nature 1997; 386:824-827
16. Mayberg HS, Liotti M, Brannan SK, et al: Reciprocal limbic-cortical function and negative mood: converging PET findings in depression and normal sadness. Am J Psychiatry 1999; 156:675-682
17. Goodwin FK, Jamison KR: Manic-Depressive Illness. New York, Oxford University Press, 1990
18. First MB, Spitzer RL, Gibbon M, et al: Structured Clinical Interview for DSM-IV Axis I Disorders (SCID), Clinician Version: Administration Booklet. Washington, DC, American Psychiatric Press, 1997
19. Wing J, Cooper E, Sartorius N (eds): Measurement and Classification of Psychiatric Symptoms. Cambridge, UK, Cambridge University Press, 1974
20. Benton A, Sivan A, Hamsher KdeS, et al: Contributions to Neuropsychological Assessment, 2nd ed. New York, Oxford University Press, 1994
21. Lang PJ, Bradley MM, Cuthbert BN: International Affective Picture System (IAPS): Technical Manual and Affective Ratings. Gainesville, FL, Center for Research in Psychophysiology, University of Florida, 1995
22. Hurtig RR, Hichwa RD, O'Leary DS, et al: The effects of timing and duration of cognitive activation in [15O] water PET studies. J Cereb Blood Flow Metab 1994; 14:423-430
23. Herscovitch P, Markham J, Raichle ME: Brain blood flow measured with intravenous H215O, I: theory and error analysis. J Nucl Med 1983; 24:782-789
24. Hichwa RD, Ponto LLB, Watkins GL: Clinical blood flow measurements with [15O] water and positron emission tomography, in Chemists' Views of Imaging Centers. Edited by Emran AM. New York, Plenum, 1995, pp 401-417
25. Arndt S, Cizadlo T, Andreasen NC, et al: Tests for comparing images based on randomization and permutation methods. J Cereb Blood Flow Metab 1996; 16:1271-1279
26. Talairach J, Tournoux P: Co-planar Stereotaxic Atlas of the Human Brain. Stuttgart, Thieme, 1988
27. Woods R, Mazziotta J, Cherry S: MRI-PET registration with automated algorithm. J Comput Assist Tomogr 1993; 17:536-546
28. Worsley K, Evans S, Marrett S, et al: A three-dimensional statistical analysis for CBF activation studies in human brain. J Cereb Blood Flow Metab 1992; 12:900-918
29. Rolls ET: Neurons in the cortex of the temporal lobe and in the amygdala of the monkey with responses selective for faces. Hum Neurobiol 1984; 3:209-222
30. Wilson FA, Scalaidhe SP, Goldman-Rakic PS: Dissociation of object and spatial processing domains in primate prefrontal cortex. Science 1993; 260:1955-1958
31. Damasio AR: On some functions of the human prefrontal cortex, in Structure and Functions of the Human Prefrontal Cortex. Edited by Grafman J, Holyoak KJ, Boller F. Ann NY Acad Sci 1995; 769:241-251
32. Paradiso S, Chemerinski E, Yazici K, et al: The frontal lobe syndrome reassessed: comparison of patients with lateral or medial frontal brain damage. J Neurol Neurosurg Psychiatry 1999; 67:664-667
33. Rolls ET: The orbital frontal cortex. Phil Trans R Soc Lond B 1996; 351:1433-1444
34. Hornak J, Rolls ET, Wade D: Face and voice expression identification in patients with emotional and behavioural changes following ventral frontal lobe damage. Neuropsychologia 1996; 34:247-261
35. Damasio AR, Grabowski TJ, Bechara A, et al: Subcortical and cortical brain activity during the feeling of self-generated emotions. Nat Neurosci 2000; 3:1049-1056
36. Puce A, Allison T, Gore JC, et al: Face-sensitive regions in human extrastriate cortex studied by functional MRI. J Neurophysiol 1995; 74:1192-1199
37. Lang PJ, Bradley MM, Fitzsimmons JR, et al: Emotional arousal and activation of the visual cortex: an fMRI analysis. Psychophysiology 1998; 35:199-210
38. Kawashima R, Sugiura M, Kato T, et al: The human amygdala plays an important role in gaze monitoring: a PET study. Brain 1999; 122:779-783
39. Epstein R, Harris A, Stanley D, et al: The parahippocampal place area: recognition, navigation, or encoding? Neuron 1999; 23:115-125
40. George MS, Ketter TA, Gill DS, et al: Brain regions involved in recognizing facial emotion or identity: an oxygen-15 PET study. J Neuropsychiatry Clin Neurosci 1993; 5:384-394

AUTHOR INFORMATION

Sergio Paradiso, M.D., Ph.D., Robert G. Robinson, M.D., Laura L. Boles Ponto, Ph.D., G. Leonard Watkins, Ph.D., and Richard D. Hichwa, Ph.D.

Received March 16, 2001; revised October 15, 2001; accepted October 22, 2001. From the Department of Psychiatry, Department of Radiology, and PET Center, University of Iowa College of Medicine, Iowa City, IA. Address correspondence to Dr. Paradiso, The University of Iowa College of Medicine, Psychiatry Research–MEB, Iowa City, IA 52242-1000. E-mail: [email protected]

Published in The Journal of Neuropsychiatry and Clinical Neurosciences, February 2003, pages 35–44. Published online 1 February 2003. PubMed: 12556569.
