Addiction is defined as compulsive drug use despite negative consequences. The goals of the addicted person become narrowed to obtaining, using, and recovering from drugs, despite failure in life roles, medical illness, risk of incarceration, and other problems. An important characteristic of addiction is its stubborn persistence (1, 2). Although some individuals can stop compulsive use of tobacco, alcohol, or illegal drugs on their own, for a large number of individuals rendered vulnerable by both genetic and nongenetic factors (3–5), addiction proves to be a recalcitrant, chronic, and relapsing condition (2). The central problem in the treatment of addiction is that even after prolonged drug-free periods, well after the last withdrawal symptom has receded, the risk of relapse, often precipitated by drug-associated cues, remains very high (6, 7). Were this not the case, treatment could simply consist of locking addicted people away in a protective environment until withdrawal symptoms were comfortably behind them, issuing a stern warning about future behavior, and having done with it.
Memory disorders are often thought of as conditions involving memory loss, but what if the brain remembers too much or too powerfully records pathological associations? During the last decade, advances in understanding the role of dopamine in reward-related learning (8) have made a compelling case for a “pathological learning” model of addiction that is consistent with long-standing observations about the behavior of addicted people (6). This work, along with more recent computational analyses of dopamine action (9, 10), has suggested mechanisms by which drugs and drug-associated stimuli might attain their motivational power. At the same time, cellular and molecular investigations have revealed similarities between the actions of addictive drugs and normal forms of learning and memory (11–14), with the caveat that our current knowledge of how memory is encoded (15) and how it persists (15, 16) is far from complete for any mammalian memory system. Here I argue that addiction represents a pathological usurpation of the neural mechanisms of learning and memory that under normal circumstances serve to shape survival behaviors related to the pursuit of rewards and the cues that predict them (11, 17–20).
A Hijacking of Neural Systems Related to the Pursuit of Rewards
Individual and species survival demand that organisms find and obtain needed resources (e.g., food and shelter) and opportunities for mating despite costs and risks. Such survival-relevant natural goals act as “rewards,” i.e., they are pursued with the anticipation that their consumption (or consummation) will produce desired outcomes (i.e., will “make things better”). Behaviors with rewarding goals tend to persist strongly to a conclusion and increase over time (i.e., they are positively reinforcing) (21). Internal motivational states, such as hunger, thirst, and sexual arousal, increase the incentive value of goal-related cues and of the goal objects themselves and also increase the pleasure of consumption (e.g., food tastes better when one is hungry) (22). External cues related to rewards (incentive stimuli), such as the sight or odor of food or the odor of an estrous female, can initiate or strengthen motivational states, increasing the likelihood that complex and often difficult behavioral sequences, such as foraging or hunting for food, will be brought to a successful conclusion, even in the face of obstacles. The behavioral sequences involved in obtaining desired rewards (e.g., sequences involved in hunting or foraging) become overlearned. As a result, complex action sequences can be performed smoothly and efficiently, much as an athlete learns routines to the point that they are automatic but still flexible enough to respond to many contingencies. Such prepotent, automatized behavioral repertoires can also be activated by cues predictive of reward (19, 23).
Addictive drugs elicit patterns of behavior reminiscent of those elicited by natural rewards, although the patterns of behavior associated with drugs are distinguished by their power to supplant almost all other goals. Like natural rewards, drugs are sought in anticipation of positive outcomes (notwithstanding the harmful reality), but as individuals fall deeper into addiction, drug seeking takes on such power that it can motivate parents to neglect children, previously law-abiding individuals to commit crimes, and individuals with painful alcohol- or tobacco-related illnesses to keep drinking and smoking (24). With repetitive drug taking come homeostatic adaptations that produce dependence, which in the case of alcohol and opioids can lead to distressing withdrawal syndromes with drug cessation. Withdrawal, especially its affective component, can be considered to constitute a motivational state (25) and can thus be analogized to hunger or thirst. Although avoidance or termination of withdrawal symptoms increases the incentive to obtain drugs (26), dependence and withdrawal do not explain addiction (7, 19). In animal models, reinstatement of drug self-administration after drug cessation is more potently motivated by reexposure to the drug than by withdrawal (27). Perhaps more significantly, dependence and withdrawal cannot explain the characteristic persistence of relapse risk long after detoxification (6, 7, 19).
Relapse after detoxification is often precipitated by cues, such as people, places, paraphernalia, or bodily feelings associated with prior drug use (6, 7) and also by stress (28). Stress and stress hormones such as cortisol have physiological effects on reward pathways, but it is interesting to note that stress shares with addictive drugs the ability to trigger the release of dopamine (28) and to increase the strength of excitatory synapses on dopamine neurons in the ventral tegmental area (29). Cues activate drug wanting (11, 30), drug seeking (19, 31), and drug consumption. The drug-seeking/foraging repertoires activated by drug-associated cues must be flexible enough to succeed in the real world, but at the same time, they must have a significantly overlearned and automatic quality if they are to be efficient (19, 23, 31). Indeed, the cue-dependent activation of automatized drug seeking has been hypothesized to play a major role in relapse (18, 19, 23).
Subjective drug craving is the conscious representation of drug wanting; subjective urges may only be attended to or strongly experienced if drugs are not readily available or if the addicted person is making efforts to limit use (19, 23, 31). It is an open question whether subjective drug craving, as opposed to stimulus-bound, largely automatic processes, plays a central causal role in drug seeking and drug taking (32). Indeed, individuals may seek and self-administer drugs even while consciously resolving never to do so again.
In laboratory settings, drug administration (33, 34) and drug-associated cues (35–37) have been shown to produce drug urges and physiological responses such as activation of the sympathetic nervous system. Although a full consensus has yet to emerge, functional neuroimaging studies have generally reported activations in response to drug cues in the amygdala, anterior cingulate, orbital prefrontal and dorsolateral prefrontal cortex, and nucleus accumbens.
The Dopamine Hypothesis
A large body of work, including pharmacological, lesion, transgenic, and microdialysis studies, has established that the rewarding properties of addictive drugs depend on their ability to increase dopamine in synapses made by midbrain ventral tegmental area neurons on the nucleus accumbens (38–40), which occupies the ventral striatum, especially within the nucleus accumbens shell region (41). Ventral tegmental area dopamine projections to other forebrain areas such as the prefrontal cortex and amygdala also play a critical role in shaping drug-taking behaviors (42).
Addictive drugs represent diverse chemical families, stimulate or block different initial molecular targets, and have many unrelated actions outside the ventral tegmental area/nucleus accumbens circuit, but through different mechanisms (e.g., see references 43, 44), they all ultimately increase synaptic dopamine within the nucleus accumbens. Despite its central role, dopamine is not the whole story for all addictive drugs, especially opioids. In addition to causing dopamine release, opioids may act directly in the nucleus accumbens to produce reward, and norepinephrine may play a role in the rewarding effects of opioids as well (45).
Recent work at the behavioral, physiological, computational, and molecular levels has begun to elucidate mechanisms by which dopamine’s action in the nucleus accumbens, prefrontal cortex, and other forebrain structures might elevate the incentives for drug taking to the point at which control over drug taking is lost. Two important caveats in reviewing this research are that it is always treacherous to extend what we learn from normal laboratory animals to complex human situations such as addiction and that no animal model of addiction fully reproduces the human syndrome. That said, the last several years have brought important progress in investigating the pathogenesis of addiction.
Dopamine Action: The Reward Prediction-Error Hypothesis
The dopamine projections from the ventral tegmental area to the nucleus accumbens are the key component of the brain reward circuitry. This circuitry provides a common currency for the valuation of diverse rewards by the brain (21, 46). Within the ventral tegmental area/nucleus accumbens circuit, dopamine is required for natural stimuli, such as food and opportunities for mating, to be rewarding; similarly, dopamine is required for addictive drugs to produce reward (22, 39, 40, 47). The most obvious difference between natural goal objects, such as food, and addictive drugs is that the latter have no intrinsic ability to serve a biological need. However, because both addictive drugs and natural rewards release dopamine in the nucleus accumbens and other forebrain structures, addictive drugs mimic the effects of natural rewards and can thus shape behavior (9, 22, 23). Indeed, it has been hypothesized that addictive drugs have a competitive advantage over most natural stimuli in that they can produce far greater levels of dopamine release and more prolonged stimulation.
What information is encoded by dopamine release? An early view of dopamine function was that it acted as a hedonic signal (signaling pleasure), but this view has been called into question by pharmacological blockade, lesion (48), and genetic studies (49) in which animals continued to prefer (“like”) rewards such as sucrose despite dopamine depletion. Moreover, the actions of nicotine have always remained a mystery on this account, because nicotine is highly addictive and causes dopamine release but produces little if any euphoria.
Instead of acting as a hedonic signal, dopamine appears to promote reward-related learning, binding the hedonic properties of a goal to desire and to action, thus shaping subsequent reward-related behavior (48). In an important series of experiments involving recordings from alert monkeys, Schultz and colleagues (8, 50–52) investigated the circumstances under which midbrain dopamine neurons fire in relation to rewards. These experiments provided important general information about dopamine inputs but not about the different actions of dopamine on the nucleus accumbens, dorsal striatum, amygdala, and prefrontal cortex. Schultz et al. made recordings from dopamine neurons while monkeys anticipated or consumed sweet juice, a rewarding stimulus. Monkeys were trained to expect the juice after a fixed time following a visual or auditory cue. What emerged was a changing pattern of firing of dopamine neurons as the monkeys learned the circumstances under which rewards occur. In awake monkeys, dopamine neurons exhibit a relatively consistent basal (tonic) pattern of firing; superimposed on this basal pattern are brief phasic bursts of spike activity, the timing of which is determined by the prior experience of the animal with rewards. Specifically, an unexpected reward (delivery of juice) produces a transient increase in firing, but as the monkey learns that certain signals (a tone or light) predict this reward, the timing of this phasic activity changes. The dopamine neurons no longer exhibit a phasic burst in response to delivery of the juice; instead, the burst occurs earlier, in response to the predictive stimulus. If a stimulus is presented that is normally associated with a reward but the reward is withheld, there is a pause in the tonic firing of dopamine neurons at the time that the reward would have been expected. In contrast, if a reward comes at an unexpected time or exceeds expectation, a phasic burst in firing is observed. It has been hypothesized that these phasic bursts and pauses encode a prediction-error signal. Tonic activity signals no deviation from expectation, but phasic bursts signal a positive reward prediction error (better than expected), based on the summed history of reward delivery, and pauses signal a negative prediction error (worse than expected) (9, 53). Although consistent with many other observations, the findings of these demanding experiments have not been fully replicated in other laboratories, nor have comparable experiments been performed with drug rewards; thus, their application to addictive drugs remains heuristic. It is important to note that this work would predict an additional advantage for drugs over natural rewards. Because drugs increase dopamine directly and pharmacologically, their ability to raise dopamine levels upon consumption would not decay over time as the reward became predicted. Thus, the brain would repeatedly get the signal that drugs are “better than expected.”
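For readers who want the prediction-error idea in compact form, the computational models cited in this paragraph typically formalize it as a temporal-difference error; the expression below is the standard textbook form, offered only as an illustrative shorthand rather than a formula taken from those specific papers:

    \delta_t = r_t + \gamma \hat{V}(s_{t+1}) - \hat{V}(s_t)

Here r_t is the reward delivered at time t, \hat{V} is the animal’s learned estimate of the value of the current and upcoming states, and \gamma is a discount factor. On this reading, a phasic burst corresponds to a positive error (better than expected), a pause to a negative error (worse than expected), and undisturbed tonic firing to an error near zero.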
Berridge and Robinson (48) showed that dopamine is not required for the pleasurable (hedonic) properties of sucrose, which, in their investigation, continued to be “liked” by rats depleted of dopamine. Instead they have proposed that nucleus accumbens dopamine transmission mediates the assignment of “incentive salience” to rewards and reward-related cues, such that these cues can subsequently trigger a state of “wanting” for the goal object as distinct from “liking.” In their view, an animal can still “like” something in the absence of dopamine transmission, but the animal cannot use this information to motivate the behaviors necessary to obtain it. Overall, it can be concluded that dopamine release is not the internal representation of an object’s hedonic properties; the experiments by Schultz et al. suggest instead that dopamine serves as a prediction-error signal that shapes behavior to most efficiently obtain rewards.
This view of dopamine function is consistent with computational models of reinforcement learning (9, 53, 54). Reinforcement learning models are based on the hypothesis that the goal of an organism is to learn to act in such a way as to maximize future rewards. When such models are applied to the physiological data described earlier, pauses and phasic spiking of dopamine neurons can be conceptualized as the internal representation of reward prediction errors by which the planned or actual actions of the monkey (“agent”) are “criticized” by reinforcement signals (i.e., rewards that turn out to be better, worse, or as predicted). Dopamine release can thus shape stimulus-reward learning to improve prediction while it also shapes stimulus-action learning, i.e., the behavioral response to reward-related stimuli (8, 9). Given the likelihood that addictive drugs exceed natural stimuli in the reliability, quantity, and persistence of increased synaptic dopamine levels, a predicted consequence of these hypotheses would be profound overlearning of the motivational significance of cues that predict the delivery of drugs. At the same time, much remains unclear. For example, in the monkeys studied by Schultz and colleagues, brief bursts and pauses in the firing of dopamine neurons served as a prediction-error signal. However, drugs such as amphetamine may act for many hours and would thus disrupt all normal patterns of dopamine release, both tonic and phasic, to produce a grossly abnormal dopamine signal. The effects of drug-related dopamine kinetics on reward-related behavior are only beginning to be studied (55).
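To make the overlearning argument concrete, the sketch below is a minimal temporal-difference simulation in the spirit of the computational analyses cited above; it is an illustration rather than a reproduction of any published model, and the parameter names and values are assumptions. A natural reward eventually becomes fully predicted, so its teaching signal fades; a drug is modeled as placing a pharmacological floor under the dopamine teaching signal, so the values assigned to the drug and its predictive cue keep climbing.

    # Hypothetical temporal-difference (TD) sketch of reward learning with and
    # without a drug-induced dopamine surge. Not a published model; parameter
    # names and values are illustrative assumptions.

    ALPHA = 0.1     # learning rate
    GAMMA = 1.0     # no discounting over the short cue -> outcome interval
    TRIALS = 200

    def run(drug_surge=0.0):
        """Return (cue value, outcome value) after repeated cue -> outcome pairings.

        drug_surge models the direct pharmacological dopamine release at
        consumption, which cannot be "predicted away"; it is 0 for a natural reward.
        """
        v_cue, v_outcome = 0.0, 0.0
        reward = 1.0
        for _ in range(TRIALS):
            # Prediction error at the cue: the cue comes to predict the outcome.
            delta_cue = GAMMA * v_outcome - v_cue
            v_cue += ALPHA * delta_cue

            # Prediction error at consumption. For a natural reward this shrinks
            # toward zero as v_outcome approaches the reward; the drug keeps the
            # teaching signal positive no matter how well the outcome is predicted.
            delta_outcome = reward - v_outcome
            if drug_surge > 0.0:
                delta_outcome = max(delta_outcome, drug_surge)
            v_outcome += ALPHA * delta_outcome
        return v_cue, v_outcome

    print("natural reward (cue value, outcome value):", run(drug_surge=0.0))
    print("drug reward    (cue value, outcome value):", run(drug_surge=0.5))

On this toy account, the value attached to the drug and its predictive cue grows rather than saturating, which is one way of cashing out the claim that the brain repeatedly receives a “better than expected” signal.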
A Role for the Prefrontal Cortex
Under normal circumstances, organisms value many goals, making it necessary to select among them. A significant aspect of addiction is the pathological narrowing of goal selection to those that are drug related. The representation of goals, assignment of value to them, and selection of actions based on the resulting valuation depend on the prefrontal cortex (56–59). Successful completion of goal-directed behavior, whether foraging (or in modern times, shopping) for food or foraging for heroin, requires a complex and extended sequence of actions that must be maintained despite obstacles and distractions. The cognitive control that permits goal-directed behaviors to proceed to a successful conclusion is thought to depend on the active maintenance of goal representations within the prefrontal cortex (56, 59). Further, it has been hypothesized that the ability to update information within the prefrontal cortex, such that new goals can be selected and perseveration avoided, is gated by phasic dopamine release (8, 60).
If phasic dopamine release provides a gating signal in the prefrontal cortex, addictive drugs would produce a potent but highly distorted signal that disrupts normal dopamine-related learning in the prefrontal cortex, as well as in the nucleus accumbens and dorsal striatum (9, 19). Moreover, in an addicted person, neural adaptations to repetitive, excessive dopaminergic bombardment (61) might decrease responses to natural rewards or reward-related cues that elicit weaker dopamine stimulation, compared with drugs that directly cause dopamine release; that is, natural stimuli might fail to open the hypothesized prefrontal gating mechanism in an addicted person and therefore fail to influence goal selection. The upshot of such a scenario would be a biased representation of the world, powerfully overweighted toward drug-related cues and away from other choices, thus contributing to the loss of control over drug use that characterizes addiction. It is interesting to note that initial neuroimaging studies reported abnormal patterns of activation in the cingulate cortex and orbital prefrontal cortex in addicted subjects (62–64).
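The logic of the preceding paragraph can be sketched in a few lines of code. This is a deliberately simplified, hypothetical illustration of the dopamine-gating idea, not an implementation of any published model; the threshold, the signal sizes, and the update rule are all assumptions.

    # Hypothetical sketch of dopamine gating of prefrontal goal representations.
    # A goal held in a prefrontal "buffer" is replaced only when a candidate goal
    # arrives with a phasic dopamine signal strong enough to open the gate.

    GATE_THRESHOLD = 0.6   # assumed gating threshold

    def update_goal(current_goal, candidate_goal, phasic_dopamine):
        """Replace the maintained goal only if the phasic signal opens the gate."""
        return candidate_goal if phasic_dopamine >= GATE_THRESHOLD else current_goal

    # Assumed phasic dopamine evoked by different cues in an addicted state:
    # drug-associated cues evoke supranormal signals, whereas responses to
    # natural rewards are blunted.
    events = [
        ("prepare a meal", 0.45),
        ("go to work", 0.35),
        ("drug paraphernalia seen", 0.95),
        ("call a friend", 0.40),
    ]

    goal = "no current goal"
    for candidate, dopamine in events:
        goal = update_goal(goal, candidate, dopamine)
        print(f"cue: {candidate:<25} phasic DA: {dopamine:.2f} -> maintained goal: {goal}")

Because the drug-associated cue is assumed to evoke a supranormal phasic signal while responses to natural cues are blunted, the drug-related goal captures the buffer and later, weaker natural cues cannot displace it, mirroring the biased goal selection described above.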
Although far more neurobiological investigation is needed to understand the effects of tonic and phasic dopamine signals, the ways in which addictive drugs disrupt them, and the functional consequences of that disruption, current understanding of the role of dopamine in both stimulus-reward learning and stimulus-action learning has several important implications for the development of drug addiction. Cues that predict drug availability would take on enormous incentive salience, through dopamine actions in the nucleus accumbens and prefrontal cortex, and drug-seeking behavioral repertoires would be powerfully consolidated by dopamine actions in the prefrontal cortex and dorsal striatum (9, 18, 19, 23, 65).
Implications of the Specificity of Drug-Associated Cues
Stimulus-reward and stimulus-action learning associate specific cues, occurring within specific contexts, with particular effects such as “wanting” a reward, taking action to gain the reward, and consumption of the reward. (An important aspect of context is whether the cue is delivered more or less proximate to the reward [66]; for example, experiencing a drug-associated cue in a laboratory has a different implication for action than experiencing the same cue on the street.) Learning the significance of a cue and connecting that information with an appropriate response require the storage of specific patterns of information in the brain. This stored information must provide internal representations of the reward-related stimulus, its valuation, and a series of action sequences so that the cue can trigger an effective and efficient behavioral response (19). The same must be true for aversive cues that signal danger.
If the prediction-error hypothesis of dopamine action is correct, phasic dopamine is required for the brain to update the predictive significance of cues. If the dopamine-gating hypothesis of prefrontal cortex function is correct, phasic dopamine is required to update goal selection. In either case, however, dopamine provides general information about the motivational state of the organism; dopamine neurons do not specify detailed information about reward-related percepts, plans, or actions. The architecture of the dopamine system—a relatively small number of cell bodies located in the midbrain that may fire collectively and project widely throughout the forebrain, with single neurons innervating multiple targets—is not conducive to the storage of precise information (67). Instead, this “spraylike” architecture is ideal for coordinating responses to salient stimuli across the many brain circuits that do support precise representations of sensory information or of action sequences. Precise information about a stimulus and what it predicts (e.g., that a certain alley, a certain ritual, or a certain odor—but not a closely related odor—predicts drug delivery) is dependent on sensory and memory systems that record the details of experience with high fidelity. Specific information about cues, the evaluation of their significance, and learned motor responses depend on circuits that support precise point-to-point neurotransmission and utilize excitatory neurotransmitters such as glutamate. Thus, it is the associative interaction between glutamate and dopamine neurons in such functionally diverse structures as the nucleus accumbens, prefrontal cortex, amygdala, and dorsal striatum (68, 69) that brings together specific sensory information or specific action sequences with information about the motivational state of the organism and the incentive salience of cues in the environment. The functional requirements for recording detailed information about reward-related stimuli and action responses are likely to be similar to those underlying other forms of associative long-term memory, from which follows directly the hypothesis that addiction represents a pathological hijacking of memory systems related to reward (11, 19).
Robinson and Berridge (30, 70) proposed an alternative view—the incentive sensitization hypothesis of addiction. In this view, daily drug administration produces tolerance to some drug effects but progressive enhancement—or sensitization—of others (71). For example, in rats, daily injection of cocaine or amphetamine produces a progressive increase in locomotor activity. Sensitization is an attractive model for addiction because sensitization is a long-lived process and because some forms of sensitization can be expressed in a context-dependent manner (72). Thus, for example, if rats receive a daily amphetamine injection in a test cage rather than their home cages, they exhibit sensitized locomotor behavior when placed again in that test cage. The incentive sensitization theory posits that just as locomotor behavior can be sensitized, repeated drug administration sensitizes a neural system that assigns incentive salience (as opposed to hedonic value or “liking”) to drugs and drug-related cues. This incentive salience would lead to intense “wanting” of drugs that could be activated by drug-associated cues (30, 70). In the main, the incentive sensitization view is consistent with the view that dopamine functions as a reward prediction-error signal (9). It would also seem uncontroversial that the incentive salience of drug-related cues is enhanced in addicted individuals. Moreover, there is no disagreement that the ability of these cues to activate drug wanting or drug seeking depends on associative learning mechanisms. The point of disagreement is whether the neural mechanism of sensitization, as it is currently understood from animal models, plays a necessary role in human addiction. In animal models, sensitized locomotor behavior is initiated in the ventral tegmental area and is then expressed in the nucleus accumbens (73, 74), presumably through enhancement of dopamine responses. Given the relative homogeneity of ventral tegmental area projections to the nucleus accumbens or to the prefrontal cortex and the ability of these projections to interact with many neurons, it is difficult to explain how such enhanced (sensitized) dopamine responsiveness could be attached to specific drug-related cues without calling on the mechanisms of associative memory. Despite a still unsettled experimental literature, recent evidence from a study of gene-knockout mice lacking functional AMPA glutamate receptors found a dissociation between cocaine-induced locomotor sensitization (which was retained in the knockout mice) and associative learning; that is, the mice no longer demonstrated a conditioned locomotor response when placed in a context previously associated with cocaine, nor did they show conditioned place preference (75). At a minimum, these experiments underscore the critical role of associative learning mechanisms for the encoding of specific drug cues and for connecting these cues with specific responses (19, 23). Even if sensitization were to be demonstrated in humans (which has not convincingly been done), it is unclear what its role would be beyond enhancing dopamine-dependent learning mechanisms by increasing dopamine release in specific contexts. It is ultimately those learning mechanisms that are responsible for encoding the representation of highly specific, powerfully overvalued drug cues and for connecting them with specific drug-seeking behaviors and emotional responses.
Finally, an explanation of addiction requires a theory of its persistence. Many questions remain about the mechanisms by which long-term memories persist for many years or even a lifetime (15, 16, 76). From this point of view, sensitized dopamine responses to drugs and drug cues might lead to enhanced consolidation of drug-related associative memories, but the persistence of addiction would seem to be based on the remodeling of synapses and circuits that are thought to be characteristic of long-term associative memory (15, 16).
Cellular and Molecular Mechanisms of Addiction and Long-Term Memory
As implied by the foregoing discussion of addiction at the behavioral and systems levels, candidate molecular and cellular mechanisms of addiction ultimately must explain 1) how repeated episodes of dopamine release consolidate drug-taking behavior into compulsive use, 2) how risk of relapse from a drug-free state can persist for years, and 3) how drug-related cues come to control behavior. Intracellular signaling mechanisms that produce synaptic plasticity are attractive candidate mechanisms for addiction because they can convert drug-induced signals, such as dopamine release, into long-term alterations in neural function and ultimately into the remodeling of neuronal circuits. Synaptic plasticity is complex, but it can be heuristically divided into mechanisms that change the strength or “weight” of existing connections and those that might lead to synapse formation or elimination and remodeling of the structure of dendrites or axons (15).
As has been described, the specificity of drug cues and their relationship to specific behavioral sequences suggest that at least some of the mechanisms underlying addiction must be associative and synapse specific. The best-characterized candidate mechanisms for changing synaptic strength that are both associative and synapse specific are long-term potentiation and long-term depression. These mechanisms have been hypothesized to play critical roles in many forms of experience-dependent plasticity, including various forms of learning and memory (77, 78). Such mechanisms of synaptic plasticity could lead subsequently to the reorganization of neural circuitry by altering gene and protein expression in neurons that are receiving enhanced or diminished signals as a result of long-term potentiation or long-term depression. Long-term potentiation and long-term depression have thus become important candidate mechanisms for the drug-induced alterations of neural circuit function that are posited to occur with addiction (11). There is now good evidence that both mechanisms occur in the nucleus accumbens and other targets of mesolimbic dopamine neurons as a consequence of drug administration, and growing evidence suggests that they may play an important role in the development of addiction. A detailed discussion of these findings exceeds the scope of this review (for reviews, see references 11, 79–81). Molecular mechanisms underlying long-term potentiation and long-term depression include regulation of the phosphorylation state of key proteins, alterations in the availability of glutamate receptors at the synapse, and regulation of gene expression (78, 82).
The question of how memories persist (15, 16, 76) is highly relevant to addiction and not yet satisfactorily answered, but persistence is ultimately thought to involve the physical reorganization of synapses and circuits. Provocative early results have demonstrated that amphetamine and cocaine can produce morphological alterations in dendrites within the nucleus accumbens and prefrontal cortex (83, 84).
An important candidate mechanism for the physical remodeling of dendrites, axons, and synapses is drug-induced alteration in gene expression or in protein translation. At the extremes of time course, two types of gene regulation could contribute to long-term memory, including the hypothesized pathological memory processes underlying addiction: 1) long-lived up- or down-regulation of the expression of a gene or protein and 2) a brief burst of gene expression (or protein translation) that leads to physical remodeling of synapses (i.e., morphological alterations leading to changes in synaptic strength, generation of new synapses, or pruning of existing synapses) and, thus, to the reorganization of circuits. Both types of alterations in gene expression have been observed in response to dopamine stimulation and to addictive drugs such as cocaine (85, 86).
The longest-lived molecular alteration currently known to occur in response to addictive drugs (and other stimuli) in the nucleus accumbens and dorsal striatum is up-regulation of stable, posttranslationally modified forms of the transcription factor ΔFosB (85). At the other end of the temporal spectrum is the transient (minutes to hours) expression of a large number of genes likely dependent on activation of dopamine D1 receptors and of the transcription factor CREB, the cyclic AMP-response element binding protein (86). CREB is activated by multiple protein kinases, including the cyclic AMP-dependent protein kinase and several Ca2+-dependent protein kinases such as calcium/calmodulin-dependent protein kinase type IV (87, 88). Because CREB can respond to both the cyclic AMP and Ca2+ pathways and can therefore act as a coincidence detector, its activation has been seen as a candidate for involvement in long-term potentiation and in associative memory. In fact, a large body of research both in invertebrates and in mice supports an important role for CREB in long-term memory (for reviews, see references 87 and 88).
Given a theory of addiction as a pathological usurpation of long-term memory, given the increasingly well-established role for CREB in several forms of long-term memory (87, 88), and given the ability of cocaine and amphetamine to activate CREB (88–90), there has been much interest in the possible role of CREB in the consolidation of reward-related memories (11, 19). Direct evidence for such a role is still lacking. There is, however, relatively strong evidence linking cocaine and amphetamine stimulation of the dopamine D1 receptor–CREB pathway to tolerance and dependence. The best-studied CREB-regulated target gene that might be involved in tolerance and dependence is the prodynorphin gene (91–93), which encodes the endogenous opioid dynorphin peptides that are kappa opioid receptor agonists. Cocaine or amphetamine leads to dopamine stimulation of D1 receptors on neurons in the nucleus accumbens and dorsal striatum, leading in turn to CREB phosphorylation and activation of prodynorphin gene expression (93). The resulting dynorphin peptides are transported to recurrent collateral axons of striatal neurons, from which they inhibit release of dopamine from the terminals of midbrain dopamine neurons, thus decreasing the responsiveness of dopamine systems (91, 94). D1 receptor–mediated increases in dynorphin can thus be construed as a homeostatic adaptation to excessive dopamine stimulation of target neurons in the nucleus accumbens and dorsal striatum, one that feeds back to dampen further dopamine release (91). Consistent with this idea, overexpression of CREB in the nucleus accumbens mediated by a viral vector increases prodynorphin gene expression and decreases the rewarding effects of cocaine (95). The rewarding effects of cocaine can be restored in this model by administration of a kappa receptor antagonist (95).
Homeostatic adaptations such as the induction of dynorphin, which decreases the responsiveness of dopamine systems, would appear to play a role in dependence and withdrawal (26, 96). Given the limited role of dependence in the pathogenesis of addiction (6, 11, 19, 27, 40), other studies have focused on potential molecular mechanisms that might contribute to the enhancement of drug reward (for reviews, see references 12, 13). The best-studied candidate to date is the transcription factor ΔFosB. Prolonged overexpression of ΔFosB in an inducible transgenic mouse model increased the rewarding effects of cocaine, whereas overexpression of CREB and short-term expression of ΔFosB had the opposite effect of decreasing drug reward (97). In addition, a distinctly different profile of gene expression in the mouse brain was produced by prolonged expression of ΔFosB, compared with CREB or short-term expression of ΔFosB (97). The implications of these findings are that at least some genes expressed downstream of CREB, such as the prodynorphin gene (93), are involved in tolerance and dependence and that genes expressed downstream of ΔFosB might be candidates for enhancing responses to rewards and to reward-related cues. The analysis is complicated by existing experimental technologies, because all current methods of artificially overexpressing CREB markedly exceed the normal time course (minutes) of CREB phosphorylation and dephosphorylation. Thus, a role for CREB in the consolidation of reward-related associative memories should not be discarded on the basis of the existing evidence. New efforts to develop animal models of addiction (98, 99) may prove extremely useful in efforts to relate drug-inducible gene expression to synaptic plasticity, synaptic remodeling, and relevant behaviors.