Eating Disorders as Sociocultural Phenomena Versus Serious Medical Illnesses
Food-related preoccupations and concerns about body weight and shape have forever been part of society. People with sensitive or extreme temperaments sometimes become zealously preoccupied with prominent cultural attitudes and practices, such as those pertaining to feasting, fasting, food-related taboos and rituals, and concerns with weight and shape. At times, these individuals are ineffective at coping with these concerns and begin to behave maladaptively, consequently falling into states of psychological and physiological impairment (i.e., disorders). For these susceptible outliers, sociocultural pressures and fads may funnel their personal vulnerabilities into the shape of eating disorders (EDs).
Before examining the sociocultural phenomena that give EDs form, we must first consider what makes people susceptible to them. Three studies are illustrative. The first evaluated childhood obsessive-compulsive personality traits and found that childhood perfectionism and inflexibility (rigidity), along with rule-bound behavior, doubt and cautiousness, and a drive for order and symmetry, were particularly strong predictors of EDs; the association was more pronounced for anorexia nervosa (AN) than for bulimia nervosa (BN). Each trait increased the risk of developing an ED by a factor of approximately seven (
Anderluh et al. 2003). The second study identified childhood anxiety symptoms at age 10. The authors linked physical anxiety symptoms to the development of BN and linked worrying to the development of AN during adolescence (
Schaumberg et al. 2019). A third study identified several concurrent risk factors for BN, finding that both general psychiatric vulnerabilities and specific factors concerning attitudes and behaviors about weight and dieting were required. People who developed mood, anxiety, and substance use disorders showed increased rates of personal vulnerabilities (e.g., childhood characteristics, premorbid psychiatric disorders, behavioral problems, lifetime parental psychiatric disorders) and environmental risk factors (e.g., parental problems, disruptive events, patterns of parental caring and involvement, recent parental psychiatric disorders, teasing or bullying, and sexual or physical abuse). Histories of childhood abuse and parental alcoholism were particularly common. In addition to these general psychiatric vulnerabilities (most often manifesting as depression), young women who developed BN had higher rates of specific risk factors such as dieting, obesity, and parental EDs (
Fairburn et al. 1997).
These and other studies suggest that when vulnerable individuals who tend to be obsessional, perfectionistic, moody, and anxious are exposed to cultural attitudes and values that stress feasting, fasting, food taboos and rituals, and preoccupations with weight and shape, some embody and transmute these sociocultural phenomena into serious EDs.
Feasting
In the beginning of human existence, some food was available, but often not enough. Food scarcities may have driven evolutionary pressures toward “thrifty genes” that favored storage of body fat during plentiful times as insurance against lean times, thus fostering overweight and obesity during periods of abundance. Festive gorging might have occurred after successful large-game hunts or, once agriculture developed, at harvest times. Experts have pondered the significance and meaning of the prehistoric “Venus of Willendorf” carving (dated to approximately 28,000–25,000 B.C.E.), but it seems probable that this portly feminine figure with huge, pendulous breasts was an object of adoration and admiration rather than scorn. After societies evolved from egalitarian sharing into rich and poor strata, the rich developed lavish banquets; in Ancient Rome, overgorged banqueters reputedly purged so that feasting could continue, behavior that was culturally condoned.
With contemporary food abundance in developed areas, rates of obesity and, one may assume, of binge-eating disorder (BED) are increasing. The Pima (Akimel O’odham) people of southern Arizona have far higher rates of obesity and diabetes mellitus than their counterparts in northern Mexico, just across the border, whose livelihoods require considerably more daily caloric expenditure. In the “supersize” culture of fast food and oversized portions promulgated in the United States and exported to the rest of the world, rates of obesity have grown significantly.
Fasting
Variously but widely practiced in many cultures and religions, intentional fasts devotionally demonstrate self-denial and may produce altered states of consciousness. Fasts are practiced in Baha’i, Eastern Orthodox Christianity, Evangelical Christianity, Hinduism, Islam, Jainism, Judaism, Native American religions, Roman Catholicism, and Taoism. Moses, Jesus, and Buddha were all said to have engaged in prolonged fasts associated with their spiritual pursuits.
Historically, dating from the fifth century B.C.E. in India, the Jain religious practice of Sallekhana involves voluntarily fasting to death by gradually reducing one’s intake of food and liquids, representing the thinning of human passions and the body. Chandragupta Maurya (340–297 B.C.E.), who founded the historically significant Mauryan Empire, is said to have renounced his throne to spend several years following his Jain guru and to have died by self-starvation. Large populations, including entire families, followed his example. A takeaway lesson here is that high-profile, influential leaders who adopt dramatic eating practices often become trendsetters and lead others to devotionally or blindly follow, even to the followers’ significant detriment.
Extreme versions of fasting and self-denial have clearly generated near-epidemic, clinically significant fashion trends. In
anorexia mirabilis, during the fourteenth and fifteenth centuries, women intending to demonstrate Christian devotion through self-denial became obsessed with mortification of the body and abhorrence of the flesh. In parallel with some cases of contemporary AN, religious fasting in these women triggered extreme and persistent self-denial of food. Many achieved altered states of consciousness, often experienced as ecstatic “highs.” The more spiritually perfectionistic and single-minded among them starved themselves and died in large numbers (
Espi Forcen and Espi Forcen 2015). The Roman Catholic Church canonized several hundred of these women as saints, the most prominent among them St. Catherine of Siena. Clinical descriptions of these women in “holy anorexia” suggest strong resemblances to contemporary descriptions of AN.
Food-Related Taboos and Rituals
Taboos and rituals concerning food and eating are extremely common and ancient across societies. Mosaic laws of kashruth (defining kosher foods, animal slaughter, and food preparation, such as prohibiting pork and shellfish) date to more than a millennium B.C.E. Certain Jain, Buddhist, and Hindu traditions advocating nonviolence to all living things and associated taboos on eating beef and requirements of strict vegetarianism, particularly among the Brahmin class, date to at least 500 B.C.E. Some anthropologists argue that such taboos originated in economic considerations or for health reasons, whereas others believe food taboos usefully set kinship groups apart from one another, differences that defined marital options. Strictly observant Brahmins and Jains abstain from onions and garlic as well (perhaps reflecting allium sensitivities on the part of some ancient thought leaders).
Regardless of their origins, millions of people accept food taboos “off the shelf,” developing deep disgust and fear of certain foods and feelings of shame and guilt over related transgressions. In contemporary society, subgroups with food aversions are increasingly common (e.g., vegan, gluten-free, sugar-abstinent, various allergies). Clearly, rule-following individuals with obsessional and compulsive tendencies might be particularly prone to adopt food rules involving food elimination, portion-size limitations, and so on, growing rules on top of rules, to the point of caloric and micronutrient deficiencies associated with avoidant/restrictive food intake disorder (ARFID), orthorexia nervosa, or AN.
Preoccupations With Weight and Shape
For both sexes, physical appearances denoting attractiveness and health have been associated with higher social status and preferential mate selection for virtually all of recorded history. Throughout the ages, artists have depicted what different epochs and ethnicities variously considered most desirable. Over the past two centuries, we can easily trace fads and fashions showing strong influences of high-status opinion leaders on weight- and shape-related practices. The following are a few salient illustrations.
In the mid-nineteenth century, Empress Elisabeth of Austria, Queen of Hungary (1837–1898; peak influence circa 1859–1860), then one of the highest-status women in Europe, helped set the stage for widespread dieting, corseting, and “tight-lacing” among young women of royal connection and persuasion. Known to binge-eat and purge, she weighed herself obsessively multiple times daily; throughout her adult life, her weight corresponded to a BMI of roughly 14–17. Numerous princesses followed her lead, and high-status young women began to show clinical symptoms that culminated in the original independent descriptions of AN among upper-class women by Gull in England and Lasègue in France (
Gull 1873;
Lasègue 1873).
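To make such BMI figures concrete: BMI is body weight in kilograms divided by the square of height in meters. A minimal sketch, assuming a purely illustrative height of 1.72 m (an assumption for demonstration, not a documented figure), shows the body weights that a BMI of 14–17 implies:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def weight_for_bmi(bmi_value: float, height_m: float) -> float:
    """Invert the BMI formula to find the weight implying a given BMI."""
    return bmi_value * height_m ** 2

height = 1.72  # illustrative adult height in meters (assumed, not historical)
for b in (14, 17):
    print(f"BMI {b} at {height} m corresponds to {weight_for_bmi(b, height):.1f} kg")
```

At that illustrative height, a BMI of 14–17 corresponds to roughly 41–50 kg, far below any healthy weight range for an adult woman, which underscores the clinical severity of such figures.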
Certain technologies became more widespread and might have also increased attention to shape and weight. Although crude mirrors were available beforehand, the first silver-glass mirror was invented in 1835, and the general availability of household mirrors developed in the mid- to late nineteenth century. Similarly, street-corner “penny scales” first became popular in the 1920s and 1930s (“measure your weight for a penny”), and affordable household bathroom scales first became widely available in the 1940s. Just as internet and cellphone addictions required the invention and widespread availability of those technologies, mirrors and scales permitted easier obsessional and compulsive attention to weight and shape.
Starting in the late nineteenth century and blossoming in the twentieth, trickle-down influences on shape and weight also became much more pronounced with modern advertising and mass media. One of the first fads to widely promote slender female body shapes, the “flapper” fashions of the 1920s, was accompanied by increased reporting in college newspapers of purging behaviors among college women. This trend abated in the 1930s, 1940s, and 1950s, during which time curvaceous body shapes among high-status and celebrity women became more prominent.
However, during the 1960s, several mass-culture influences converged to promote the social desirability of slender appearance in high-status (predominantly white) women. Along with the sexual revolution, Helen Gurley Brown, editor-in-chief and top-tier trendsetter at Cosmopolitan magazine, pushed images of slim fashion models, lauding the thin look. Among celebrities, slim American and British actresses and models became top fashion icons. In contrast to previous, somewhat older, and more solidly built presidential wives, Jacqueline Kennedy cut a slim, youthful figure. Early television, which made many actresses appear heavier on screen than they were in real life, may have led some to slim down to improve their on-screen appearance. George Balanchine, the most prominent American ballet director, favored increasingly thin dancers, far thinner than their European counterparts, and Hugh Hefner, publisher of Playboy magazine, chose slimmer women in the 1960s and 1970s to personify female sexuality as “Playmate” centerfolds.
As popular culture became stocked with growing numbers of fashion magazines portraying unrealistically slim, airbrushed fashion models, many women became obsessed with these magazines, and those vulnerable to EDs consistently found themselves feeling worse after reading them. Television fashion shows flourished that featured predominantly slim models (as currently represented by shows such as
Project Runway and
America’s Next Top Model). When first exposed to slim feminine Western images on television, well-built Polynesian island teenage girls who had previously been content with their appearance became increasingly self-conscious and dissatisfied and, for the first time, began to develop EDs. In post-apartheid South Africa, Westernization may be affecting attitudes toward shape and weight, increasing the likelihood of EDs (
Morris and Szabo 2013). Similar shifts have been seen in Latina/Hispanic populations, leading some authorities to view EDs not only as culture-bound syndromes but also as markers of cultural change (
Miller and Pumariega 2001).
Today, in addition to the ubiquitous sniping, competitive, gossipy, and snarky “friendships” that compound normative adolescent peer pressures, youth contend with powerful social media sites, such as Facebook, Instagram, and Snapchat, that are dominated by visual images and afford further opportunities for negative self-evaluation by appearance-preoccupied, insecure, self-doubting, anxious young women who strongly link self-esteem to physical appearance. As these young women find their own appearances wanting and inferior, they appear more vulnerable to developing EDs.
Evolution of ED Treatment Programs
The treatment of any disorder depends upon several factors: 1) contemporaneous general theories of illness (e.g., Hippocratic-Galenic theory of the four humors); 2) specific concepts of etiology for the disorder (psychosocial, neurochemical, psychodynamic); 3) the diagnostic criteria and terminology in use (
anorexia nervosa vs.
Pubertätsmagersucht vs.
hysterical anorexia); and 4) the training methods of treating clinicians. Challenges to whatever treatment methods are currently in vogue lead to a period of defensive (and often angry) skepticism about the need for change, followed by replacement with a new method (a paradigm shift). Physicians once believed that peptic ulcers were caused by excess gastric acid production resulting from stress or dietary indiscretion, and thus they prescribed antacids, low-acid foods, and acid-buffering medications. Only after Warren and Marshall, the latter heroically inoculating himself, identified the
Helicobacter pylori bacterium as the real cause did that field evolve to its present-day effective antibiotic treatment (
Marshall and Adams 2008).
This section reviews the evolution of ED treatment approaches and programs in view of the factors just described and offers an appreciation of the expanding concept of EDs and newer approaches to treatment. The convention used herein considers EDs to be syndromes involving psychological and medical components, not simply disordered eating as exemplified by Roman-era gluttony or medieval asceticism through fasting. Although AN is the least common of these disorders, it was historically the first described, with a wide variety of treatment methods employed over the centuries. The term anorexia nervosa as it is commonly used in English implies a disorder that generally moves from voluntary onset into involuntary continuation. An understanding of EDs in the twenty-first century is greatly assisted by an appreciation of their history, lest the current diagnostic criteria and treatment methods be considered as arising de novo like Aphrodite from the sea, fully formed and enduring.
Anorexia Nervosa
Early Days
A convenient starting point for understanding AN in the modern era is the two cases described by Morton in 1689 (
Pearce 2004). Richard Morton was a distinguished seventeenth-century physician who had the perspicacity to differentiate these two cases from the more common wasting disorders of his age, chief among them tuberculosis. He recognized that “nervous consumption” and “cares and passions of the mind” contributed to the patients’ emaciation, distinguishing their condition from most cases of medically caused wasting.
We can trace our modern understanding of AN to the almost simultaneous publication in 1873 of papers by William Gull, physician to Queen Victoria’s son, and Charles Lasègue, a French physician and knight of the Legion of Honor.
Gull (1873) offered a still-useful description of the physiological changes of AN, recognizing its emotional origin in “a morbid mental state.” His pragmatic treatment approach included frequent small feedings, separation of the patient from the family, and “moral” treatment (psychotherapy), quite an improvement over Morton’s era.
Lasègue’s (1873) descriptions of familial exhaustion from ineffective use of threats and pleas, patients’ lack of recognition of the severity of their illness, and the way “the anorexia gradually becomes the sole object of preoccupation and conversation,” have an almost contemporary sound. He recommended slow treatment, recognizing that a chronic state of illness was highly probable.
Treatment during the century from the 1870s to the 1970s followed an almost random pattern, driven by radical changes in etiological theories of AN that focused first on the body (pituitary hormones, 1890s), then on the mind (the psychoanalytic heyday, 1920s–1960s), and then on the brain’s chemistry (neurotransmitters, 1965–1980). Instead of a systematic, gradual increase in the understanding of AN, treatments followed the most prominent and popular etiological theories of the day.
Neuroendocrinology to the Forefront
No new treatment strategies were advocated until about 1914, when Simmonds published his famous paper on postpartum pituitary necrosis (
Birch 1974). The similarities in cachectic appearance between pituitary necrosis and AN led to confidence that the origin of AN had been found. As a result, AN jumped into medical textbooks as a disorder of endocrine origin, where it remained until about 1930, and it continued at times afterward to be considered of medical origin, with treatments that followed
pari passu. The assumption that AN was of endocrine origin led to treatment by endocrine replacement, although such replacement was not generally available until later decades; psychological treatments were considered unnecessary. Despite this emphasis on a medical origin, the ancient concept of a wandering uterus kept drifting in and out of the etiological picture, resulting in nonmedical but opaque treatments. By this logic, only women of childbearing age could develop AN, so males need not apply.
Psychoanalytic Theories and Treatments
In the period from about 1940 to 1965, AN again jumped textbooks, this time from medical texts into those of psychodynamic theory, with consequent forms of psychodynamic treatment. The case analysis of Ellen West, whom Binswanger considered to have schizophrenia, probably represented a woman with BN (
Bray 2001). In
1950, Nemiah described a large series of female patients with AN; he considered their disorder to have arisen in a psychoneurotic setting, perhaps with obsessive-compulsive traits. Parents, especially mothers, were the primary culprits. He found a common feature to be a setting of overprotectiveness leading to dependence and hostility in the developing child, who remained infantile in emotional development. In
1961, Blitzer and colleagues offered a florid description of the context in which AN developed:
Preconscious and conscious fantasies relating to food and eating included animistic ideas about food, delusions that certain kinds of foods were poisonous, fear of oral impregnation and gastric pregnancies, idea of anal birth, orally aggressive and sometimes cannibalistic impulses, and the equation of not eating with a lifelong childlike dependent status. (p. 369)
To undo and remediate this nexus required extensive psychodynamic treatment. Psychodynamic theories and treatment methods proliferated and splintered; as a result, many nails protruded from the assumptive theoretical floor, and many carpenters were needed to hammer them back in. Psychodynamic approaches to etiology included ego psychology, object relations theory, interpersonal theories, attachment theory, self psychology, and family systems theory, as well as classical psychoanalytic theories. Evidence for the therapeutic benefit of any of these approaches is scant.
Pendulum Swings From All Mind to All Brain
The 1963 and 1970 Nobel Prizes in Physiology or Medicine (awarded to Eccles, Hodgkin, and Huxley and to Katz, von Euler, and Axelrod, respectively) signaled that heady days were ahead for explaining many psychiatric disorders as results of neurotransmitter dysfunction. Theories of a neurochemical origin of AN led to treatment trials of newly described neuroleptics and other medications that promoted eating. These included early antipsychotics, initially chlorpromazine, and later other first- and second-generation antipsychotics, antihistamines, serotonin reuptake inhibitors, dual-action antidepressants, and marijuana. These treatments rested largely on the belief that if patients would eat more and gain weight, all would be well, or at least better.
This treatment approach produced mixed results. Some patients increased their eating and weight with benefit and partial improvement, whereas others ate more but began to induce vomiting because this pharmacological intrusion provoked their most feared outcome, overeating and becoming fat. Some patients who were unwilling to challenge their overvalued beliefs “ate their way out of the hospital” and relapsed soon after discharge. Theories that abnormal neurotransmitter function, especially involving serotonin, predisposes people to developing AN remain incompletely understood and unproven. More recent studies have promoted olanzapine and other second-generation antipsychotics as treatment for AN; their use as monotherapy rests on the assumption that improved eating and weight gain are the core of treatment.
Integrative and Pragmatic Approaches to Treatment
In the 1960s and 1970s, British clinicians, especially Arthur Crisp and Gerald Russell, along with Hilde Bruch in the United States, began approaching the origin of AN more agnostically, holding etiological theories less rigidly.
Crisp and Kalucy (1973) believed that adolescents with AN had existential fears of maturation, and
Bruch (1973) considered family dysfunction to be common; her treatment encouraged increased self-initiative and more accurate identification of bodily states. Clinical case descriptions enlarged the field to include males. Treatments recommended by these authors harkened back to Gull and Lasègue, combining nurse-supervised refeeding with less theory-bound psychotherapy aimed at persuading patients to change their distorted perception of fatness and at challenging their fear of becoming fat. Their treatment programs, with good statistical documentation and follow-up studies, offered the first evidence of outcomes better than the natural course of illness.
In the 1980s and onward, intensive hospital-based, integrative, multidisciplinary team approaches to serious cases of AN provided evidence that improvement was possible in most patients and that remission was possible in some. Many studies found long-term outcomes of one-third remitted, one-third stably improved, and one-third chronically and severely ill. All of these programs that demonstrated substantial improvement utilized a team approach (i.e., psychiatrist, psychologist, nutritionist, nurse, social worker, educator); an agreement to use a common psychotherapeutic method (often cognitive-behavioral therapy [CBT]), with extended treatment as needed; and a stepwise progression to less intensive treatment settings (e.g., inpatient, residential care, day program, outpatient follow-up). The program at Johns Hopkins has shown the benefits of rapid refeeding and sequential steps in successful treatment. Unfortunately, reduced funding for fully adequate treatment of AN has resulted in premature discharges leading to increased relapses and readmissions.
CBT appears to be the most effective method of psychotherapy for AN and to contribute substantially to improved outcome, but the evidence is less robust than for BN, largely because hospitalized patients with AN are initially too ill to benefit from a manualized evidence-based psychotherapy and because random assignment to contrasting psychotherapy methods is difficult with inpatients of varying chronicity and severity. Nonetheless, CBT appears to be the most useful psychotherapy for a programmatic team approach to AN treatment.
Broadening of the Treatment Mission
Five developments have led to a broadening of AN treatment:
1.
The treatment of AN, in most cases, also involves treating a cluster of comorbid disorders, both psychiatric and medical. Even pure food-restricting AN typically involves treating two or three comorbid psychiatric diagnoses, most commonly depressive disorders, anxiety disorders, obsessive-compulsive states or traits, substance abuse, and personality disorders (especially Cluster C, with sensitive, persevering, anxious traits).
2.
Treatment of a patient also involves treating their family or significant others, with varying intensity.
Treasure et al. (2010) documented the tremendous burden faced by caregivers of patients with AN and the need for respite care, as well as formal family therapy at times, without any assumptions that families are pathogenic. At a minimum, families require education and support.
3.
Innovative approaches have shown that less restrictive home environments, with suitable family training and interaction with clinicians, may be suitable for treatment of even moderately severe AN in teens.
4.
At times, the environment, avocation, or vocation of a patient with AN needs to be modified.
5.
Medical comorbidities abound in patients with AN. Long-term complications involve a surprising degree of bone mineral density deficiencies (even in males) and persistent gastrointestinal problems. Short-term medical problems may be divided into self-ameliorating (e.g., bradycardia, hypothermia) and urgent/emergent (e.g., electrolyte abnormalities, arrhythmia) signs.
Bulimia Nervosa: A Disorder Hiding in Plain Sight
Few scientific publications have changed the field of psychiatry as quickly or substantially as Gerald
Russell’s 1979 contribution, “Bulimia Nervosa: An Ominous Variant of Anorexia Nervosa.” Clinicians have since shaken their heads, asking why BN had not been noticed and described earlier, a reminder that recognition of new syndromes through keen observation of clinical psychopathology is still possible. On a positive note, despite its relatively recent description and acceptance as a serious ED, effective treatment for BN arrived quickly and convincingly, perhaps for several reasons. Cases of BN are considerably more prevalent than those of AN, are usually less severe in terms of medical comorbidity, and are less likely to require inpatient treatment; they are therefore more suitable for rigorous evidence-based studies, especially random assignment to contrasting treatments.
Beck developed CBT in the 1960s, convincingly demonstrating it to be as effective as antidepressants for the treatment of nonpsychotic depressive disorders.
Fairburn et al. (1997) were among the first to apply CBT to the treatment of BN, demonstrating its effectiveness in the 1980s; by the early to mid-1990s, CBT had become the most convincingly proven psychotherapy for the disorder, and it retains rigorous support to this day. Analogues of CBT, such as dialectical behavior therapy and interpersonal psychotherapy, have also been effective, especially in patients whose BN is complicated by conditions such as borderline personality disorder. These evidence-based therapies are often employed in a group setting, not merely as an economic convenience but also to harness group support and challenge. The National Institute for Health and Care Excellence in Great Britain recommends CBT as the first-line treatment for BN, with other approaches used only as needed for variant disorders; it has given CBT an A rating for the treatment of BN, a designation indicating superiority to other psychological treatments and medications.
For a brief period, monotherapy for BN with selective serotonin reuptake inhibitors was in vogue. However, the resulting decreases in the frequency of binge eating and purging are often temporary and may revert when the medication is discontinued. Antidepressants may have a role in treating the comorbid depressive disorders that accompany BN, but by themselves they are inadequate. Monotherapy with psychopharmacological agents primarily reflects a failure to understand the psychopathology of BN: an entrenched morbid fear of fatness leads to dieting, but without the persevering traits of AN that produce substantial weight loss. The psychopathologies of both AN and BN often include perceptual distortion and overvaluation of the benefits of slimming, as well as a morbid fear of becoming fat.
Although BN treatment should not be oversimplified, and success is not assured in all cases, it has come along much more quickly, and with less psychodynamic baggage, than AN treatment. Patients with BN and comorbid suicidality or severe hypokalemia, or those whose illness is refractory to outpatient treatment, may still need inpatient admission. The subtype of AN with binge-purge features requires a combination of approaches for food-restricting AN and for BN. Binge-purge variants of AN often present with even more comorbid psychological diagnoses than food-restricting AN or BN individually.
Long-term follow-up of BN has revealed variations in the illness trajectory over time. The most common finding is that, prior to their established pattern of BN, about half of patients attempted or actually achieved significant weight loss. AN—either the full disorder or a subclinical form—frequently precedes BN. People with BN appear to lack the persevering traits of food-restricting AN. When their foot is “off the brake” of restrained eating, especially after using alcohol, they engage in binge episodes that short-circuit their attempt at significant weight loss.
Binge-Eating Disorder: A Late-Comer but a True Eating Disorder
BED has been recognized as a true ED only since about the mid-1990s, again making many wonder how such an obvious disorder had been overlooked. Its late recognition stemmed primarily from assumptions that binge eating without purging, especially in obese individuals, represented hedonic overeating, lack of willpower, or gluttony. Once again, astute clinical inquiry into the presence of psychopathology led to recognition of BED as a true malady. Good phenomenological inquiry is at the heart of syndrome recognition.
Lessons learned from the benefits of CBT in BN were quickly applied to patients with BED, with moderate success. The core of BED treatment includes several facets not dissimilar to BN treatment: interrupting the abnormal binge-eating behavior; challenging the person’s ingrained sense of helplessness in the face of relentless binge-eating urges, which usually serve to relieve abnormal mood states; and developing new strategies to deal with life’s challenges. Although CBT alone may at times be sufficient, BED is the most likely of the three major recognized EDs to require concomitant use of antidepressants. Successful treatment may also require subsequent decision making about remaining obesity, which is sometimes of a morbid degree. Bariatric surgeons generally require patients to demonstrate that their BED symptoms have been absent for about 1 year before accepting them for surgery, a not unreasonable demand. At times, otherwise successful bariatric surgery is undone by postoperative relapse into binge eating; good follow-up of patients previously diagnosed with BED or BN is therefore essential after bariatric surgery.
Other Eating Disorders
Other variants of abnormal eating combined with psychopathological states may yet become accepted as EDs. There has always been tension between “lumpers” and “splitters”: those who group clinical states together broadly as EDs versus those who use narrower definitions to split off each specific clinical presentation. For example, patients with purging disorder, in the experience of many clinicians, have some degree of unwanted, driven eating prior to purging; although to an external observer the amount of food consumed may be modest and not typical of BN binges, it is unacceptable to the person. Whether purging disorder will turn out to be truly its own ED, split off from BN (or from the binge-purge type of AN), or will be lumped together with them remains to be seen.
ARFID, the catch-all diagnosis employed by DSM-5 (American Psychiatric Association 2013), presents other problems. Unless the ED category is expanded beyond conventional boundaries to include abnormal eating without an overvalued drive for thinness, morbid fear of fatness, and associated distortion of body image, it lacks syndromic integrity with the current understanding of what an ED is. Children younger than about 7 years do not internalize a sociocultural drive for thinness. An ED diagnosis, by convention, requires an abnormal mental state as well as disordered eating, with functional impairment of a reasonable duration. Many children have abnormal eating but do not necessarily have an ED. A separate category of abnormal eating in future diagnostic manuals may be helpful without invoking an ED designation. Time will tell.
Evidence-Based Treatment Parameters
Reversing Nutritional Insufficiencies
Treatment for EDs has predominantly focused on interrupting the abnormal behaviors associated with eating habits (restriction, binge eating, purging), reversing physical and metabolic abnormalities caused by either the malnutrition itself or the behaviors (predominantly electrolyte abnormalities), and attempting to alter the underlying ED psychopathology and treat any psychiatric comorbidities.
More recently, recognition that low weight itself is inadequate to identify and evaluate the severity of ED pathology or to predict its associated sequelae has complicated this approach. The phenomenon of acute starvation and medical complications in patients who “don’t look thin” is well known (Peebles et al. 2010; Whitelaw et al. 2014), yet an obvious bias exists toward people whose visible changes in body shape indicate their nutritional insufficiency. Individuals with an ED who do not appear underweight are identified less quickly, resulting in a longer duration of illness before treatment (Lebow et al. 2015). This bias could narrow the applicability of study results if care is not taken to include these patients adequately. The description of BN as “hiding in plain sight” and the delay in recognizing BED both illustrate how slowly the malnutrition of EDs is recognized in people who do not appear outwardly starved by body size. The inclusion of “atypical” AN in newer diagnostic manuals attempts to acknowledge the severity of EDs “hidden” behind the appearance of normal weight; however, even the definition of inadequate weight in “typical” AN is somewhat problematic, because consensus on how to calculate a target body weight is lacking (Lebow et al. 2018). This may result in diagnostic error when assigning patients to either typical or atypical disease (Forman et al. 2014). In addition, many descriptive studies find rates of atypical AN too high to justify nomenclature suggesting scarcity: studies within specialty treatment centers report rates of 25% (Sawyer et al. 2016) to 34% (Forman et al. 2014), and larger epidemiological studies report 3–4 times (Stice et al. 2013) and up to 10 times (Hammerle et al. 2016) as many patients meeting criteria for atypical AN as for typical AN.
Resolution of Psychopathology and Comorbid Psychiatric Conditions
Keys et al.’s (1950) original starvation study demonstrated that nutritional rehabilitation can lead to changes in more than just medical sequelae, including improvements in depression, anxiety, irritability, concentration, and social interactions, thus recognizing that improving the body does, in fact, improve the mind. Much of the treatment research in EDs has focused on adults with low-weight AN and has usually shown improvements in ED and comorbid psychiatric pathologies (Channon and de Silva 1985; Meehan et al. 2006; Pollice et al. 1997). The specific link between weight gain and psychological improvement is sometimes difficult to evaluate because many studies do not directly examine this connection. Early results were mixed, with some showing a direct correlation (Eckert et al. 1982), others showing weight changes unassociated with psychological changes (Coulon et al. 2009; Mattar et al. 2012), and still others showing mixed results (Kawai et al. 2008; Laessle et al. 1988). With regard to severe and enduring AN in adults, Touyz et al. (2013) demonstrated an approach that did not emphasize weight gain directly but still resulted in improvements in eating and comorbid psychiatric psychopathologies. However, patients did, in general, gain weight, and their nutrition improved.
Family-based treatment for children and adolescents with EDs has shown improvements in both ED and comorbid psychiatric pathology (Le Grange et al. 1992, 2014; Lock et al. 2005), with some data showing that patients with greater eating psychopathology, and particularly those with binge-purge subtypes of AN (considered more severe in pathology), benefited to an even greater degree from nutritional rehabilitation (Accurso et al. 2014; Eisler et al. 2000; Le Grange et al. 1992; Lock et al. 2006). Furthermore, even early weight gain trajectory (a proxy for total nutritional rehabilitation) can predict overall outcome (Accurso et al. 2014). However, some studies still show improvement in eating and comorbid psychiatric psychopathologies without as robust a change in nutritional restoration (Robin et al. 1999). Overall, the data, though murky at times, suggest that nutritional rehabilitation is an important consideration for treating not only the physical sequelae of EDs but also the underlying psychopathology and comorbid psychiatric conditions.
Early Intervention
Viewed through the medical lens of “disease staging,” the identification of disease and provision of early intervention are usually linked with improved outcomes. Although consensus on a rubric for staging ED severity has been difficult to achieve, the data overall indicate that early treatment is beneficial: most descriptive studies support the conclusion that a longer duration of illness is associated with more severe medical complications, predicts a worse outcome, and is associated with higher relapse rates. Thus, aside from prevention strategies, early intervention is the best way to treat EDs (Berends et al. 2018).
Historically, data have suggested that younger age at onset is associated with more severe ED-related obsessions and comorbid psychiatric conditions. However, some have suggested that the severity of illness in younger patients may, in fact, result from delayed identification and a longer duration of illness prior to first treatment (Neubauer et al. 2014). Recent data support this interpretation, showing that the duration of illness itself (and not age) indicates a higher risk of relapse (Berends et al. 2018).
Treasure and Russell (2011) presented a cogent description of the theoretical framework behind early intervention, particularly regarding the developmental processes taking place in the brains of young people. The malnutrition resulting from EDs comes at a time when tremendous developmental changes are occurring in the brain. The loss of brain matter itself interrupts the normal maturation process. The optimal hormonal milieu is also lost, because steroid hormone synthesis depends on adequate substrate, namely cholesterol. Disruption of this milieu affects areas of the brain involved in self-regulation, impulsivity, mood, and excitability. Finally, animal models have demonstrated the impact of disordered eating on the sensitization of reward pathways within the developing brain, which results in progressive cementing of those pathways over time (Lutter and Nestler 2009; Treasure et al. 2010). This suggests that early intervention may be the only way to truly reverse these paths.