The idea labeled the “Tomato Effect” by Goodwin and Goodwin (1) posited that efficacious treatments are too easily rejected when they do not “make sense” in terms of accepted theories, just as tomatoes were rejected in North America for centuries because the reigning dogma was that all plants in the nightshade family were toxic. Although broadly pertinent, this theory is particularly applicable to two complementary sets of studies on nutrition and mental health. The two areas of study represent different aspects of dietary intake and might be labeled supplementation and elimination. In the supplementation field, the theoretical framework is that nutrient intake that is suboptimal (at least for the individual's genetic make-up) can cause or exacerbate psychiatric symptoms and can be corrected by the consumption of additional nutrients. The medical literature on this topic extends back at least 90 years (2) and is currently growing rapidly, with studies of both single-nutrient (3) and broad-spectrum interventions (4). There is no generally accepted explanatory mechanism for positive treatment effects, but many have been proposed, such as 1) inborn errors of metabolism that raise the Michaelis constant (that is, decrease the binding affinity) of enzymes (5) and 2) deficiencies in mitochondrial function (6). Hence, the research base showing beneficial effects of micronutrient treatment is primarily empirical, still lacking definitive evidence of mechanisms that would “make sense” of it all. This is not an unusual situation in medicine: a treatment is often shown to help before the exact mechanism is worked out.
The second perspective, involving elimination of unwelcome chemicals that may have adverse effects, includes the research on both environmental toxins (7) and food “additives” (usually meaning colors, flavors, and preservatives). This is the topic addressed by Stevenson and colleagues in this issue of the Journal (8), and their data move this field forward significantly by providing important information on potential mechanisms. As with the research on supplementation, the elimination perspective has been explored extensively over a long period, at least since Feingold's observations in the 1970s. The empirical data have demonstrated a relationship between food additives and behavior in some children with some additives, but not consistently (9). One interesting effect of the uneven scientific results has been a dramatic split between parent advocacy groups, powered by individuals convinced of the meaningful role of additive-free food for their children, and the scientific community, which has been stumped as to how to tease out the importance and relevance of additive-behavior interactions. Uneven results in science are often the product of weakly powered studies and poor methodology, but that does not appear to be the case for this topic: some good data have been generated over the years from randomized controlled trials and other study designs. What appeared to be missing until now was a way to understand individual differences in response to exposure.
The background to the current study rests on the novel step taken several years ago by the University of Southampton researchers when they chose to study the relationship between food additives and hyperactive/inattentive behavior in nonclinical samples. Initially, they reported on a sample of preschool children on the Isle of Wight, each of whom was categorized as hyperactive or not and as atopic or not (based on skin prick tests) (10). During all 4 weeks of the study, the children were on a diet free of artificial colors and a preservative (sodium benzoate); during the second and fourth weeks, they were challenged with either a placebo (fruit juice) or juice containing the colors and preservative. The investigators were then able to evaluate behavioral changes in response to the additive-free diet as well as in response to the challenges. There was evidence of improved behavior when the additives were absent and a return of problems upon challenge, but with no preferential effect in those categorized as atopic. Importantly, children in both the hyperactive and nonhyperactive groups showed the relationship between additives and behavior, which was the first demonstration of this phenomenon in a community sample of children.
Their second study was carried out in a different group of 153 preschool children, as well as in 144 children aged 8–9 years (11). Again, it was a community-based assessment of artificial food colors and sodium benzoate. The design was a within-subject crossover between a placebo drink and two active mixes, neither of which contained an exceptionally high level of additives (the maximum was equivalent to the amount found in about 224 g of candy). The overall results supported their 2004 findings, although more consistently in the 8- to 9-year-old children. As the authors acknowledged, the cumulative findings did not tease apart the impact of the artificial colors versus the preservative sodium benzoate, a distinction that will need to be evaluated in a future study.
The current report provides the genotyping information from the same 297 children as the 2007 study (11). As the authors point out, previous genetic studies have failed to completely account for hyperactive ADHD-type behaviors, and the neurotransmitter systems that have received the most attention thus far have not included histamine. Yet histamine H3 receptors are a logical area to explore, as they affect hyperactivity levels in animal models and also influence frontal cortex dopamine release. In addition, there is strong evidence that artificial colors can trigger histamine release and urticaria. The findings described in this issue show that the behavioral effects reported in 2007 were likely moderated by histamine degradation gene polymorphisms in both the preschool and school-aged samples, as well as by the well-studied DAT1 polymorphism in the older age group. Other catecholamine genes did not appear to be relevant in this sample. This is a novel set of findings, as histamine has been relatively neglected up to this point in ADHD genetic studies.
How might polymorphisms of this histamine degradation gene (histamine N-methyltransferase [HNMT]) exert their effect? The logic is as follows: there is evidence that food additive challenges trigger histamine release; H3 receptors are known to be present in the mammalian brain; and HNMT polymorphisms influence histamine clearance. The links are clear, and the paper is a watershed, although it still falls far short of definitive proof. As with any breakthrough, more questions emerge than are answered, including how to distinguish the role of the food colors from that of the food preservative in the challenge drink the investigators used. The social implications of each possibility are important. There are economic and safety issues regarding food spoilage that are obstacles to the ready adoption of restrictions on food preservatives, but there is no cost to health or safety in giving up artificial food colors. It certainly appears that, over and above this set of studies, the cumulative evidence is sufficient for society to demand adherence to the precautionary principle and to begin restricting the use of artificial dyes, at least in foods marketed to children.
Whether the focus is on putting something into the diet (micronutrients) or taking something out (artificial food colors), the recent research on nutrition and mental health is progressing rapidly. What Stevenson and colleagues have done is open the door to “making sense” of their study results by examining an underlying mechanism. Further work in these areas, well supported by funding agencies, could ensure that the topic of food additives and behavior does not remain a tomato.