Aside from an understandable need to support the "Full Employment for Anthropologists Act of 2008," there is a very good reason to embrace the call to expand use of qualitative research methods in mental health services: we can't understand what's actually going on without them. This isn't special pleading—it is a mundane fact. If we want to understand how it works (whatever "it" may messily turn out to be), what it means, and the enabling contingencies that seem to sustain it, we need to deploy such methods.
That both quantitative and qualitative methods are needed and appropriate, depending upon the task at hand, seems deniable only by those with a professional stake in being contrary. As Robins and colleagues argue in the lead article in this issue, parity in planning and engaged communication make for fruitful mixed-methods projects—"Q-squared," as our merry European colleagues sometimes call it.
Calls for closely watched, carefully documented, and scrupulously interpreted accounts of programs in action, of undeclared (and potentially disabling) agendas, and of tacit refusals that masquerade as enthusiastic endorsements steadily pour in from clinicians and researchers. It isn't hard to see why. Within the irksome, recalcitrant, unsettled, misbehaving nest of hidden variables—what Hohmann and Shear once mischievously referred to as "the noise of real life"—often lie undetected arbiters of program success. Such an arbiter can be as elusive as the added value of unpaid work or as plain as the transportation costs no one asked about.
Published examples of qualitative methods in action are becoming more common. In this issue, Palinkas and colleagues found evidence for Schön's "reflective practitioner," but not as a rare exemplar of craft. Instead, such practitioners appear as common and ill-accommodated burrs in intervention studies. When this finding is added to the study's jarring contextual finding that nearly a third of clinicians withdrew from the study for reasons external to the intervention itself, the difficulties of conducting and learning from such research projects become evident. In a remarkably matter-of-fact passage, however, Palinkas and colleagues describe how "creativity and compromise" were used to keep clinicians engaged—a hopeful if inadvertent rejoinder to the problems inventoried by Robins and colleagues.
Requiring students to take sides in this long-running civil war between methods of inquiry has outlived its utility. A working knowledge of each other's toolkits and sensibilities seems essential to productive work in the field today. And more: Clifford Geertz argued that ethnographic proficiency is key to competent citizenship on the globalized home front. Mixed-methods research is good practice in diversity—in "learning to grasp what we cannot [or choose not to] embrace." The skill of bridging cultures is "arduously learned" (Geertz again) and, once learned, something we must "work continuously to keep alive." Words to live (and work) by.