The involvement of patients and other stakeholders across all stages of health-related research is now a broadly recognized priority (1–6). For example, stakeholder inputs before project conception help ensure that research questions align closely with the concerns of people directly affected by a given condition or care practice. Incorporating the expertise, lived experience, and social connections of stakeholders improves key study elements such as instrument design or participant retention. Thorough interpretation of results requires diverse perspectives. When relevant patients, providers, and administrators are involved in decisions about how study findings are disseminated, the reach of the findings is broadened, and their implementation may occur sooner. Engagement demonstrates respect for the people affected and advances the goal of high-quality research. Accordingly, the U.S. Patient-Centered Outcomes Research Institute (PCORI) (1, 2) and multiple other private and government sponsors, most explicitly the United Kingdom’s National Institute for Health Research (7, 8), now insist on stakeholder involvement in the research process.
Patient engagement may highlight and help overcome particular challenges within research on mental health care, such as inadequate integration with other medical care, the marginalization of affected communities, disrupted connections between patients and health systems, and the divergence between traditional patient outcomes in psychiatric research (e.g., symptoms and hospitalizations) and outcomes that patients identify as especially important (e.g., quality of life and a sense of identity) (9–14). Indeed, the origins of the broad movement toward patient-involved and even patient-led research lie in the stigmatization, patronization, and neglect experienced by key communities of patients (e.g., those living with HIV, mental illness, or disability) who were predominantly outside of traditional medical power structures and became motivated to reform the practice of research (8, 15).
It can be difficult to characterize current engagement in health research through a review of the research literature because approaches, settings, and terminologies vary so widely, and researchers often report only their study findings without providing the details of the underlying engagement processes (16, 17). The literature includes prominent examples of patient-involved prospective surveys and projects aimed at developing research agendas (18–20). Many other reports of research with significant stakeholder engagement are either randomized controlled trials or qualitative studies focused on specific clinical questions. These studies are typically conducted in local clinical settings and engage with patients and other stakeholders from those settings (9, 21–25). Some focused clinical studies have leveraged partnerships with national patient advocacy organizations (13, 26–28). Patient engagement has not been prominently reported in research that relies on large health system databases. The potential for greater patient involvement in this type of research is currently developing in “patient-powered research networks” that make use of web-based communication and data collection technologies (29–31), but reports from completed projects are scant. Our review of the current literature has revealed a paucity of published reports of patient engagement in retrospective studies using health system data and focusing on broader policy concerns such as insurance design or access to care.
This article presents an example of quantitative health systems research that uses large-scale administrative data and incorporates substantial patient stakeholder engagement. Our project studied the impact of different commercial health insurance arrangements in the United States on outcomes for individuals living with bipolar disorder and how people with this condition cope with the costs of care. Here, we explain how different engagement components fit into our project and furthered its central goal of improving our understanding of the effects of insurance benefit design on access to behavioral health care. Our engagement approach and experience may inform future research collaborations. The detailed research results of this project are published elsewhere (32–36).
Research Context
Our project investigated health insurance design, focused on individuals with bipolar disorder, and combined two distinct studies, one quantitative and one qualitative. For the quantitative study, we used a national database of commercial health insurance claims to evaluate how a change from a low-deductible insurance plan to a high-deductible plan affects care and costs for people with bipolar disorder (32). High-deductible health plans (HDHPs) require plan members to pay 100% of the costs for many services at the beginning of each year, until the deductible threshold (e.g., $2,500) has been reached. Once the deductible is reached, “traditional”-type coverage begins, with members paying copayments or coinsurance for most services. HDHPs usually have lower monthly premiums than traditional plans but require more patient cost-sharing for services received, posing a barrier to care for some people and shifting costs onto sicker individuals, who require more care.
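To make these cost-sharing mechanics concrete, the minimal sketch below shows how annual out-of-pocket spending accumulates under a simple deductible-then-coinsurance design; all plan parameters and claim amounts are hypothetical illustrations rather than figures from the plans in our data.

```python
# Minimal illustration (hypothetical numbers) of deductible-then-coinsurance cost-sharing.

def member_out_of_pocket(claims, deductible, coinsurance):
    """Total member cost for a year's claims under a simple benefit design."""
    remaining_deductible = deductible
    total = 0.0
    for allowed_amount in claims:
        # The member pays 100% of allowed charges until the deductible is met...
        paid_toward_deductible = min(allowed_amount, remaining_deductible)
        remaining_deductible -= paid_toward_deductible
        # ...and then pays only coinsurance on the remainder of each claim.
        total += paid_toward_deductible + (allowed_amount - paid_toward_deductible) * coinsurance
    return total

claims = [180.0, 220.0, 950.0, 400.0, 1300.0]  # hypothetical allowed amounts for one year
print(member_out_of_pocket(claims, deductible=2500.0, coinsurance=0.20))  # HDHP-like plan
print(member_out_of_pocket(claims, deductible=250.0, coinsurance=0.20))   # traditional-style plan
```

Under identical care use, the hypothetical HDHP member pays roughly three times as much out of pocket as the member in the traditional-style plan, which illustrates the cost shift onto sicker individuals described above.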
We identified individuals with bipolar disorder through diagnosis codes in anonymized claims data. We created two cohorts on the basis of specific employers and whether these employers either kept all their covered employees in a traditional, low- or no-deductible plan for at least 2 years or switched all their employees from a traditional plan to an HDHP in the second year. In a retrospective, quasi-experimental design, we compared outcomes after the switch into HDHPs for matched enrollees between the two cohorts. We measured changes in routine mental health services use, acute services use, and patient out-of-pocket costs. Our research questions are summarized in Box 1. The aims of the quantitative study were to estimate increases in cost burden for HDHP enrollees and to determine whether people responded to higher costs by changing their utilization of care.
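As a schematic illustration of this quasi-experimental logic (not the study’s actual statistical specification, which is reported elsewhere (32)), the sketch below contrasts pre-to-post changes in a hypothetical switch cohort against a matched control cohort; all variable names and dollar amounts are invented for the example.

```python
# Hypothetical person-period aggregates from claims data; a simple
# difference-in-differences style contrast, for illustration only.
import pandas as pd

df = pd.DataFrame({
    "person_id": [1, 1, 2, 2, 3, 3, 4, 4],
    "cohort":    ["hdhp_switch"] * 4 + ["control"] * 4,  # assigned at the employer level
    "period":    ["baseline", "follow_up"] * 4,
    "oop_cost":  [700, 1900, 650, 1750, 720, 800, 690, 760],  # hypothetical annual $ out of pocket
})

means = df.groupby(["cohort", "period"])["oop_cost"].mean().unstack("period")
change = means["follow_up"] - means["baseline"]   # within-cohort pre-to-post change
did = change["hdhp_switch"] - change["control"]   # contrast between cohorts
print(means)
print(f"Illustrative difference-in-differences estimate: ${did:.0f}")
```

In the actual study, matching and the measures of routine mental health services use, acute services use, and out-of-pocket costs required considerably more care than this toy contrast conveys.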
In a parallel qualitative study, we conducted in-depth interviews focused on people living with bipolar disorder and using employer-sponsored coverage, including commercial plans of diverse types (33). We explored respondents’ experiences navigating the health care system and making choices about care in the context of their cost-sharing requirements (Box 1).
These two halves of our mixed-methods investigation were meant to inform each other throughout the 3-year project period, with interviews probing and revealing phenomena not readily observable in claims data. The overarching research design could be described as “concurrent triangulation,” with equal priority placed on the quantitative and qualitative elements (37). The academic collaborators on the core team had extensive experience in using quantitative methods to study effects of HDHPs generally and of insurance coverage design for populations with serious mental illness (38–50). We incorporated the qualitative study and an array of stakeholder engagement components into this project to enhance its depth, comprehensiveness, interpretation, and impact.
Engagement Components and Protocol
The project had four major engagement components (see diagram in an online supplement to this article): core team members experienced with bipolar disorder, a stakeholder advisor panel that met quarterly, the in-depth interview study, and outreach via social media and e-mail to communities affected by bipolar disorder. The interview study occupied a unique position, simultaneously representing research activity and engagement.
The lynchpin of our engagement with people living with bipolar disorder was the partnership between academic researchers and the Depression and Bipolar Support Alliance (DBSA), a large national nonprofit organization focused on mood disorders (51, 52). DBSA is distinctive for its focus on advocacy, local support groups, wellness, and online tools and information and is led primarily by “peers”—that is, individuals with mood disorders—and their family members. The term “peers” is preferred in the community over “patients,” which is unsuitable for individuals not in formal treatment and too limited to describe even people who are in treatment. “Peer” better conveys a person in full. An executive officer of DBSA (P.M.F.) served as coinvestigator on our core research team, contributing to all research activities and coleading the interview study and community engagement.
Although the project’s engagement with DBSA peers and family members is the focus of this article, our stakeholder engagement more broadly included psychiatrists and individuals representing health care delivery systems, insurers, and employers.
Box 2 summarizes the backgrounds of our core research team and stakeholder advisor panel. Members of both bodies had substantial technical expertise and often wore more than one hat. Most were present during protocol development, before application for funding.
Our DBSA coinvestigator led recruitment for the in-depth interviews, including distribution of flyers to leaders of all the approximately 300 local DBSA chapters in the United States. Volunteers contacted her for preliminary screening. Then, an interviewer based in an academic setting carried out the secondary screen, obtained informed consent, and conducted 1-hour interviews. Respondents were individuals with bipolar disorder or family members involved in their care. Following a semistructured guide, the interviews explored domains such as the types of care people used, costs, affordability, and priorities for care (Box 1). The interview guide was jointly drafted by two academics and the DBSA coinvestigator.
We conducted community outreach and gathered feedback exclusively through DBSA channels approximately quarterly over the 3 project years. Our protocol called for selecting topics on a rolling basis to fit project needs as they developed. For example, we envisioned seeking at various times further insights into care for bipolar disorder and how people manage it, community interpretations of and interest in preliminary findings, and the most effective ways of presenting results. DBSA offered multiple avenues for reaching people with personal experience or professional knowledge of bipolar disorder, including DBSA’s Facebook page, the “Care for Your Mind” blog (53), local DBSA chapters, a network of regional affiliate advocacy organizations, and DBSA’s longstanding Peer Council.
Our engagement approach was shaped mainly by the PCORI Engagement Rubric (54). PCORI requires that sponsored research questions be patient-centered (informed and endorsed by patients) and that patients and other key stakeholders be meaningfully involved. Although PCORI’s Rubric was strongly influenced by the community-based participatory research (CBPR) movement (12, 15) that rose to prominence in the decades before PCORI’s establishment, the two frameworks are not interchangeable. CBPR emphasizes equitable sharing of research roles among partners and stresses results that advance social justice and eliminate disparities. PCORI emphasizes comparative effectiveness and generating information to support better decision making in health care. In addition, CBPR is especially focused on the patient or service user, whereas PCORI presses for involvement by multiple categories of stakeholders and decision makers, including family members, clinicians, and health system administrators. PCORI welcomes the use of CBPR strategies in the research it funds but is flexible on what engagement may look like, and a wide range of projects and pathways has emerged (1, 8, 55, 56). Our approach aligns with each distinct principle of CBPR (e.g., mutual benefit, empowerment, and sharing of information) (12, 15) to an extent, but not with the across-the-board rigor that would earn the CBPR label.
Engagement as Experienced
Investigators
The project benefited from the fast friendship that formed between the academic and advocacy coleads on the interview study and community engagement (J.M.M. and P.M.F.). Extensive telephone, e-mail, and face-to-face communications fostered intensive colearning, whereby each colead developed an understanding of the other’s professional world, norms, and language. For example, the DBSA coinvestigator received formal training in the protection of human subjects, and we shared thoughts on how institutional review board requirements for health-related research contrast with the freedom DBSA typically enjoys as an advocacy organization gathering and reporting information about its audience. We discussed other relevant projects under way, eventually forging new collaborations. Early in the current project, the academic colead urged recruiting interviewees from a defined population—DBSA chapters—while expressing skepticism of web-based platforms that the DBSA colead felt would accelerate recruitment. The academic colead felt that a single recruitment avenue offered clarity in later reporting and confidence that interviewees had “authentic” roots in DBSA. We adhered to that decision, but, over time, the authenticity of the DBSA Facebook audience became increasingly obvious, and slow chapter-based recruitment became a concern. On the specific strategies for outreach to chapters and Facebook readers, DBSA consistently charted the course. The two coleads jointly crafted the outreaches and analyzed the resulting feedback.
The academic colead acted as liaison between qualitative and quantitative project activities because it was not feasible for all investigators to fully immerse themselves in both. Nevertheless, all investigators attended monthly project meetings, contributing to protocol refinements for both studies. Our DBSA coinvestigator brought contemporary issues into these discussions, such as the growing role of paid peer counselors and the distinction between “getting to wellness” and merely eliminating clinical symptoms. Although the term “patient” is prominent in publications about health care, its costs, and research engagement, many DBSA peers consider the word too narrow and disempowering, so our investigators learned to seek alternatives. The DBSA coinvestigator, in turn, reviewed our progress with DBSA’s peer-led national headquarters throughout the project period.
Stakeholder Advisor Panel
Quarterly meetings brought together the stakeholder advisors and the full research team. Investigators reviewed recent study results and community feedback, upcoming plans, and any problems or pending decisions. The meetings were opportunities to take stock of the project and receive fresh perspectives and ideas. All attendees had substantial content expertise, and the moderate size of the group permitted discussions that were simultaneously high-level, informal, and generally satisfying. Nevertheless, after a year, in response to a peer representative’s concerns, we revised our meeting format. The academic researchers came to panel meetings with greater familiarity with both the project and the other attendees but with little experience in engaged research. In their diligence, researchers sometimes inadvertently dominated discussions, and some peer representatives felt unheard. Recognition of this problem was slow because most attendance was by telephone. To rebalance the meetings and elevate peer voices, we adjusted our procedures in two ways: we distributed slide decks further in advance, with slides explicitly labeled “questions for our advisors,” so that external advisors felt readier to contribute, and we established a hierarchical speaking order for discussion periods. We invited comments first from peer representatives, next from other external advisors, and then from researchers. Participants agreed that the discussion quality improved as a result of these changes.
Most research team members were unable to examine interview transcripts or comments gathered through social media in detail, so quarterly meetings were their primary engagement experience. Peer representatives on the advisor panel offered interpretations of study results through the lens of personal experience, suggested which results would be useful for peer audiences, and highlighted health care system dysfunctions deserving closer study. The liaison investigator encouraged follow-up comments and conversations after these meetings; these exchanges were frequent and fruitful and strengthened feelings of trust.
In-Depth Interview Study
The in-depth interviews were highly valuable within the project, both as a formal research aim and activity and as a way to incorporate more peer voice and experience into the project generally. Our claims data came from a single large insurer, and our quantitative study cohorts were narrowly defined. By contrast, interview respondents reported a range of coverage situations—all were employer sponsored, but with differing durations, levels of generosity, and benefit structures. Respondents shared details of their living circumstances and journeys toward recovery and provided multidimensional views of their experience navigating systems of care for bipolar disorder. They heightened our awareness of issues such as the U.S. psychiatrist shortage, narrow provider networks, self-coordination between primary and behavioral care, disruptions in care following routine changes in employer benefits, mental illness stigma, complications of psychotropic treatment, and strategies for achieving wellness and reducing costs. Insights from the interviews guided analyses of the claims data. For example, stronger awareness of network-related problems led us to examine regional differences more closely and develop plans for detecting evidence of care that is obtained entirely outside of coverage.
Our claims data analyses detected only modest effects on care utilization following the shift into HDHPs, despite substantial increases in cost-sharing. Specifically, we observed evidence of cost-driven reductions in psychotherapy visits but not in psychiatrist visits or medications. These modest findings were more easily understood in light of our interview findings. Respondents reported great reluctance to tinker with their mental health care if it was working; many explained that when money was tight, they usually tried to cut back in other areas, to the point of genuine hardship, but not on their care for bipolar disorder (33). Conducting interviews thus augmented our project by informing and improving empirical analyses and extending beyond them with distinct findings. Having a mixed-methods research design can hedge against the disappointment of “null” statistical results. Interviews seldom produce null results, because their purpose is to unearth what people experience and say, and they almost always yield some insight or finding. Moreover, by illustrating the trade-offs and hardships in daily life that often stem from high costs of care, the interviews revealed consequences of high costs that were not detectable in claims data. This strengthened our net findings and increased the likelihood that policy makers consuming the research might consider reforms to protect patients from such harms.
The interviews were also deeply moving: people told vivid, intimate stories with granularity and humor. They humanized our research questions in ways that claims data cannot, sharpening our sense of the project’s potential impact. The interviews effectively created another community of clients for our research, reminding us of its importance and the need to do it well.
Community Outreach and Feedback
We periodically reflected on what aspects of the project would be clarified with more real-world examples or would be augmented by fresh perspectives; we formulated these into questions for DBSA’s audience. These outreach-and-feedback activities overlapped in purpose with our interview study in that we received many additional stories and insights gained through lived experience, adding to our research ideas, understanding, and motivation.
However, in the research realm, engagement feedback is classified very differently from study data (i.e., interviews and claims data). We could learn from these comments and analyze and summarize them but could never publish studies from them; they remained for internal use only and were not governed by standard human subject protections. Also, feedback comments were not collected with the same rigor as the study data. Questions could be changed at the last minute on Facebook and the blog website, and anyone on the Internet could respond. Small sample sizes were not cause for concern. For researchers, this flexibility felt liberating. We floated open-ended questions and structured capsule surveys of varying lengths and topics (Table 1). Over time, we stopped targeting the questions solely to people with employer-sponsored insurance. Instead, we asked audience members to indicate their coverage situation and then examined whether their responses were consistent across different coverage types. Also, by the middle of the project, all social media outreaches used a capsule survey design. Each outreach presented about a dozen questions with a mixed format—typically, a multiple-choice item, followed by an invitation to tell us more in an open-response item. The multiple-choice format allowed audience members to respond quickly by using a cell phone if that was their preference. Although the feedback did not offer formal empirical validation of our study data, it provided us with “gut checks.” Community comments also generated hypotheses for potential exploration in future research (e.g., on the positive and negative aspects of employment for wellness or on cost conversations with providers).
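For readers curious about the capsule-survey format, the sketch below shows one way such paired items can be represented; the question wording and options are hypothetical and are not items from our actual outreaches.

```python
# Hypothetical capsule-survey items: each quick multiple-choice question is paired
# with an open-response prompt inviting people to tell us more.
capsule_survey = [
    {
        "choice_item": "In the past year, did costs lead you to delay or skip any care for bipolar disorder?",
        "options": ["Yes", "No", "Not sure"],
        "open_item": "If you are comfortable sharing, tell us more about what happened.",
    },
    {
        "choice_item": "How do you currently get your health coverage?",
        "options": ["Through an employer", "Medicare or Medicaid", "Other", "No coverage"],
        "open_item": "What would you want researchers to know about your coverage situation?",
    },
]

for item in capsule_survey:
    print(item["choice_item"])
    for number, option in enumerate(item["options"], start=1):
        print(f"  {number}. {option}")
    print(f"  Open response: {item['open_item']}\n")
```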
Outreaches to the DBSA Peer Council or DBSA Advocacy Network were an excellent complement to DBSA’s social media presence, which is estimated to reach millions of individuals. Compared with the social media activities, these outreach efforts reached smaller numbers of individuals (about 300 and 2,500, respectively) who were identifiable to DBSA and had enduring relationships with the organization. They included a higher proportion of people with professional experience of bipolar disorder (e.g., clinicians). Because they received multiple e-mails over time about our project, most people in these two groups had longitudinal familiarity with it. Peer Council and Advocacy Network outreaches thus fell somewhere on a continuum between the queries directed to the vast DBSA Facebook audience and those directed to our intimate stakeholder advisor panel. Because we used e-mail to contact these two groups, we tried to limit the burden (i.e., frequency and length) of contacts and selected questions that took advantage of the deeper DBSA involvement of these groups. That said, even for Facebook- and blog-based outreaches, we felt that four per year was a prudent maximum to avoid audience fatigue with our project and interference with other DBSA priorities for community interaction.
Response tallies ranged from zero to several hundred per outreach, likely reflecting several factors, including which platform was used, level of interest in specific topics, whether people felt they could comment productively, occasional technical glitches, and fluctuations in competing demands for audience attention. We struggled more than anticipated to generate feedback about our project or its results. Response volume was greater when we asked for people’s own experiences. Negative reactions were rare (e.g., someone pointing to the role of academic experts in creating prominent problems in U.S. health care). Instead, we were encouraged by frequent, unsolicited “cheerleading” comments about the importance of research that involves peers and shines light on issues like health care affordability.
Dissemination of and engagement around our actual findings are ongoing; plans include joint appearances at scientific conferences and posting links to articles and lay synopses on DBSA platforms. Solid stakeholder involvement throughout a project helps ensure that there will be multiple partners at the project’s end who are excited to spread its messages and help identify potential opportunities for translation into practice.
Discussion and Conclusions
Health services research using secondary administrative data is increasingly common, as is patient-centered or patient-engaged research, yet reports of projects bridging administrative data and patient engagement have been rare. Our research examined the use and affordability of health services among people who have bipolar disorder and obtain health care through employer-based insurance. To support its success, we built in layers of stakeholder engagement. In a fairly traditional way, we strengthened our understanding of the subject matter by involving technical experts in psychiatry, health care delivery, insurance, and employer oversight of health benefits. More novel for our research team and for health services research that relies on retrospective data sources was the involvement of individuals with lived experience of bipolar disorder; this engagement was multipronged, longitudinal, and dynamic. DBSA, as a major national advocacy group with sophisticated operations already focused on similar topics, was an invaluable partner for this project. DBSA had the ability to open multiple avenues of essential engagement promptly and professionally. We engaged peers and their family members before, during, and after the project, as a coinvestigator, as advisor panel members, as in-depth interview respondents, and as ad hoc commenters. Having a variety of engagement components was itself beneficial, allowing hundreds of peer stakeholders to contribute to our project according to their levels of interest, ability, and engagement with DBSA. This variety yielded both range and depth in additional perspectives.
We met a goal of fostering engagement that was “more than token” (7, 57) and affected the research and everyone involved in it. Career researchers could better grasp the real-life context around observable health system events and the current concerns of an activated community. Engagement strengthened the research questions, methods, and interpretation of results and stimulated new lines of inquiry. In research that emphasizes electronic data sources, with no direct patient contact, the addition of patient voices is especially valuable. Patient voices are also especially needed in research on a stigmatized condition like bipolar disorder—to boost empathy and correct preconceptions. Further, simply incorporating periodic communication with external parties added value to our project; it enforced deliberate self-checks on our progress and improved communication within the team.
For those in DBSA, engaging in our project was an opportunity to be heard and shape the outside narrative about their community, learn and participate in research practice, and teach about advocacy practice. An often-overlooked aspect of recovery is the empowerment that one receives through amplification of one’s own voice or that of one’s community—especially for stigmatizing conditions. Knowing that researchers are invested in understanding the peer perspective beyond data analytics supports this key component of wellness for many individuals. The DBSA national organization benefited from acquiring skills outside of its areas of core competency. Its engagement created awareness about the power of research to bring about systemic change and positioned DBSA to capitalize on future research opportunities.
Ideally, the engagement between academics and peers would have begun earlier. The claims-based study design was largely determined in advance, building on previous studies focused on other health conditions. The quantitative design also had less inherent flexibility because of the limited set of available measures. Stakeholder engagement was, however, a major force shaping our qualitative study protocol. Another limitation of our engagement may have been heavy reliance on a single national partner organization. This approach was practical but potentially limited the scope of additional perspectives and may not have fully represented peers with bipolar disorder. The multiplicity of voices reaching us from that single organization tempered this concern somewhat. We appreciate that a fully participatory “co-production” approach (12, 15, 58, 59) to engagement would look different—it likely would involve larger numbers of patient representatives, including people from nonprofessional employer settings, in more phases and activities of the project, with more patient community control of project direction. We do not know what we may have missed by choosing a relatively pragmatic path for engagement.
The academic and advocacy worlds have different cultures and priorities yet share a desire for high-quality research results that may translate into future improvements in care. Our example shows how a complex, longitudinal, and flexible partnership can enhance research while respecting the distinct roles and needs of those involved. Embracing and including people with experience of living with a serious illness, or of caring for someone who does, is an ethical and good-practice imperative for responsive research.
Acknowledgments
The authors thank the other members of their stakeholder advisor panel (Gregory E. Simon, M.D., M.P.H., Francisca Azocar, Ph.D., Denise D’Aunno, M.B.A., Kenneth Dolan-Del Vecchio, M.S.W., Kristin A. Olbertson, J.D., Ph.D., Ken Duckworth, M.D., and James Sabin, M.D.) for their continuous engagement with this study, and hundreds of members of the Depression and Bipolar Support Alliance online and advocacy community who supported this study with their experiences and insights. Additional research team members (Stephen B. Soumerai, Sc.D., Fang Zhang, Ph.D., Robert F. LeCates, M.A., Xin Xu, M.S., Carina Araujo-Lane, M.S.W., and Jamie Wallace, M.P.H., of Harvard Medical School and Harvard Pilgrim Health Care Institute) were essential to engagement efforts and project success.