Cultural factors affect all aspects of mental health care, from the patterning of patient symptoms to the illness classifications of clinicians and the language used to co-construct illness narratives (1). Nevertheless, generating scientific evidence to train clinicians in cultural competence has been a challenge. Studies of cultural competence training have not employed standardized cultural assessment tools, leading to differential appraisals of their effects on clinical outcomes (2). One promising tool is the DSM-5 Cultural Formulation Interview (CFI), which is disseminated free of cost by the American Psychiatric Association (APA) (3). The CFI is a 16-item, semistructured questionnaire created through a literature review on clinically based cultural assessments, a field trial with 321 patients and 75 clinicians, and expert consensus. The clinicians in the field trial preferred active, case-based, behavioral simulations that allowed them to practice using the interview by taking turns as patients and clinicians rather than passive training methods, such as watching videos (4). This paradigm is known as "active learning," an interactive process in which participants develop skills by reflecting upon the content rather than by passive attendance at didactic lectures (5).
Subsequent studies have examined how the method of training affects psychiatric residents' use of the CFI as a cultural competence tool. Investigators who distributed the CFI and a written summary of its contents from DSM-5 as the training method found that residents without prior social science exposure felt that the CFI did not address cultural issues (6). However, active learning improved perceptions of the CFI and its training. In a study of 22 residents, a lecture on health disparities was combined with a group activity in which participants shared personal reflections on cultural identity (7); scores on the cultural knowledge subscale of the cultural competence assessment tool improved. In another study, 16 residents reported increased comfort with the CFI after practicing with it and receiving feedback (8). These findings raise questions about how to create a training intervention that elicits active learning and can be applied to all providers, not just residents. This column discusses the creation of a CFI online training module developed with consumers in recovery, clinicians, providers, the New York State Office of Mental Health (OMH), and the APA. It also describes the profile of participants who completed this training. Notably, this study is the first to examine CFI training with providers other than psychiatric residents.
Description of the Module
The Center of Excellence for Cultural Competence (CECC) and the Center for Practice Innovations (CPI) at OMH’s New York State Psychiatric Institute (NYSPI) developed the module after receiving copyright permission to use the CFI from the APA in 2013. The CECC contributed content expertise by identifying providers to model CFI use. The CPI recruited consumers in recovery who shared their experiences by responding to CFI questions. The CPI contracted with a vendor to create the module. Activities included using instructional design principles to organize content and create interactive experiences for learners, filming expert commentary and consumer CFI sessions, and creating the Web interface. The CPI’s learning management system stored deidentified demographic and evaluation data. The CECC and CPI completed an application for physicians, social workers, and substance use counselors to receive continuing education through OMH. The NYSPI Institutional Review Board approved the study.
The training module's learning objectives are to understand the DSM-5 definition of culture and how the CFI puts it into practice, understand the theory and content behind the four domains and 16 questions of the CFI, and identify barriers and possible solutions for implementing the CFI in the trainees' setting. The module consists of brief presentations delivered by the senior author on the importance of cultural assessments in psychiatry, professional recommendations for clinicians to demonstrate cultural competence, and the CFI's four domains. Afterward, participants complete a demographic survey. When learners click on each domain, videos appear of clinicians interviewing consumers. CFI questions also appear onscreen simultaneously to optimize multimodal audiovisual learning (4). Segments from five video interviews with different clinicians and consumers demonstrate the range of information that the CFI can elicit. At intervals, participants complete surveys to reflect on the CFI questions they like most, barriers to using the CFI in their practice, and strategies for overcoming such barriers, in line with research on implementing the CFI in real-world settings (9). Participants can save their responses for personal reference. This strategy introduces aspects of active learning, given that participants must complete content-based questions to proceed through the module rather than only passively watch videos. The module can be completed in an hour.
The module went live in March 2017. It is available without cost to providers in New York and on a low-cost, sliding scale worldwide by contacting [email protected]. Thus far, participants have been recruited through the CPI listserv, which reaches approximately 20,000 people.
Participants’ Characteristics and Responses
As of January 2, 2018, 423 providers had completed the module, including eight (2%) psychiatrists, 27 (6%) psychologists, 18 (4%) nurses, 194 (46%) social workers, and 176 (42%) other providers (such as counselors, peer providers, case managers, and therapists). Of these, 320 (76%) had a master's degree; 44 (10%), a bachelor's degree; and 36 (9%), a doctorate. Of the 423 providers, 339 (80%) worked in outpatient settings, with equal numbers (N=23, 5% each) in inpatient and assertive community treatment settings. Participants estimated the amount of cross-cultural training they had received over the past five years: 79 (19%) reported less than five hours; 134 (32%), five to 10 hours; 98 (23%), 11 to 25 hours; 70 (17%), 26 to 50 hours; and 42 (10%), 50 or more hours. Three hundred (71%) took the training for continuing education credit. Fifty-eight participants (14%) identified their ethnicity as Latino; in terms of race, 296 (70%) identified as white, 63 (15%) as black, 17 (4%) as Asian, 10 (2%) as Native American, and 35 (9%) as mixed or other. Ninety percent were born in the United States.
Participants evaluated the module on a 5-point Likert scale, from 1, strongly disagree, to 5, strongly agree. The module received an overall mean±SD evaluation score of 4.13±.80 from the 423 learners. The lowest mean score was for wanting further CFI training (3.82±.92). Two items tied for the highest mean score—training that met stated objectives (4.26±.77) and better understanding of the type of information obtained through the CFI (4.26±.76). The training appeared to elicit agreement on the item that participants could provide DSM-5 definitions for culture and cultural assessment, which received a mean score of 4.17. [A table providing the scoring breakdown and mean score per item is available in an online supplement to the column.]
One learner's optional response typified other answers about how the training helped emphasize culture: "It comes down to finding out who the person really is and not simply making unqualified judgments or stereotypes." Another learner wrote, "The questions focusing on how the [clients perceive] themselves and how they think that their support network perceives them [were helpful in cultural assessment]."
Finally, participants completed a closed-ended questionnaire on whether the training would influence their implementation of the CFI. Of 423 participants, 247 (58%) indicated that this module would result in practice changes, 164 (39%) responded that the module would “change the management and/or treatment of my patients/clients,” and 83 (20%) responded that the module would help “create/revise protocols, policies, and/or procedures.” Many (N=174, 41%) responded that the module “validated my current practice and that no changes will be made,” and three people (1%) were unsure about changing their practice because of systemic barriers, such as needing more time.
Discussion and Future Directions
This column contributes to scholarship on cultural competence by presenting data on the largest sample of mental health providers trained in a standardized format through the CFI. A training module requiring participants to demonstrate active engagement with content elicited agreement about the CFI's usefulness as a cultural competence tool. Possible caveats include the following: more than 85% of participants were social workers, mental health counselors, or other providers besides physicians (6–8) and may have been biased toward valuing cultural assessments. Over a quarter had more than 25 hours of cross-cultural training over the past five years, exceeding the typical training for general psychiatric residents, the only population studied since the CFI's release in DSM-5. Our participants worked in public mental health settings and chose this module for continuing education rather than being mandated to use it. Social desirability bias may partly be responsible for the training's positive responses. It is possible that psychiatrists chose not to participate because they see their roles as limited to medication management. Future work could examine whether training and evaluation outcomes differ for providers with less cultural competence training and experience. The APA and the American Association of Directors of Psychiatric Residency Training could consider publicizing the module to reach more attending psychiatrists and trainees, respectively.
Our work partly responded to prior critiques. Cultural competence initiatives across medicine suffer from a lack of standardization, in both content and evaluation (2). The CFI offers one standardized tool for training in cultural competence, which we believe is a lifelong process, and the evaluation questions listed in the online supplement can be used for posttraining assessment. Almost 60% of participants indicated that the CFI could change their practice or policies, and over 41% responded that the module validated their practice. However, these are self-reported scores, and studies on CFI implementation are needed. Such studies could explore how trainees incorporate information from the module into their practice, either through ethnographies of patient-clinician interactions or independent ratings of taped patient-clinician interviews. The CFI fidelity instrument can help researchers and administrators determine whether certain training packages affect the quality of implementation (3). For example, is the online training module sufficient to ensure fidelity? Does it need supplementation with more active types of learning, such as behavioral simulations or expert feedback? How can programs sustain long-term fidelity while minimizing costs? Do training and fidelity change on the basis of practice setting, patient cohort, provider experience, or type of health care organization? The standardization of a CFI training module and fidelity instrument enables clinical trials to test the CFI's mechanisms of action by examining its relationships to patient satisfaction, symptoms, and quality of life (10). In recognition that cultural competence is in the public interest, we encourage others to use the module and share their experiences.