Published Online: 15 January 2021

Clinicians’ Cognitive and Affective Biases and the Practice of Psychotherapy

Abstract

Objective:

Cognitive and affective biases are essentially connected to heuristic shortcuts in thinking. These biases ordinarily function outside of conscious awareness and potentially affect clinical assessment, reasoning, and decision making in general medicine. However, little consideration has been given to how they may affect clinicians in the conduct of psychotherapy. This article aims to illustrate how such biases may affect assessment, formulation, and conduct of psychotherapy; describe strategies to mitigate these influences; and draw attention to the need for systematic research in this area.

Methods:

Cognitive and affective biases potentially influencing clinical assessment, reasoning, and decision making in medicine were identified in a selective literature review. The authors drew from their experiences as psychotherapists and psychotherapy supervisors to consider how key biases may influence psychotherapists’ conduct of psychotherapy sessions.

Results:

The authors reached consensus in selecting illustrative biases pertinent to psychotherapy. Included biases related to anchoring, ascertainment, availability, base-rate neglect, commission, confirmation, framing, fundamental attribution error, omission, overconfidence, premature closure, sunk costs, and visceral reactions. Vignettes based on the authors’ combined experiences are provided to illustrate how these biases could influence the conduct of psychotherapy.

Conclusions:

Cognitive and affective biases are likely to play important roles in psychotherapy. Clinicians may reduce the potentially deleterious effects of biases by using a variety of mitigating strategies, including education about biases, reflective review, supervision, and feedback. How extensively these biases appear among psychotherapists and across types of psychotherapy and how their adverse effects may be most effectively alleviated to minimize harm deserve systematic study.

Highlights

Cognitive and affective biases, stemming in part from intuitive, fast-thinking processes, can contribute to illogical thinking, affect medical decision making, and adversely affect the conduct of psychotherapy.
Cognitive and affective bias-related processes are likely to mediate experiences of countertransference.
Debiasing strategies applicable in general medical settings can also be applied to reduce the adverse consequences of biases in the conduct of psychotherapy.
That psychotherapists’ judgments regarding their patients are sometimes illogical and can potentially skew what transpires during psychotherapy is not news. As Macdonald and Mellor-Clark (1) aptly put it, “Human nature confers a vulnerability to biases, blind spots, and self-enhancing illusions, which frequently distort our capacity to make rational sense of ourselves and our environment. Freud would hardly be surprised!” Psychotherapists are not immune to these vulnerabilities. Although varieties of countertransference and other factors related to therapists’ subjective judgments have been discussed for more than 100 years, a specific focus on heuristics, cognitive biases, and affective biases among psychotherapists has been limited. This article invites attention to these issues, illustrates how such biases may adversely affect the conduct of psychotherapy, describes strategies to mitigate these influences, and calls for systematic research in this area.
Conceptions of countertransference have expanded over the years. Early Freudian formulations emphasized countertransference as representing psychoanalysts’ unconscious reactions to patients, determined by their own life histories and the contents of their unconscious. These reactions include “personal countertransference,” unconscious hostile and/or erotic feelings toward patients that interfere with objectivity and limit therapists’ effectiveness. Later interpretations expanded to include all of a therapist’s reactions to his or her patients (2, 3). Countertransference may now refer to numerous biases, including those related to gender (4), cultural ethnocentrism (5, 6), religion (7), social class, and other broadly held biases, because they affect how therapists experience patients in psychotherapy (8–11).
Furthermore, the concept of the “intersubjective field,” based in object-relations and self-psychology psychoanalytic theories, has highlighted the intimate emotional interpersonal dance that continuously occurs between therapist and patient. The intersubjective field reflects personality elements and histories brought to therapeutic encounters separately by therapists and patients that join to create new kinds of therapeutic relationships (12). This concept acknowledges that therapists’ subjectivity is requisite and undoubtedly provides key avenues for understanding patients—but it is subjective. Overall, discussions of countertransference and intersubjectivity have focused largely on the contents of these biases but far less on their structural characteristics, that is, exactly how they come to pass.
In contrast, the work of Daniel Kahneman and Amos Tversky and their collaborators and successors focuses on the structural characteristics of heuristics, the mental shortcuts that heavily influence everyday decision making, and on how related cognitive and affective biases come about (13, 14). Heuristics represent cognitive strategies that are automatically and unconsciously used, particularly in decision making (15). Cognitive biases refer to predispositions to think in ways that lead to failures in judgment (i.e., errors that occur when heuristics miss their marks) (16). Affective biases refer to the various ways that emotions and feelings affect judgment (16); emotions and motivation can also influence and provoke cognitive biases (14).
To contrast fast, intuitive thinking (type 1) with slow, logical deliberative thinking (type 2), a dual-process model of thinking has been proposed (17–20). Intuitive thinking is largely driven by heuristics and is subject to numerous systematic errors. This model considers how everyday decision-making processes can be warped by fast thinking, which may lead to such problems as jumping to erroneous conclusions, lazy and prejudicial thinking, and related cognitive biases.
Croskerry and others (21–26) have demonstrated the impact of cognitive and affective biases on diagnostic reasoning, clinical decision making, and quality of care in emergency and general medicine settings. In systematic reviews, Saposnik et al. (27) and Blumenthal-Barby and Krieger (28) have reported on cognitive biases linked to diagnostic inaccuracies and suboptimal management of clinical problems.
This work is highly pertinent to the practice of psychotherapy, in which clinicians constantly make moment-to-moment decisions during initial diagnostic assessments and ongoing psychotherapy about what to attend to, focus on, and highlight. Psychotherapists rapidly shift their attention between what is being discussed and what is being avoided and constantly readjust perspectives for themselves and their patients. Consequently, psychotherapists’ intuitions and instantaneous decision-making processes are likely to be influenced by the same biases; studies have shown the quality of psychotherapists’ intuitions to be highly variable (29, 30). At any moment, why does a psychotherapist attend to or ignore one specific issue rather than another?
Within this context, this article attempts to extend the work on how biases can affect medical assessment and decision making in the practice of psychotherapy. We illustrate how cognitive and affective biases may adversely affect the conduct of psychotherapy, describe strategies to mitigate these influences, and call for systematic research in this area.

Methods

We conducted a selective PubMed literature search to investigate how cognitive and affective biases may affect psychotherapy practice, targeting the terms “cognitive bias” and “psychotherapy” in the title or abstract. The search yielded 12 publications containing these terms. Notably, none of these articles addressed cognitive biases of therapists (11 concerned cognitive biases of patients, and one described cognitive-bias-like states in rats).
To illustrate the impact of cognitive and affective biases on the practice of psychotherapy, we decided to limit our selection of biases to those reported in the general medical literature (21–28). Through discussion, we reached consensus on a list of biases we saw as most pertinent to psychotherapy. We then drew from our experiences as psychotherapists and psychotherapy supervisors to create prototypical vignettes.

Results

We initially selected 17 cognitive and affective biases from the medical literature for consideration. Four of these were similar enough to be combined, reducing our final set to 13, as detailed in the case presentations below and in Table 1. The first case presented below, in which one of the authors (J.Y.) was involved as a colleague, is based on a prior publication (31). The other examples are composites based on clinical elements from our collected experiences rather than on individual cases. The vignettes demonstrate how each type of bias can alter the moment-to-moment conduct of psychotherapy sessions and shape the course of the psychotherapy over time. The biases we enumerate below provide an illustrative, not exhaustive, list.
TABLE 1. Additional examples of cognitive and affective biases pertinent to the conduct of psychotherapy
Bias type | Example
Anchoring: the tendency to perceptually lock onto initial salient features in the patient’s presentation within the diagnostic process and, failing to adjust this initial impression in the light of later information, becoming ossified into a certain way of thinking | Early in a multiyear psychotherapy, in a rare moment of brooding despair, Mr. C. mused about whether life was worth the effort. Dr. D. fastened on this remark and for years tended to think of Mr. C. as potentially suicidal, even though nothing in his past or intercurrent history suggested suicidal concerns.
Ascertainment: the tendency to selectively sample data on the basis of how the clinician’s thinking is shaped by prior expectation | On the basis of her hunch that childhood trauma played a major role in Ms. L.’s development and personality formation, Dr. M. devoted hours to inquiring about possible childhood abuse, neglect, and mistreatment, even when Ms. L. felt she had offered all possible details. At the same time, Dr. M. asked very little about other pertinent life events, interpersonal interactions, and behaviors that proved to be of great significance in Ms. L.’s history and current difficulties.
Base-rate neglect: the tendency to ignore the true prevalence of disease, either inflating or underestimating the occurrence of conditions under consideration | Because the rate of tobacco smoking and intermittent marijuana use among his psychotherapy patients was so high, Dr. F. often neglected to include these issues in his diagnostic formulations or problem lists to be addressed during long-term therapy, even though they were undeniably ongoing causes for health concern and chronically maladaptive coping mechanisms.
Commission: the tendency to act rather than wait, see, and reflect; more likely to occur among overconfident clinicians | Known for being an active psychotherapist, Dr. O. routinely had difficulty restraining herself from offering numerous suggestions and giving advice to patients without fully hearing them out. Although some patients appreciated receiving these suggestions (many unsolicited), others clearly did not. Several patients asked whether this was supposed to be how psychotherapy went—they thought that the idea was for patients to figure things out for themselves.
Confirmation: the tendency to look for confirming evidence to support a hunch or belief, even when substantial evidence exists to refute it, rather than to look for disconfirmation | Having determined that Mr. Q.’s wife was the major contributor to their marital discord, Dr. R. kept asking about her role in their various disputes, underlining what he heard as the wife’s contributions, and supporting Mr. Q.’s contentions that their problems were all his wife’s fault. At the same time, Dr. R. consistently overlooked, disregarded, or minimized the equal or greater roles Mr. Q. played in provoking and sustaining these disputes.
Fundamental attribution error: the tendency to overemphasize dispositional or personality-based explanations for behaviors observed in others (judging the “kind” of persons they are) and to underemphasize situational explanations for their behavior; at the same time, people tend to explain their own behaviors as resulting from situations rather than from their personal dispositions | In treating Mr. X., a man in his early 30s, for moodiness, depressive symptoms, and marital difficulties, Dr. W. tended to attribute all of Mr. X.’s difficulties to an underlying “depressive character,” which Dr. W. depicted as negativistic and pessimistic even though before the current episode of depression Mr. X. had been easygoing and had no history of clinical depression. Dr. W. tended to minimize the impact of two facts: Mr. X. had come under the thumb of a harsh manager at work, and shortly after Mr. X. married a few years ago, his wife developed chronic medical illnesses and became increasingly withdrawn, depressed, irritable, and nonfunctioning.
Omission: the tendency to wait and see or to avoid and neglect difficult issues; more likely seen among self-doubting clinicians; contrasts with commission bias | Dr. P. often felt unsure of himself and felt comfortable sinking into the role of the “silent” therapist who said little. Consequently, he often failed to follow up on important leads or inquire about sensitive issues regarding substance use and sexuality. He rationalized his avoidance by saying that if it were important enough, the patient would bring the issue up spontaneously. He seemed unaware that his failure to inquire or follow up often gave patients the excuse they needed to keep hiding some important issues from themselves as well as from Dr. P.
Overconfidence: the universal tendency to believe one knows more than one does; reflects a tendency to act on incomplete information, intuitions, or hunches; too much faith is placed in opinion instead of in carefully gathered evidence | Characteristically someone who “went with his hunches” during the course of his psychotherapies, Dr. N. was known for boldly declaiming interpretations as facts, some of which showed brilliant insights and some of which were entirely wrong. Although not directly tied to overconfidence per se, even when his errors were subsequently pointed out, he would often attempt to rationalize and justify his mistakes.
Premature closure: the tendency to stop thinking after discovering one explanation or certainty; other closely related biases are the search satisficing bias (the tendency to stop searching once anything at all is found), the unpacking principle bias (the tendency to fail to elicit all relevant information in establishing a differential diagnosis), representativeness restraint bias (the tendency to look for the commonplace but not consider the unusual), and vertical line failure bias (the tendency to maintain narrowly focused, orthodox styles and to think in silos, failing to ask, “What else might this be?”) | After learning that Ms. G. had experienced a sexual assault in early childhood, Dr. H. stopped asking about many other facets of early development, other untoward life events, and Ms. G.’s other problematic interactions with her parents and caregivers. He assumed that all of Ms. G.’s difficulties were attributable to this traumatic event. Thereafter, Dr. H. repeatedly tried to focus Ms. G. on “uncovering” the roots of her many diverse problems as connected to this event.
Visceral: the tendency for affective arousal, both positive and negative “gut reaction” feelings toward patients, to influence ongoing interactions; aligns with classic considerations of countertransference | Dr. U. found himself drawn to Ms. V. because of her generally sunny disposition, intermittent childlike sad expressions, and physical attractiveness. As a result, during their therapy he regularly overlooked or minimized her accounts of acting-out behavior, lying, and shoplifting and neglected to encourage her to face these aspects of her character or to see them as problematic.

Availability Bias: Vignette

The availability bias (closely related to “recency bias”) is the tendency to judge things as more likely if they readily come to mind. An excellent example of the availability bias has been provided by Gitlin (31), in describing his reactions to the suicide of a patient. For months after one of his psychotherapy patients committed suicide, Dr. Gitlin became preoccupied with thoughts that many of his patients might be suicidal. Subsequently, whenever patients remarked about feeling depressed, Dr. Gitlin would interrogate them about whether they were having suicidal fantasies, often with little basis to substantiate his impression. As detailed in his report, at one point a patient eventually became so exasperated with Dr. Gitlin’s persistent questions about overdosing—which she had never done or even threatened to do—that she told him, “Look, I can’t promise that I won’t kill myself, but I promise that if I do it, it won’t be with your pills. So, leave me alone already!”
Dr. Gitlin’s visceral reactions, emotional arousal, and increased vigilance concerning suicide contributed to his heightened tendency to widely seek, and possibly see, signs of suicidality in all his patients, more so than was usual in his previous practice. After his patient’s admonition, Dr. Gitlin handled this threat of therapeutic rupture by acknowledging to the patient that he had become overly sensitive about suicide because of his recent experience. The patient’s blunt feedback also allowed him to continue to treat her successfully and helped him become aware of his bias. Dr. Gitlin also sought additional feedback from a mentor and, with his own psychotherapist, engaged in additional reflection about his reactions. He realized that his heightened attention to risks of suicide among his patients was associated with concern about his mentor’s opinion of him and his own doubts about his professional competence.

Framing Bias: Vignette

Framing bias (closely related to the “context error bias”) describes how the way we perceive a problem may be strongly influenced by the way in which the problem is initially framed (e.g., on the basis of the patient’s previous diagnoses). One of the authors (J.K.) began treating a prominent corporate executive because the executive’s wife, convinced of his infidelity, demanded that he seek counseling. The patient was intelligent, confident, and eloquent. From the beginning of the first visit, he adamantly and repeatedly maintained his innocence and provided multiple examples of his wife’s instability. Although the psychiatrist empathically posed multiple questions about behaviors that might have been construed as unfaithful, the patient convincingly maintained that he had agreed to these sessions to humor his wife, who was misguided, if not delusional. After the third session, the patient stated that, in fact, nothing in his life supported his wife’s claims, and he was ending treatment. The clinician tended to agree with the patient, and no further visits were scheduled. Two weeks later, the wife called, desperately requesting to meet with the psychiatrist, who agreed to see her. Although depressed and anxious about her failing marriage, she was not seeking treatment but was planning to move away after finding her husband at their home in bed with his secretary. The wife explained that she wanted validation and confirmation from the psychiatrist that her behavior was not disturbed.
After this meeting, the psychiatrist recognized that he had been gullible and taken in by the patient. On reflection, he realized that from the first contact, the patient’s status, reputation, and convincing portrayal of his wife’s psychological instability had biased the psychiatrist’s subsequent thinking, leading him to assume that the patient was telling the truth.

Sunk Costs Bias: Vignette

The sunk costs bias, originally described in relation to financial investments, is the tendency, after making considerable investment, to continue putting effort and resources into ventures that appear increasingly unlikely to succeed (i.e., the unwillingness to let go of a failing strategy). For clinicians, these investments include time and energy.
Dr. A., a psychiatric resident, described a 15-year-old girl he had been treating for 6 months in weekly psychotherapy as anxious, rigid, and selfish. After initiating psychotherapy in one setting, they continued working together following his transfer to another clinic and a new supervisor. Dr. A. reported that his previous supervisor felt that he needed more time with this patient to make headway, especially because he was seeing her only weekly. Dr. A. had spent many sessions attempting to build an alliance and understand his patient, exploring her interests, playing board games, and discussing her immersion in video games. Although Dr. A. felt he knew a lot about her interests, he could not describe much about her inner life. Her parents were willing to have her continue therapy, pleased that their daughter was less resistant to this therapy than to previous therapeutic attempts. Although the patient’s anxiety remained high and she still avoided school, her parents felt the investment of time would start to pay off.
The new supervisor asked to review some video-recorded sessions and noted that the patient had many features suggestive of autism. He wondered whether the difficulties Dr. A. and the previous supervisor had experienced in questioning the patient’s lack of therapeutic response were partly related to the sunk costs bias.

Discussion

On the basis, primarily, of the work of Croskerry and his colleagues in general medicine (21–26), we have illustrated how cognitive and affective biases related to heuristic mental shortcuts initially studied by Kahneman and Tversky (13, 14) may have an impact on the conduct of psychotherapy. Personal skews and biases may intrude and shape clinicians’ attentional foci during every session and in turn may account for significant differences in the moment-to-moment interactions initiated by psychotherapists. In addition to the biases noted above, others may be added to this list as well. For example, a “self-serving” bias may apply to the conduct of psychotherapy, affecting decision making related to a range of competing interests, including the clinician’s intellect, face-saving, longing for intimacy, and financial considerations (32). Hindsight biases may lead to “I-told-you-so” moments, in which clinicians selectively recall prior remarks that seemed to predict an outcome, conveniently neglecting those that may have communicated contrary messaging (33, 34).
By distorting judgments, cognitive and affective biases can impair the development of successful therapist-patient relationships from the outset of treatment and can contribute to risks of making patients feel misunderstood and to ruptures in the therapeutic alliance during treatment. Therapist anxiety may increase tendencies to fall back on fast thinking and susceptibility to specific biases (e.g., the anchoring bias may result in distorted imprinting on unimportant or distracting issues). Cognitive and affective biases can contribute to therapist gullibility in cases where therapists might unquestioningly believe patients’ distortions or lies, for example about marital fidelity or substance misuse. Biases may distort and preempt how a therapist hears the patient’s affect and concerns, leading to failure in authentic attunement with the patient.
Omission bias bears specific mention in relation to psychodynamic psychotherapy. Acts of omission by the therapist may be erroneously justified by misinterpretations of the classical psychoanalytic psychotherapist’s role, where the therapist remains silent and passive in order to offer a blank screen onto which patients project feelings, fantasies, and wishes. Justifying omissions to foster the emergence of transference neurosis is a common problem among novice psychodynamic therapists.
Overconfidence bias may reflect narcissistic blindness or overcompensation for self-doubt in the psychotherapeutic role. It may be easier to dazzle patients with “brilliant” interventions than to struggle with the difficulties of tolerating ambiguity and engaging in the self-questioning so necessary for effective clinical work.
Premature closure bias (and the closely related “vertical line failure biases”) has been noted in relation to many types of psychotherapy. Clinicians affected by this bias may be closed minded, sometimes because of allegiance to narrow theoretical models, which blind them to other ways of thinking. Procrustean approaches, in which observations about patients are distorted to fit the theory, may result when therapists lack attunement to the bigger clinical picture. By avoiding narrow-minded thinking, clinicians who are adaptive experts assume broad-based understandings of their patients’ problems and are open to examining all models, in contrast to experts who rely primarily on their own familiar, well-practiced routines (35). Notably, additional clinical experience alone does not make clinicians immune to cognitive biases, such as premature closure (36).
Visceral biases lie at the core of countertransference reactions in any type of treatment situation. Problems are more likely to occur when clinicians immediately act on their visceral reactions rather than reflecting on what these reactions are signaling. When therapists viscerally respond to patients by experiencing telltale signs such as boredom, sleepiness, irritation, erotic feelings, repugnance, anger, overhelpfulness, strong idealization, or feeling threatened, for example, these signals can pave the way toward greater accuracy in treatment by pulling therapists deeper into patients’ inner worlds. Exploring whether such signals may also be experienced by others with whom patients interact outside therapeutic settings can enrich therapists’ understanding of their patients. By carefully acknowledging and selectively sharing their own feelings, therapists may help some patients to better identify and deal with feelings that the patients have difficulty tolerating.
In some of the clinical examples provided in this article, therapeutic stalemates or near ruptures occurred when therapists failed to see or acknowledge their own roles in the difficulty, at least initially. Almost all such ruptures can be addressed, but only if clinicians are open to acknowledging their contributions and are willing to confront them. As illustrated, other biases result in failures to perform adequate initial assessments, for example, failure to inquire fully about biological and social factors and to accurately understand key events precipitating the patient’s presentation for treatment, all of which may highlight underlying core issues.
Attempts to align cognitive and affective biases with customary views of countertransference are complicated by the lack of clear and widely accepted definitions for these terms and by the fact that studies concerning biases and countertransference have been developed through different intellectual traditions, virtually in separate silos. Whereas the broadest definitions of countertransference may subsume cognitive and affective biases, Croskerry et al. (23) have considered countertransference, emotional biases, and fundamental attribution errors to be separate sources of emotional influence on clinical performance. Factors contributing to clinician biases may include hardwiring (genetics, temperament), regulation by emotions, overlearning (repetitive exposure), implicit learning, and deliberate but erroneous use of biases that have become established through previous inferior decision making (37). Each of these processes is also likely to contribute vulnerabilities to broadly defined countertransference.
How might adverse effects of clinicians’ cognitive and affective biases on the conduct of psychotherapy be alleviated? Because these biases are deeply entrenched, mitigation is difficult and unlikely to occur easily or to be sustained with single applications of one-size-fits-all techniques (26). A systematic review (38) identified 60 mitigation strategies, the majority of which were shown to be at least partially successful. These debiasing strategies have used combinations of cognitive, technological, affective, and motivational approaches. Cognitive approaches have aimed to increase individuals’ awareness and critical thinking, technological approaches have used graphs and statistics to inform individuals about problems concerning base-rate neglect or framing biases, affective approaches have focused on or induced feelings associated with biases, and motivational approaches have attempted to hold individuals accountable for the results of their biases (38). Overall, in the context of solving real-world problems, case-based learning appears to be more effective than simple presentation of abstract rules (39, 40).
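To make concrete the base-rate reasoning that such graph- and statistics-based approaches encourage, consider a brief hypothetical calculation (the figures are illustrative assumptions of ours, not data from the cited reviews). Suppose a condition has a prevalence of 2% in a clinic population and a presenting sign occurs in 80% of patients who have the condition but also in 10% of those who do not. Bayes’ theorem then gives
P(condition | sign) = (0.80 × 0.02) / (0.80 × 0.02 + 0.10 × 0.98) ≈ 0.14.
Even a seemingly strong sign thus yields only about a 14% probability; a clinician who neglects the 2% base rate is likely to judge the condition far more probable than the evidence warrants.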
Several techniques developed for general medical settings may be applicable for psychotherapy. Among the suggested approaches for mitigating cognitive bias are debiasing strategies such as being more skeptical, affective debiasing, metacognition, mindfulness and reflection, slowing down strategies, rebiasing, personal accountability, educating intuition, and cultural training (41–43); detailed instruction and education concerning cognitive biases (44); formal feedback (1); consideration of alternatives; increased attention to certain types of ignored data (Bayesian thinking) (45, 46); and decreasing reliance on memory (45). These strategies are consistent with long-standing traditions of psychotherapy education, personal reflection, and supervision. The majority of these strategies are aimed at helping clinicians slow down, reflect, and think deliberately. We recommend the following techniques.
First, we suggest psychological immunization: educating psychotherapists about the existence of these biases early during their training so that they may be aware of the dangers when initially forming their professional identities as psychotherapists. All psychotherapy training programs, regardless of theoretical orientation, can review these biases and their potential impact at the beginning and can address them repeatedly throughout training. Learning to detect and minimize the adverse effects of countertransference, including biases, is a key element of psychotherapists’ professionalization and is essential for establishing safe, empathic, and nonjudgmental environments. Because cognitive and affective biases occur more often when therapists’ cognitive resources are stressed or limited, clinicians should be educated about cognitive load and should monitor their own sleep, physical health, stress levels, and time pressures, as well as their strong emotions, throughout their careers (23).
Second, psychotherapists can practice several techniques for metacognitive reflection, reviewing how they conduct psychotherapy by recollecting, writing, and reflecting on session-by-session progress notes and by reviewing audio and video recordings of psychotherapy sessions.
Third, at all career stages, psychotherapists can benefit from individual or group supervision, where countertransference-related issues are identified and discussed. Formal feedback can mitigate the influences of bias on therapy (1). (As in one of the cases presented above, even direct informal feedback from patients can be impactful.) Especially during training, there may be no substitute for seminars led by seasoned teachers using process notes and video recordings, in which trainees present ongoing therapy cases to groups of peers. Interpersonal process recall offers a specific technique for microscopically reviewing psychotherapy processes and may be especially helpful for detecting the intrusion of biases (47). Especially useful are examinations of complex cases that have warning signs of potential bias, where the case is not proceeding as expected or where the therapeutic alliance is slipping. In accord with the concept of “slow medicine” in internal medicine, which advocates not rushing into new treatments or paths until they are substantiated, the overriding purpose is to help therapists think before they speak or act. To our knowledge, no formal tools or self-assessment measures have yet been developed to assist with efforts to identify cognitive biases in the conduct of psychotherapy, but calls for their development in other health settings have appeared in the literature (38). Such tools could help supervisors more systematically attend to biases among trainees.
Finally, this preliminary report raises numerous questions for further study. For example, can we develop formal tools, including self-assessment measures, to better identify cognitive biases in the conduct of psychotherapy? How do psychotherapists differ in their propensities for various biases and the frequency with which these occur in their psychotherapies? How do different psychotherapy approaches and techniques vary in their vulnerabilities to psychotherapists’ biases? How do bias differences translate to specific countertransference vulnerabilities, including those related to ethnocentricities and gender biases? What accounts for the variances among all these characteristics? If we can identify these biases during training and supervision, how can mitigating strategies be used to best alter their adverse impacts? Additionally, how do all these factors influence the outcome of psychotherapy?

Conclusions

In summary, we have called attention to the potential impact of cognitive and affective biases that accompany heuristic mental shortcuts on the conduct of psychotherapy. Vignettes showing pertinent cognitive and affective biases have been used to illustrate how these biases may affect psychotherapists’ perceptions and decision making, both moment to moment and over time. Although heuristics are clearly valuable tools of thought, several practical strategies are available to help mitigate the negative impact of co-occurring cognitive and affective biases on psychotherapy practice. Because these considerations are preliminary, much remains to be investigated regarding the frequency, variability, modifiability, and ultimate clinical significance of these biases in the conduct and outcomes of psychotherapy.

References

1.
Macdonald J, Mellor-Clark J: Correcting psychotherapists’ blindsidedness: formal feedback as a means of overcoming the natural limitations of therapists. Clin Psychol Psychother 2015; 22:249–257
2.
Parth K, Datz F, Seidman C, et al: Transference and countertransference: a review. Bull Menninger Clin 2017; 81:167–211
3.
Betan E, Heim AK, Zittel Conklin C, et al: Countertransference phenomena and personality pathology in clinical practice: an empirical investigation. Am J Psychiatry 2005; 162:890–898
4.
Fisher EH: Gender bias in therapy? An analysis of patient and therapist causal explanations. Psychotherapy 1989; 26:389–401
5.
Degrie L, Gastmans C, Mahieu L, et al: How do ethnic minority patients experience the intercultural care encounter in hospitals? A systematic review of qualitative research. BMC Med Ethics 2017; 18:2
6.
Yamamoto J, James QC, Palley N: Cultural problems in psychiatric therapy. Arch Gen Psychiatry 1968; 19:45–49
7.
Bienenfeld D, Yager J: Issues of spirituality and religion in psychotherapy supervision. Isr J Psychiatry Relat Sci 2007; 44:178–186
8.
Devereaux D: The issue of race and the client-therapist assignment. Issues Ment Health Nurs 1991; 12:283–290
9.
Beutler LE, Malik M, Alimohamed S, et al: Therapist variables; in Bergin and Garfield’s Handbook of Psychotherapy and Behavior Change, 5th ed. Edited by Lambert MJ. New York, Wiley & Sons, 2004
10.
Lingiardi V, Muzi L, Tanzilli A, et al: Do therapists’ subjective variables impact on psychodynamic psychotherapy outcomes? A systematic literature review. Clin Psychol Psychother 2018; 25:85–101
11.
Sánchez-Bahíllo Á, Aragón-Alonso A, Sánchez-Bahíllo M, et al: Therapist characteristics that predict the outcome of multipatient psychotherapy: systematic review of empirical studies. J Psychiatr Res 2014; 53:149–156
12.
Stolorow RD, Atwood GE: The intersubjective perspective. Psychoanal Rev 1996; 83:181–194
13.
Tversky A, Kahneman D: Judgment under uncertainty: heuristics and biases. Science 1974; 185:1124–1131
14.
Kahneman D: Thinking, Fast and Slow. New York, Farrar, Straus and Giroux, 2011
15.
Gigerenzer G, Goldstein DG: Reasoning the fast and frugal way: models of bounded rationality. Psychol Rev 1996; 103:650–669
16.
Balogh EP, Miller BT, Ball JR (eds): Improving Diagnosis in Health Care. Washington, DC, National Academies Press, 2015
17.
Norman GR, Monteiro SD, Sherbino J, et al: The causes of errors in clinical reasoning: cognitive biases, knowledge deficits, and dual process thinking. Acad Med 2017; 92:23–30
18.
Epstein S: Integration of the cognitive and the psychodynamic unconscious. Am Psychol 1994; 49:709–724
19.
Evans JS, Stanovich KE: Dual-process theories of higher cognition: advancing the debate. Perspect Psychol Sci 2013; 8:223–241
20.
Helfrich CD, Rose AJ, Hartmann CW, et al: How the dual process model of human cognition can inform efforts to de-implement ineffective and harmful clinical practices: a preliminary model of unlearning and substitution. J Eval Clin Pract 2018; 24:198–205
21.
Croskerry P: Achieving quality in clinical decision making: cognitive strategies and detection of bias. Acad Emerg Med 2002; 9:1184–1204
22.
Croskerry P: The importance of cognitive errors in diagnosis and strategies to minimize them. Acad Med 2003; 78:775–780
23.
Croskerry P, Abbass A, Wu AW: Emotional influences in patient safety. J Patient Saf 2010; 6:199–205
24.
Ely JW, Graber ML, Croskerry P: Checklists to reduce diagnostic errors. Acad Med 2011; 86:307–313
25.
Croskerry P, Cosby K, Graber ML, et al: Diagnosis: Interpreting the Shadows. Boca Raton, FL, CRC Press, 2017
26.
Croskerry P: The Cognitive Autopsy: A Root Cause Analysis of Medical Decision Making. Oxford, UK, Oxford University Press, 2020
27.
Saposnik G, Redelmeier D, Ruff CC, et al: Cognitive biases associated with medical decisions: a systematic review. BMC Med Inform Decis Mak 2016; 16:138
28.
Blumenthal-Barby JS, Krieger H: Cognitive biases and heuristics in medical decision making: a critical review using a systematic search strategy. Med Decis Making 2015; 35:539–557
29.
Ægisdóttir S, White MJ, Spengler PM, et al: The meta-analysis of clinical judgment project: fifty-six years of accumulated research on clinical versus statistical prediction. Couns Psychol 2006; 34:341–382
30.
Spengler PM, White MJ, Ægisdóttir S, et al: The meta-analysis of clinical judgment project: effects of experience on judgment accuracy. Couns Psychol 2009; 37:350–399
31.
Gitlin MJ: A psychiatrist’s reaction to a patient’s suicide. Am J Psychiatry 1999; 156:1630–1634
32.
Coleman MD: Emotion and the self-serving bias. Curr Psychol 2011; 30:345–354
33.
Blank H, Nestler S, von Collani G, et al: How many hindsight biases are there? Cognition 2008; 106:1408–1440
34.
Coolin A, Erdfelder E, Bernstein DM, et al: Explaining individual differences in cognitive processes underlying hindsight bias. Psychon Bull Rev 2015; 22:328–348
35.
Mylopoulos M, Woods NN: Having our cake and eating it too: seeking the best of both worlds in expertise research. Med Educ 2009; 43:406–413
36.
Krupat E, Wormwood J, Schwartzstein RM, et al: Avoiding premature closure and reaching diagnostic accuracy: some key predictive factors. Med Educ 2017; 51:1127–1137
37.
Stanovich KE: Rationality and the Reflective Mind. New York, Oxford University Press, 2011
38.
Ludolph R, Schulz PJ: Debiasing health-related judgments and decision making: a systematic review. Med Decis Making 2018; 38:3–13
39.
Aczel B, Bago B, Szollosi A, et al: Is it time for studying real-life debiasing? Evaluation of the effectiveness of an analogical intervention technique. Front Psychol 2015; 6:1120
40.
Croskerry P, Singhal G, Mamede S: Cognitive debiasing 1: origins of bias and theory of debiasing. BMJ Qual Saf 2013; 22(suppl 2):ii58–ii64
41.
Croskerry P, Singhal G, Mamede S: Cognitive debiasing 2: impediments to and strategies for change. BMJ Qual Saf 2013; 22(suppl 2):ii65–ii72
42.
Croskerry P: Cognitive bias mitigation; in Diagnosis: Interpreting the Shadows. Edited by Croskerry P, Cosby KS, Graber ML, et al. London, CRC Press, 2017
43.
Kolodner JL: Educational implications of analogy. A view from case-based reasoning. Am Psychol 1997; 52:57–66
44.
Royce CS, Hayes MM, Schwartzstein RM: Teaching critical thinking: a case for instruction in cognitive biases to reduce diagnostic errors and improve patient safety. Acad Med 2019; 94:187–194
45.
Jenkins MM, Youngstrom EA: A randomized controlled trial of cognitive debiasing improves assessment and treatment selection for pediatric bipolar disorder. J Consult Clin Psychol 2016; 84:323–333
46.
Arkes HR: Impediments to accurate clinical judgment and possible ways to minimize their impact. J Consult Clin Psychol 1981; 49:323–330
47.
Grant J, Schofield MJ, Crawford S: Managing difficulties in supervision: supervisors’ perspectives. J Couns Psychol 2012; 59:528–541

Information & Authors

Information

Published In

American Journal of Psychotherapy
Pages: 119 - 126
PubMed: 33445958

History

Received: 19 June 2020
Revision received: 3 August 2020
Revision received: 12 October 2020
Accepted: 27 October 2020
Published online: 15 January 2021
Published in print: August 01, 2021

Keywords

  1. Cognitive and Affective Bias
  2. Countertransference
  3. Psychotherapy
  4. Clinical decision-making

Authors

Details

Joel Yager, M.D. [email protected]
Department of Psychiatry, University of Colorado School of Medicine, Aurora (Yager, Kelsay); Department of Psychiatry, Boonshoft School of Medicine, Wright State University, Dayton, Ohio (Kay).
Jerald Kay, M.D.
Department of Psychiatry, University of Colorado School of Medicine, Aurora (Yager, Kelsay); Department of Psychiatry, Boonshoft School of Medicine, Wright State University, Dayton, Ohio (Kay).
Kimberly Kelsay, M.D.
Department of Psychiatry, University of Colorado School of Medicine, Aurora (Yager, Kelsay); Department of Psychiatry, Boonshoft School of Medicine, Wright State University, Dayton, Ohio (Kay).

Notes

Send correspondence to Dr. Yager ([email protected]).

Competing Interests

The authors report no financial relationships with commercial interests.
