Sunday, February 5, 2023

Continuing education workshops do not produce sustained skill development—quite the opposite; any modest improvement in performance erodes over time without further coaching

The implications of the Dodo bird verdict for training in psychotherapy: prioritizing process observation. Henny A. Westra. Psychotherapy Research, Dec 16 2022. https://doi.org/10.1080/10503307.2022.2141588

Abstract: Wampold et al.’s 1997 meta-analysis found that the true differences between bona fide psychotherapies are zero, supporting the Dodo bird conjecture that “All have won and must have prizes”. Two and a half decades later, the field continues to be slow to absorb this and similar uncomfortable discoveries. For example, entirely commensurate with Wampold’s conclusion is the meta-analytic finding that adherence to a given model of psychotherapy is unrelated to therapy outcomes (Webb et al., 2010). Despite the clear implication that theoretical models should not be the main lens through which psychotherapy is viewed if we are aiming to improve outcomes, therapists continue to identify themselves primarily by their theoretical orientation. A major corollary of Wampold’s conclusions is that, despite the evidence for the non-superiority of any given model, our focus in training continues to be model-driven. This article seeks to elaborate the training implications of Wampold et al.’s conclusion, with a rationale and appeal to incorporate process-centered training.

Consider these similarly uncomfortable findings regarding the state of training. We assume, rather than verify, the efficacy of our training programs. Yet there is no evidence that continuing education workshops, for example, produce sustained skill development; quite the opposite: large effects on self-report are found, but any modest improvement in performance erodes over time without further coaching (Madson et al., 2019). Perhaps most concerning, psychotherapists do not appear to improve with experience; in fact, the evidence suggests that skills may decline slightly over time (Goldberg et al., 2016). Not surprisingly, then, while the number of model-based treatments has proliferated, the rate of client improvement has not followed suit (Miller et al., 2013). Could stagnant training methods be related to stagnant patient outcomes?

We need innovations in training that better align our training foci and methods with the factors empirically shown to influence client outcomes. Process researchers have long observed that trained process coders (typically trained for research purposes) make better therapists due to their enhanced attunement (e.g., Binder & Strupp, 1997). While such training is not yet available in training programs, it arguably should be, based on emerging developments in the science of expertise (Ericsson & Pool, 2016) and the urgent need to bring outcome information forward in real time so that it can be used to make responsive adjustments to the process of therapy. In fact, such information could be considered “routine outcome monitoring in real time” (Westra & Di Bartolomeo, 2022).

To elaborate, Tracey et al. (2014) provocatively argued that acquiring expertise in psychotherapy may not even be possible. This is because the ability to predict outcomes is crucial to shaping effective performance, yet therapists receive little feedback regarding the outcomes of their interventions, and such information, if it comes at all, arrives too late to make a difference in the moment. Therapists are essentially blind archers attempting to shoot at a target. The development of Routine Outcome Monitoring (ROM) measures capable of forecasting likely outcomes is a major advance in correcting this blindness and improving predictive capacity. However, to be effective for skill development, feedback needs to occur more immediately, so that the relationship between a therapist action and the client response (or nonresponse) can be quickly ascertained and adjustments made in real time. Interestingly, while ROM has been helpful in improving failing cases, it has not been effective in enhancing clinical skills more generally (Miller et al., 2013).

Learning to preferentially attend to, extract, and continuously integrate empirically supported process data may prove to be the elusive immediate feedback that has been lacking in psychotherapy training but that is crucial to developing expertise. Observable process data that have been validated through process science as differentiating good from poor patient outcomes could be considered “little outcomes,” which in turn are related to session outcomes and ultimately to treatment outcome (Greenberg, 1986). Moreover, thin-slicing research supports the idea that it is possible to make judgments about important outcomes from even tiny slices of expressive behavior (Ambady & Rosenthal, 1992). If one considers real-time process information as micro-outcomes, properly trained clinicians, just like expert process coders, may no longer have to be blind. For example, a therapist trained to identify and monitor resistance and signals of alliance rupture can continuously track these important phenomena and responsively adjust to safeguard the alliance. Similarly, a therapist who is sensitive to markers of low and high levels of experiencing (Pascual-Leone & Yeryomenko, 2017) and of client ambivalence (Westra & Norouzian, 2018) can not only optimize the timing of their interventions but also continuously watch the client for feedback on the success of their ongoing efforts.

Being steeped in process research gives one a unique perspective on the promise of process observation for advancing clinical training. Our lab recently took its first foray into studying practicing community therapists. As we coded the session videotapes, we became aware that we possessed a unique skill set that was absent in the therapists' interviews. Therapists seemed to be guided solely by some model of how to bring about change but failed to simultaneously appreciate the ebb and flow of the relational context of the work. They seemed absorbed in their own moves (their model), unaware that they were in a dance and must continually track and coordinate the process with their partner. It seemed that we had incidentally trained ourselves to detect and use these process signals. Our training was different: it was more akin to deliberate practice focused on discrimination training for detecting empirically supported processes.

In short, information capable of diagnosing the health of the process and, critically, of forecasting eventual outcomes is arguably hiding in plain sight, if one can acquire the requisite observational capacity to harvest it. And transforming an unpredictable environment into a predictable one makes expertise possible to acquire (Kahneman, 2011). Importantly, extracting such vital information relies on observational skill rather than on patient report, end-of-session measures, or longer-term outcomes; thus, such real-time data extraction is immediately accessible and can complement existing outcome monitoring (Westra & Di Bartolomeo, 2022). Moreover, process markers are often opaque, requiring systematic observational training for successful detection. Without proper discrimination and perceptual-acuity training, this golden information remains obscured. Thus, heeding Wampold et al.'s call to refocus our efforts must include innovations in training: innovations that harness real-time outcome information. We need more process research to further uncover the immediately observable factors capable of differentiating poor from good outcomes, but existing process science gives us a good start. And since process-centered training is transtheoretical, it can exist alongside models of therapy: learning to see while doing (Binder & Strupp, 1997). Training in psychotherapy has primarily prioritized intervention (models); it may now be time to emphasize observation.
