
Director’s Blog: Making the Most of our Interventions Research


Why are interventions that are shown to be effective not routinely disseminated? One concern is that interventions developed in a research environment may not always be a good fit for community settings.

Typically, intervention development and efficacy trials are conducted in academic or laboratory settings with carefully screened patients who are often free of medical conditions and other co-occurring problems (e.g., substance use disorders). Yet people in community settings more often than not have coexisting disorders, complicating their diagnosis and treatment. In addition, staff involved in efficacy trials usually are highly trained, carefully selected, and closely supervised. Contrast that with some segments of the public sector, where it is estimated that as many as 40 percent of mental health providers do not have graduate or professional degrees. Moreover, research-derived psychosocial therapies are sometimes developed without consideration of people's typical service use patterns; studies have found that patients often attend far fewer sessions than research protocols prescribe. Finally, these intervention studies rarely take into account the realities of billing and reimbursement constraints for both the privately and publicly insured.

Consequently, the traditional academic or "regulatory-model" approach yields information that may be scientifically valid but of limited practical value, constraining the ultimate utility of these interventions in clinical practice. This gap between lab and practice can be so wide that research-derived interventions may not be transportable to, or may not prove effective in, community and practice settings, regardless of how strongly they perform in efficacy studies. We end up with "orphan" interventions that are ill-suited for real-world conditions. And even in the optimized setting of a research trial, the results may be statistically significant but not clinically meaningful.

How can we close the gap? One strategy, recommended nearly a decade ago by Hoagwood, Burns, and Weisz (2002) and reinforced by the NIMH Strategic Plan, calls for clinic-based treatment development. Weisz and colleagues have suggested a deployment-focused model of intervention development and testing that incorporates information about typical patients, providers, and settings, as well as the perspectives of multiple stakeholder groups (e.g., consumers, family members, providers, administrators, and payers), early in the intervention testing process. Simon suggested that the gap between research and practice will be bridged best by integrating practice into research, as is routinely done in pediatric oncology. The Mental Health Research Network attempts to do this by creating a learning health care system in which patients become partners in research. Clearly, the development of practice-ready interventions will require researchers to consider not only the characteristics of the intervention being tested and those of the patients, but also the characteristics of providers and the usual settings in which care is provided.

To be clear, we are seeking to support research on interventions that can be disseminated broadly, will change provider behavior, and will improve clinical outcomes in the real world. We are seeking approaches that are relevant to underserved clinical populations, can be readily taught to the existing workforce at minimal cost, can be monitored for quality inexpensively, and can be refined through cost-effective supervision practices. Finally, we want researchers to be mindful of the economic considerations that drive broad implementation, and to recognize that even blockbuster interventions will lack impact unless someone agrees to pay for them.


i. Substance Abuse and Mental Health Services Administration (2007). An Action Plan for Behavioral Health Workforce Development.

ii. Southam-Gerow, M.A., Ringeisen, H.L., & Sherrill, J.T. (2006). Integrating interventions and services research: Progress and prospects. Clinical Psychology: Science and Practice, 13, 1-8.

iii. Norquist, G., Lebowitz, B., & Hyman, S. (1999). Expanding the frontier of treatment research. Prevention & Treatment, 2, np.

iv. Hoagwood, K., Hibbs, E., Brent, D., & Jensen, P. (1995). Introduction to the special section: Efficacy and effectiveness in studies of child and adolescent psychotherapy. Journal of Consulting and Clinical Psychology, 63, 683-687.

v. Weisz, J.R., Chu, B.C., & Polo, A.J. (2004). Treatment dissemination and evidence-based practice: Strengthening intervention through clinician-researcher collaboration. Clinical Psychology: Science and Practice, 11, 300-307.

vi. Hoagwood, K., Burns, B.J., & Weisz, J. (2002). A profitable conjunction: From science to service in children's mental health. In B.J. Burns & K. Hoagwood (Eds.), Community-Based Interventions for Youth with Severe Emotional Disturbances. New York: Oxford University Press.

vii. Weisz, J.R., Jensen, A.L., & McLeod, B.D. (2005). Development and dissemination of child and adolescent psychotherapies: Milestones, methods, and a new deployment-focused model. In E.D. Hibbs & P.S. Jensen (Eds.), Psychosocial Treatments for Child and Adolescent Disorders: Empirically Based Approaches (2nd ed.). Washington, DC: American Psychological Association.