Outcomes in the context of empirical practice

Rami Benbenishty, Ph.D., Professor, Paul Baerwald School of Social Work, Hebrew University

Discussions of outcomes in human services remind me of a story about a couple planning their dream home. Unfortunately, they could never start building the house: they could not agree on which tropical plants would go in their rooftop garden. I would like to suggest that outcomes should be addressed as an integral part of a larger discussion of the informational foundation of empirical and accountable practice. I argue that we need an overall strategy that will lay a solid foundation for addressing outcome issues.

As an overall organizing principle for this strategy, I suggest defining empirical practice as practice that is continuously informed and guided by empirical evidence that is systematically gathered and processed. This general definition directs us to a much wider array of practitioners' information needs than a focus on outcomes alone. It implies that knowing the characteristics of our clients, and knowing what we do and with whom, is essential for building effective practice. Instead of asking "how effective are we," we ask a series of questions that will help us become more effective. Furthermore, we address the information needs of all partners to care: practitioners, supervisors, administrators, managers, and policy makers. Empirical practice should move from a focus on the individual worker to the "worker within an agency." I therefore propose that we think of ways to support empirical practice ON THE AGENCY LEVEL. The effort should be to coordinate and aggregate the efforts of all individual practitioners within an agency.

Given the task and the context, I suggest several guiding principles.

A. Monitor clients, interventions AND outcomes. The current excitement around outcomes diverts attention from the need for a good understanding of the clients and interventions that interact to shape those outcomes.

B. INTEGRATE systematic monitoring into practice. De-emphasize large-scale, one-shot studies and make the systematic gathering and processing of information an integral part of everyday practice.

C. Inform practice continuously. An essential element of integrating systematic monitoring into practice is to inform practitioners of what is being learned, regularly and as a routine part of practice. The responsibility for learning from experience shifts to the agency level. This is a shift from two current models: the practitioner-researcher model, which focuses on learning from the experience of the individual worker, and the research dissemination model, which counts on academic research filtering down to individual practitioners.

D. Build on local generalizations. One implication of the above argument is that we count on limited generalizations. We build on knowledge that is relevant to the immediate environment; its generalizability to other agencies is not clear.

E. Integrate information technology into practice. The only practical way today to follow the principles outlined above is to make computers and other technology part of practice. I suggest creating an environment that supports all levels of the agency. It is based on an overall view of the information needs of all partners to care. It is driven by the need to support quality practice, rather than to report to outside agencies. It structures data gathering, processes information, and provides feedback on a daily basis. This environment both learns and teaches.

An agency that is able to create this environment is more prepared than others to face the challenge of accountable practice. It will have a better chance to provide quality service. It will also stand a better chance of surviving the challenges facing human services in an era of shrinking political and financial support.
