One of the difficulties in social work lies in finding forms of evaluation that suit the various kinds of work people actually do. There is a need for methods that allow a valid and rigorous evaluation of process as well as outcomes, particularly in new areas of work. A valuable example of such evaluation evolved at St. Luke's, a voluntary agency in Victoria, Australia, that used design evaluation to articulate a new model of practice. Design evaluation involved having the evaluator work with staff to describe work practices as a series of stages with associated principles. This process of documenting, clarifying, and illuminating the model led to its progressive refinement and, concurrently, to improved service delivery to clients. Testing the program model with clients provided valuable feedback about its effectiveness. This form of evaluation is complementary to outcomes evaluation and can also provide the information needed for a subsequent outcomes evaluation.
Key words: evaluation; family work; practice research; social work practice
Currently, evaluation is very much on the agenda in health and social work. There are greater expectations of agencies and individual workers to justify funding, methodologies, and effectiveness (Martin & Kettner, 1997). However, one of the dilemmas generated in fields like social work is just how to evaluate our work in ways that are congruent with the way we go about it. There are limitations in using only methods of outcomes-based research, in that they may be too narrowly defined to draw out key issues and processes (Weiss, 1991). A variety of methods is needed that are acceptably rigorous, particularly where practice is still being developed (Halmi, 1996; Patton, 1987). Ideally, such evaluation would aim "to maximize the success of intervention rather than only assessing its impact" (McCroskey & Nelson, 1989, p. 585). It would also avoid polarizing qualitative and quantitative evaluation by viewing these as complementary (Weiss, 1991). In discussing the use of social science research in community organization, Rothman (1974) advocated "a situation in which the science and art of community organization may inform each other" (p. 408). He also identified the importance of naming processes in exploring a range of possible community intervention strategies (Rothman, 1995).
Another dilemma is how to make sure that evaluation benefits an agency. Evaluation is more likely to be used by practitioners if the evaluator works closely with them in doing the research (Owen, 1995) and if the evaluator finds the evaluation method that fits best with a particular situation in a particular agency (Patton, 1987). For some agencies, an additional dilemma is wanting evaluation methods that fit with social work philosophy and practice. This might mean, for example, evaluation that is respectful and empowering and includes participation of those being researched (McDermott, 1996).
These were the dilemmas facing St. Luke's, a large voluntary agency in Victoria, Australia, in evaluating a new method of working with families. St. Luke's offers a wide range of services to individuals and families. In the late 1980s and early 1990s, it made significant changes to its practice, introducing a solution-focused, competency-based approach as well as an integrated model of service delivery. This meant that families were able to access a range of resources through a generalist worker rather than seeing a different worker for each service (Scott & O'Neill, 1996). However, these changes happened gradually and with minimal documentation. Subsequent pressure on funding added extra weight to the need to justify and formalize the model that had evolved. To help design the evaluation, Professor John Owen from the Centre for Program Evaluation at the University of Melbourne was asked to join the evaluation's reference group, which was made up of agency representatives and stakeholders from other organizations. …