Views of Evidence-Based Practice: Social Workers' Code of Ethics and Accreditation Standards as Guides for Choice

By Eileen Gambrill | Journal of Social Work Education, Fall 2007

THERE HAS BEEN CONSIDERABLE interest in evidence-based practice (EBP) in recent years, including discussion of its role in social work (Briggs & Rzepnicki, 2004; Gambrill, 2006; Gibbs, 2003; Howard, McMillen, & Pollio, 2003; Mullen, Shlonsky, Bledsoe, & Bellamy, 2005; Thyer & Kazi, 2004). The realities of EBP, and the difficulties of dealing with those realities, depend on which view of EBP is adopted. Descriptions of EBP differ in their breadth and attention to ethical issues, ranging from the broad, systemic philosophy and related evolving process and technology envisioned by its originators (e.g., Gray, 1997; Sackett, Richardson, Rosenberg, & Haynes, 1997) to narrow views (using practice guidelines) and total distortions (Gambrill, 2003). Rosen and Proctor (2002) used the term EBP "primarily to denote that practitioners will select interventions on the basis of their empirically-demonstrated links to the desired outcomes" (p. 743). Reid (2001) stated that EBP "consists of requiring practitioners to use empirically-based treatments" (p. 278). As described by its originators, EBP involves much more, including consideration of client values and expectations as well as local constraints (see Table 1).

Social workers' Code of Ethics (National Association of Social Workers, 1999) and the Educational Policy and Accreditation Standards of the Council on Social Work Education (2001) can be used as guides to select among different views of EBP as well as to negotiate application obstacles that arise in all related venues including professional education. These guides describe obligations, such as drawing on practice- and policy-related research, critical thinking, competence, accountability, service to clients, informed consent, respect and integrity, promotion of social justice, and lifelong learning.

Such obligations provide a guide for responding ethically to controversies regarding questions such as, "What is evidence?" and "When do we have enough to recommend a policy or practice?" Social workers' Code of Ethics (National Association of Social Workers, 1999) stresses beneficence (helping), avoidance of harm, informed consent, autonomy (self-determination), and social justice. I suggest that only if evidentiary criteria are considered can these interrelated obligations be honored. Only when specific, real-life examples are examined can trade-offs be determined.

Evidence-Based Practice: A Process and Philosophy

Given the many different views of EBP, it is important to review the vision of EBP and policy presented in original sources (Gray, 1997; Sackett et al., 1997). EBP describes a philosophy and process designed to forward effective use of professional judgment in integrating information regarding each client's unique characteristics, circumstances, preferences, actions, and external research findings: "It is a guide for thinking about how decisions should be made" (Haynes, Devereaux, & Guyatt, 2002, p. 2). EBP involves the "conscientious, explicit, and judicious use of current best evidence in making decisions about the care of individual [clients]" (Sackett, Rosenberg, Gray, Haynes, & Richardson, 1996, p. 2). Sackett et al. (1997) described a process for EBP and a professional educational format (problem-based learning) designed to help practitioners link evidentiary, ethical, and application issues (see also Straus, Richardson, Glasziou, & Haynes, 2005). This process entails "the integration of best research evidence with clinical expertise and [client] values" (Sackett, Straus, Richardson, Rosenberg, & Haynes, 2000, p. 1). It encourages us to ask, "How good is the evidence?" and "Could I be wrong?" Such questions illustrate the close connection between critical thinking and evidence-informed practice. Recently, more attention has been given to client preferences and actions because what clients do often differs from their stated preferences, and estimates of preferences are often wrong (Haynes, Devereaux, & Guyatt, 2002).

It is assumed that professionals often need information to make decisions: for example, information concerning risk assessment or what services are most likely to help clients attain outcomes they value. As Gray (2001a, p. 354) suggested, when evidence is not used, important failures in decision making occur: (a) ineffective interventions are used, (b) interventions that do more harm than good are used, (c) interventions that do more good than harm are not used, and (d) interventions that are ineffective or do more harm than good are not discontinued.

Clinical expertise refers to "the ability to use our clinical skills and past experience to rapidly identify each [client's unique characteristics, including] their individual risks and benefits of potential interventions and their personal circumstances and expectations" (Sackett et al., 2000, p. 1). Clinical expertise is used to integrate information from diverse sources. Client values refer to "the unique preferences, concerns, and expectations each [client] brings to an ... encounter and which must be integrated into ... decisions if they are to serve the [client]" (Sackett et al., 2000, p. 1). Evidence-based health care refers to making informed policy and management decisions (Gray, 2001a).

EBP is a process for handling the uncertainty surrounding decisions that must be made in real life, in real time. It is a way of dealing honestly with uncertainty (Chalmers, 2004). The uncertainty associated with decisions is acknowledged, not hidden. Transparency regarding the evidentiary status of services is a hallmark. For example, the back cover of the seventh edition of Clinical Evidence (BMJ Publishing Group, 2002), the continually updated book distributed to physicians, states that "It provides a concise account of the current state of knowledge, ignorance, and uncertainty about the prevention and treatment of a wide range of clinical conditions." Sources of uncertainty include limitations in current knowledge, lack of familiarity with what knowledge is available, and difficulties in distinguishing among personal ignorance, lack of competence, and actual limitations of knowledge (Fox & Swazey, 1974). Uncertainties may be related to lack of information about problem-related causes and about whether resources are available to help clients. A willingness to acknowledge "I don't know," combined with taking steps to see whether needed information is available, increases the likelihood that important uncertainties can be decreased or identified (Chalmers, 2004). This can help social workers honor the ethical obligation of involving clients as informed participants.

The idea of integrating practice and research in professional contexts is not new, nor is attention to ethical issues as they relate to evidentiary ones (Kirk & Reid, 2002). Although its philosophical roots are old, the blooming of EBP as a process attending to evidentiary, ethical, and application issues in all professional venues (i.e., education, practice-policy, and research) is fairly recent, facilitated by the Internet revolution. Critical thinking is integral to EBP, including related values of integrity (e.g., searching for evidence against favored views as well as evidence that favors such views) and consideration of well-argued alternative views (e.g., see Paul, 1993; Paul & Elder, 2004). In both critical thinking and EBP, attention is given to ethical issues; honesty and transparency (clear description of what is done to what effect) are emphasized in both. This applies to all venues of interest: professional education, practice and policy, and related research (its design, conduct, and reporting).

Philosophies of Evidence-Based Practice

EBP and social care involve a philosophy of ethics of professional practice and related enterprises, including education, research, and scholarly writing; a philosophy of science (i.e., epistemology--views about what knowledge is and how it can be gained); and a philosophy of technology. Ethics involves decisions regarding how and when to act; it involves standards of conduct. Epistemology involves views about knowledge and how to get it (or if one can). The philosophy of technology involves questions such as, "Should we develop technology?" "What values should we draw on to decide what to develop?" and "Should we examine the consequences of a given technology?" EBP offers practitioners and administrators a philosophy that is compatible with obligations described in professional codes of ethics as well as an evolving technology for integrating evidentiary, ethical, and practical issues. EBP encourages the integration of research and practice, for example, by highlighting the importance of clinicians critically appraising research reviews and developing a technology to help them to do so: "The leading figures in EBM ... emphasized that clinicians had to use their scientific training and their judgment to interpret [guidelines] and individualize care accordingly" (Gray, 2001b, p. 26). EBP encourages clinicians to think for themselves--to develop critical appraisal skills. Checklists have been created to help professionals appraise the quality of different kinds of research; examples of such checklists are "The CONSORT Statement" (Altman et al., 2001); "QUOROM Guidelines for Systematic Reviews and Meta-Analyses" (Moher et al., 1999); and STARD for diagnostic tests (Bossuyt et al., 2003).

The Process of Evidence-Based Practice

Steps in EBP include the following:

1. Converting information needs related to practice and policy decisions into well-structured questions;

2. Tracking down, with maximum efficiency, the best evidence with which to answer them;

3. "Critically appraising that evidence for its validity (closeness to the truth), impact (size of the effect), and applicability (usefulness in our clinical practice)" (p. 4);

4. Integrating the critical appraisal with our clinical expertise and with our client's unique characteristics and circumstances, including their values; and

5. "Evaluating our effectiveness and efficiency in [carrying out] Steps 1-4 and seeking ways to improve them both for next time" (Straus, Richardson, Glasziou, & Haynes, 2005, p. 4).

Evidence-based practitioners take advantage of efficient technology for conducting electronic searches to locate the current best evidence regarding a specific question. Different questions require different kinds of research methods to critically appraise proposed assumptions (e.g., Greenhalgh, 2006; Guyatt & Rennie, 2002; Straus et al., 2005). These differences are reflected in the use of different "quality filters" to search for research findings.

Sackett et al. (2000) distinguished among three different styles of EBP, all of which require integrating research findings with a client's unique personal and environmental circumstances. All require Step 4 but vary in how the other steps are carried out. For problems encountered often (the first style), social workers should carry out both the search and the critical appraisal of reports found. For the second style (i.e., problems encountered less often), Sackett et al. suggested seeking out critical appraisals prepared by others who describe and use explicit criteria for deciding what evidence they selected and how they decided whether it was valid. Here, Step 3 can be omitted and Step 2 restricted to sources that have already undergone critical appraisal. The third style applies to problems encountered very infrequently, in which we "'blindly' seek, accept and apply the recommendations we receive from authorities" (Sackett et al., 2000, p. 5). As they noted, the trouble with this mode is that it is "blind" to whether the advice received from the experts "is authoritative (evidence-based, resulting from their operating in the 'appraising' mode) or merely authoritarian (opinion-based, resulting from pride and prejudice)" (p. 5). Lack of time may result in using the second style with most problems.

Problem-Based Learning as Described by Sackett et al. (2000) and Straus, Richardson, Glasziou, & Haynes (2005)

The importance of developing professionals who are lifelong learners is highlighted by increased recognition of the flawed nature of traditional means of knowledge dissemination, such as texts, peer review, and didactic continuing-education formats, with all the implications of this for clients (Davis, Thomson, Oxman, & Haynes, 1995; see also Thomson O'Brien et al., 2003). Indeed, this was a key reason for developing problem-based learning (PBL) involving the process of EBP (see the previous list of steps). Students are placed in small groups of five to seven, together with a tutor who is trained in group process as well as in EBP skills, such as posing well-structured questions and searching effectively and efficiently for related literature. This kind of PBL in medicine has spread throughout the world. (PBL focusing on the process of EBP differs from similarly named approaches.) A problem focus grounds content squarely on practice concerns, highlights key decisions and related questions and options, and links curriculum areas in a manner that reflects everyday practice needs. PBL provides repeated opportunities to confront information needs. (Focusing on client concerns in no way implies that client strengths are overlooked; a social worker who did not take advantage of personal and environmental resources would be a poor problem solver.) Enhancement of self-directed learning skills is a key goal. Examples of self-evaluation questions include the following: (a) Am I asking well-structured questions? (b) Am I searching at all? (c) Do I know the best sources of current evidence for the decisions I make? and (d) Am I critically appraising external evidence at all?

Origins of Evidence-Based Decision Making

EBP and evidence-based health care arose because of troubling gaps between available knowledge and what is used by professionals. Gray (2001a) suggested that "at present, the process is marked by the following characteristics:"

1. Overenthusiastic adoption of interventions of unproven efficacy or even proven ineffectiveness;

2. Failure to adopt interventions that do more good than harm at a reasonable cost;

3. Continuing to offer interventions or services demonstrated to be ineffective;

4. Adoption of interventions without adequate preparation such that the benefits demonstrated in a research setting cannot be reproduced in the ordinary service setting; and

5. Wide variation in the rates at which interventions are adopted or discarded. (p. 366)

Evidence-based decision making arose as an alternative to authority-based decision making, in which consensus, anecdotal experience, or tradition is relied on to make decisions. Although the term is misleading if taken to imply only that decisions are based on evidence of the effectiveness of interventions, its use calls attention to the fact that available evidence may not be used or the current state of ignorance may not be shared with clients. It is hoped that professionals who consider related research findings regarding decisions and inform clients about them provide more effective and ethical care than those relying on criteria such as anecdotal experience, available resources, or popularity.

Sackett et al. (2000) suggested that the rapid spread of evidence-based medicine resulted from four realizations, made possible by five recent developments. The realizations include (a) practitioners' need for valid information about the decisions they make; (b) the inadequacy of traditional sources for acquiring this information (e.g., because they are out of date, frequently wrong, overwhelming in their volume, or variable in their validity); (c) the gap between our assessment skills and clinical judgment, which increase with experience, and our up-to-date knowledge and performance, which decline; and (d) our lack of time to locate, appraise, and integrate this evidence (p. 2). Gaps were increasing between information available on the Internet that could be of value to clients and clinicians in making informed decisions and what was actually drawn on. Five developments allowed improvement in this state of affairs:

(1) The development of strategies for efficiently tracking down and appraising evidence (for its validity and relevance);

(2) The creation of systematic reviews and concise summaries of the effects of health care (epitomized by the Cochrane Collaboration);

(3) The creation of evidence-based journals of secondary publication;

(4) The creation of information systems for bringing the foregoing to us in seconds;

(5) The identification and application of effective strategies for life-long learning and improving performance. (Sackett et al., 2000, p. 3)

EBPs

Yet another view uses the term EBP to refer to the use of practice guidelines and treatment manuals claimed to be effective, for example, on the basis of two well-designed randomized controlled trials (see Norcross, Beutler, & Levant, 2006, as well as many Web sites, such as that of the North Carolina Evidence-Based Practice Center [NCEBPC]). Related research has been criticized for (a) selecting clients using the psychiatric classification system (i.e., the Diagnostic and Statistical Manual of Mental Disorders [4th ed., text rev.; DSM-IV-TR]; American Psychiatric Association, 2000), which does not represent the spectrum of clinical problems; (b) assuming there is a specific intervention for a specific disorder, which, some argue, reflects a simplistic biomedical view of client concerns and related factors that ignores research regarding causal factors; (c) using weak control groups; (d) failing to report long-term effects; and (e) ignoring relevant clinical outcomes (see, e.g., Luyten, Blatt, Van Houdenhove, & Corveleyn, 2006; Westen, Novotny, & Thompson-Brenner, 2005). Critics argue that such research overlooks the complexity of client concerns and related factors suggested by empirical research.

Cosmetic (Pseudo-EBP)

Yet another use of the term EBP refers to "business as usual," for example, redubbing incomplete, unrigorous, narrative reviews of research as evidence based.

There is a vast difference between the process and philosophy of evidence-informed practice and the use of treatment manuals and practice guidelines. The process and philosophy of EBP described by the originators are much more radical departures from business as usual compared with the EBPs or empirically-supported treatments (ESTs) approaches. This is the most critical choice that influences selection of educational formats and content. The former is a systemic approach, which has implications for researchers, educators, and administrators, as well as line staff, supervisors, and students. The uncertainty and complexity involved in making decisions are acknowledged as is the importance of being honest about ignorance as well as knowledge. There is a very different tone in literature about EBPs. In the latter, inflated claims, such as "well established," are common; research flaws are often hidden; application problems are underplayed; uncertainties, complexities of clinical concerns, and related factors are often overlooked; and a top-down approach is used that dismisses the importance of clients and practitioners acquiring critical appraisal skills and ignores local circumstances and individual differences in clients (e.g., Wilson & Alexandra, 2005). I have never seen the word ignorance used in this literature.

I suggest that the evolving process and philosophy of EBP as described in original sources (Gray, 1997; Sackett et al., 1997) are more likely to increase the quality of services that clients receive than a view of EBP as using EBPs-ESTs because of its systemic approach involving practitioners and clients as informed participants and its greater rigor of critical appraisal. It is unlikely that a piecemeal approach to change will make headway in integrating ethical, evidentiary, and application concerns, but in some schools this may be all that is possible. A narrow view ignores the process of EBP and related developments designed to help clients and practitioners address application problems, such as inflated claims in published articles (i.e., flaws in peer review).

Controversies Regarding Evidence

Both the origins of EBP and objections to it reflect different views of evidence. A key way in which competing views of EBP differ is in the degree of rigor used in evaluating knowledge claims (e.g., see Djulbegovic, Morris, & Lyman, 2000). When do social workers have enough evidence to recommend a practice or policy or to decide that it does not work (Norcross et al., 2006; Weisburd, Lum, & Yang, 2003)? Do criteria for "having enough" differ in relation to different kinds of decisions? In practice, social workers must often move down a hierarchy of evidence in relation to the rigor of critical appraisal of a claim because most interventions have not been critically tested. Thus, the term "best evidence" could refer to tests that differ greatly in their ability to critically test a claim. Davies (2004) suggested that a broad view of evidence is needed to review policies, including (a) experience and expertise, (b) judgment, (c) resources, (d) values, (e) habits and traditions, (f) lobbyists and pressure groups, and (g) pragmatics and contingencies. Different views of evidence are illustrated by the different conclusions concerning the effectiveness of multisystemic family treatment (Henggeler & Lee, 2003), which is widely touted as effective. On the basis of a critical appraisal of related research following guidelines developed by the Campbell and Cochrane Collaborations, Littell (2005) concluded that such programs have few, if any, significant effects on measured outcomes compared with usual services or alternative treatments (see also Henggeler, Schoenwald, Borduin, & Swenson, 2006; Littell, 2006). Given the history of the helping professions (e.g., inflated claims of effectiveness and harming in the name of helping), is not the most ethical road to make measured, rather than inflated, claims so that professionals are not misled and do not, in turn, mislead clients? Inflated claims obscure uncertainties that, if shared, might influence client decisions. Rigorous and nonrigorous appraisals yield different results; the latter have reported more positive results (e.g., see Antman, Lau, Kupelnick, Mosteller, & Chalmers, 1992; Schulz, Chalmers, Hayes, & Altman, 1995).

Ethical Obligations and Accreditation Standards as Guides to Choosing a View of EBP

Social workers' professional codes of ethics and accreditation standards call for key characteristics of EBP, such as drawing on practice-policy-related research and involving clients as informed participants. Protecting clients from harm is noted as a value in the National Association of Social Workers' Code of Ethics (1999). One of the origins of evidence-based health care was the finding of striking variations in services offered to address a particular problem. Questions naturally arise, such as the following: "Are all variations equally effective?" and "Are some harmful?" EBP involves a systemic approach to improving services, including educating professionals who are lifelong learners, involving clients as informed participants, attending to management practices and policies that influence services, and attending to application challenges, such as the development of strategies for efficiently tracking down and appraising evidence. Only this systemic view allows social workers to honor their ethical obligations to clients (e.g., to involve them as informed participants) and to offer competent services (see Table 2). The philosophy and related process of EBP have implications for all individuals and institutions involved with helping clients, including educators and professional education programs. Research, practice, and educational issues are closely intertwined. For example, poor-quality reviews of research related to practice and policy questions may result in bogus practice guidelines, which result in poor-quality services. EBP encourages a focus on client concerns and hoped-for outcomes and consideration of individual differences in client characteristics and circumstances, including client values and expectations.

EBP encourages transparency of what is done to what effect in all venues of interest, including research and professional education. Transparency calls for blowing the whistle on pseudoscience, fraud, quackery, and professional propaganda (Jacobson, Foxx, & Mulick, 2005; Lilienfeld, Lynn, & Lohr, 2003). Increased transparency will highlight gaps between resources needed to attain hoped-for outcomes and what is used and, thus, may encourage advocacy on the part of clients and professionals for more effective services (e.g., see Domenighetti, Grilli, & Liberati, 1998). It will reveal services that are ineffective, allowing a more judicious distribution of scarce resources (Eddy, 1994). Increased transparency will reveal gaps between contributors to client problems, such as poverty, and interventions used and promoted as valuable. A key contribution of EBP is discouraging inflated claims of knowledge that mislead involved parties and hinder the development of knowledge. EBP calls for candid descriptions of the limitations of research studies and use of methods that critically test questions addressed; it calls for systematic reviews rather than incomplete, unrigorous reviews (e.g., see Cochrane and Campbell Collaboration review protocols). Ignoring practice- and policy-related research findings violates informed consent obligations and may result in wasting money on ineffective services, harming clients in the name of helping them, and forgoing opportunities to attain hoped-for outcomes. EBP involves sharing responsibility for decision making in a supportive context of recognized uncertainty (Katz, 2002). Related research in health care suggests that evidence-informed practice can improve the quality of services (Straus, Ball, Balcombe, Sheldon, & McAlister, 2005).

Obstacles

There are many obstacles to implementing EBP, including practical, political, economic, psychological, and ethical ones (e.g., see Green & Ruff, 2005; Haynes & Haines, 1998). A review (Oxman, Thomson, Davis, & Haynes, 1995) of 102 trials of interventions designed to help health professionals deliver services more effectively and efficiently showed that there are no quick remedies. External barriers include poor quality of research producing biased evidence, publication bias toward positive findings, failure of researchers to present evidence in forms useful to clinicians, and inaccessible libraries. Other barriers noted by Gray (1997, 2001a, 2001b) included out-of-date textbooks, biased editorials and reviews, inability to spot flaws in research, problems in translating data about groups into information relevant to an individual, and insufficient time. Attending to application barriers such as organizational obstacles is a hallmark of EBP. Oxman and Flottorp (1998) suggested three major kinds of barriers: prevailing opinions (e.g., standards of practice, opinion leaders, professional training, and advocacy, e.g., by pharmaceutical companies), practice environment (e.g., financial disincentives, organizational constraints, perception of liability, and client expectations), and knowledge and attitudes (e.g., clinical uncertainty, sense of competence, compulsion to act, and information overload). Gathering information about the frequency and exact nature of application barriers will be useful in planning how to decrease obstacles, and there is considerable literature here (Ely et al., 2002; Gira, Kessler, & Poertner, 2004; Greenhalgh, Robert, MacFarlane, Bate, & Kyriakidou, 2004).

Summary

Evidence-informed practice describes a philosophy and an evolving process designed to help practitioners gain needed information and become lifelong learners. The uncertainty in making decisions is highlighted, efforts are made to decrease it, and clients are involved as informed participants. EBP is as much about the ethics of and pressures on academics and researchers as it is about the ethics of and pressures on practitioners and agency administrators. EBP encourages attention to ethical obligations (to draw on practice- and policy-related literature, to involve clients as informed participants, to focus on outcomes clients value, and to address application obstacles, such as dysfunctional agency cultures). Implications for professional education include encouraging students to acknowledge uncertainty, to appreciate the value of mistakes as a way to "educate their intuition," and to advocate on behalf of clients when agency procedures and policies work against, rather than for, clients. EBP calls for a number of new competencies, such as posing well-structured questions that facilitate an effective, efficient search for related research findings. Even when required skills and knowledge are available, many obstacles remain, and social work educators should help students develop skills for addressing them, for example, by forming coalitions to advocate for needed services and gaining access to needed databases on the job. The process of EBP should forward social justice by encouraging a more judicious distribution of scarce resources: money not wasted on harmful or ineffective services can instead be used to offer effective ones.

Accepted: 02/07

References

Altman, D. G., Schulz, K. F., Moher, D., Egger, M., Davidoff, F., Elbourne, D., et al., for the CONSORT Group. (2001). The revised CONSORT statement for reporting randomized trials: Explanation and elaboration. Annals of Internal Medicine, 134, 663-694.

American Psychiatric Association. (2000). Diagnostic and statistical manual of mental disorders (4th ed., text rev.). Washington, DC: Author.

Antman, E. M., Lau, J., Kupelnick, B., Mosteller, F., & Chalmers, T. C. (1992). A comparison of results of meta-analyses of randomized controlled trials and recommendations of clinical experts: Treatments for myocardial infarction. Journal of the American Medical Association, 268, 240-248.

BMJ Publishing Group. (2002, June). Clinical evidence: The international source of the best available evidence for effective health care (7th issue). London: Author.

Bossuyt, P. M., Reitsma, J. B., Bruns, D. E., Gatsonis, C. A., Glasziou, P. P., Irwig, L. M., et al. (2003). Towards complete and accurate reporting of studies of diagnostic accuracy: The STARD initiative. British Medical Journal, 326, 41-44.

Briggs, H. E., & Rzepnicki, T. L. (Eds.). (2004). Using evidence in social work practice: Behavioral perspectives. Chicago: Lyceum.

Chalmers, I. (2004). Well-informed uncertainties about the effects of treatment. British Medical Journal, 328, 425-426.

Council on Social Work Education. (2001). Educational policy and accreditation standards. Alexandria, VA: Author.

Davies, P. (2004, February). Is evidence-based government possible? Jerry Lee Lecture, 4th Annual Campbell Collaboration Colloquium, Washington, DC.

Davis, D. A., Thomson, M. A., Oxman, A. D., & Haynes, R. B. (1995). Changing physician performance: A systematic review of the effect of continuing medical education strategies. Journal of the American Medical Association, 274, 700-705.

Djulbegovic, B., Morris, L., & Lyman, G. H. (2000). Evidentiary challenges to evidence-based medicine. Journal of Evaluation in Clinical Practice, 6(2), 99-109.

Domenighetti, G., Grilli, R., & Liberati, A. (1998). Promoting consumers' demand for evidence-based medicine. International Journal of Technology Assessment in Health Care, 14(1), 97-105.

Eddy, D. M. (1994). Principles for making difficult decisions in difficult times. Journal of the American Medical Association, 271, 1792-1798.

Ely, J. W., Osheroff, J. A., Ebell, M. H., Chambliss, M. L., Vinson, D. C., Stevermer, J. J., et al. (2002). Obstacles to answering doctors' questions about patient care with evidence: Qualitative study. British Medical Journal, 324, 710-718.

Fox, R. C., & Swazey, J. P. (1974). The courage to fail: A social view of organ transplants and dialysis. Chicago: University of Chicago Press.

Gambrill, E. (2003). Evidence-based practice: Sea change or the emperor's new clothes? Journal of Social Work Education, 39, 3-23.

Gambrill, E. (2006). Social work practice: A critical thinker's guide (2nd ed.). New York: Oxford University Press.

Gibbs, L. (2003). Evidence-based practice for the helping professions. Pacific Grove, CA: Brooks/Cole.

Gira, E. C., Kessler, M. L., & Poertner, J. (2004). Influencing social workers to use research evidence in practice: Lessons from medicine and the allied health professions. Research on Social Work Practice, 14, 68-79.

Gray, J. A. M. (1997). Evidence-based health care: How to make health policy and management decisions. New York: Churchill Livingstone.

Gray, J. A. M. (2001a). Evidence-based health care: How to make health policy and management decisions (2nd ed.). New York: Churchill Livingstone.

Gray, J. A. M. (2001b). Evidence-based medicine for professionals. In A. Edwards & G. Elwyn (Eds.), Evidence-based patient choice: Inevitable or impossible? (pp. 19-33). New York: Oxford University Press.

Green, M. L., & Ruff, T. R. (2005). Why do residents fail to answer their clinical questions? A qualitative study of barriers to practicing evidence-based medicine. Academic Medicine, 80, 176-182.

Greenhalgh, T. (2006). How to read a paper: The basics of evidence based medicine (3rd ed.). London: BMJ Books.

Greenhalgh, T., Robert, G., MacFarlane, F., Bate, P., & Kyriakidou, O. (2004). Diffusion of innovations in service organizations: Systematic review and recommendations. The Milbank Quarterly, 82, 581-629.

Guyatt, G. H., & Rennie, D. (2002). Users' guides to the medical literature: A manual for evidence-based clinical practice. Chicago: American Medical Association.

Haynes, B., & Haines, A. (1998). Barriers and bridges to evidence-based clinical practice. British Medical Journal, 317, 273-276.

Haynes, R. B., Devereaux, P. J., & Guyatt, G. H. (2002). Clinical expertise in the era of evidence-based medicine and patient choice [Editorial]. ACP Journal Club, 136, A11.

Henggeler, S. W., & Lee, T. (2003). Multisystemic treatment of serious clinical problems. In A. E. Kazdin & J. R. Weisz (Eds.), Evidence-based psychotherapies for children and adolescents (pp. 301-324). New York: Guilford.

Henggeler, S. W., Schoenwald, S. K., Borduin, C. M., & Swenson, C. C. (2006). Methodological critique and meta-analysis as Trojan horse. Children and Youth Services Review, 28, 447-457.

Howard, M. O., McMillen, C. J., & Pollio, D. E. (2003). Teaching evidence-based practice: Toward a new paradigm for social work education. Research on Social Work Practice, 13, 234-259.

Jacobson, J. W., Foxx, R. M., & Mulick, J. A. (2005). Controversial therapies for developmental disabilities: Fad, fashion, and science in professional practice. Mahwah, NJ: Erlbaum.

Katz, J. (2002). The silent world of doctor and patient. Baltimore, MD: Johns Hopkins University Press.

Kirk S. A., & Reid, W. J. (2002). Science and social work: A critical appraisal. New York: Columbia University Press.

Lilienfeld, S. O., Lynn, S. J., & Lohr, J. M. (2003). Science and pseudoscience in clinical psychology. New York: Guilford.

Littell, J. (2005). Lessons from a systematic review of effects of multisystemic therapy. Children and Youth Services Review, 27, 445-463.

Littell, J. H. (2006). The case for multisystemic therapy: Evidence or orthodoxy? Children and Youth Services Review, 28, 458-472.

Luyten, P., Blatt, S. J., Van Houdenhove, B., & Corveleyn, J. (2006). Depression research and treatment: Are we skating to where the puck is going to be? Clinical Psychology Review, 26, 985-999.

Moher, D., Cook, D. J., Eastwood, S., Olkin, I., Rennie, D., & Stroup, D. F. (1999). Improving the quality of reports of meta-analyses of randomised controlled trials: The QUOROM statement. Quality of reporting of meta-analyses. Lancet, 354, 1896-1900.

Mullen, E. J., Shlonsky, A., Bledsoe, S., & Bellamy, J. L. (2005). From concept to implementation: Challenges facing evidence-based social work. Evidence & Policy, 1, 61-84.

National Association of Social Workers. (1999). Code of ethics. Silver Spring, MD: NASW Press.

Norcross, J. C., Beutler, L. E., & Levant, R. F. (Eds.). (2006). Evidence-based practices in mental health: Debate and dialogue on the fundamental questions. Washington, DC: American Psychological Association.

Oxman, A. D., & Flottorp, S. (1998). An overview of strategies to promote implementation of evidence-based health care. In C. Silagy & A. Haines (Eds.), Evidence-based practice in primary care (pp. 91-109). London: BMJ Books.

Oxman, A. D., Thomson, M. A., Davis, D. A., & Haynes, R. B. (1995). No magic bullets: A systematic review of 102 trials of interventions to improve professional practice. Canadian Medical Association Journal, 153, 1423-1431.

Paul, R. (1993). Critical thinking: What every person needs to know to survive in a rapidly changing world (3rd ed.). Santa Rosa, CA: Foundation for Critical Thinking.

Paul, R. W., & Elder, L. (2004). Critical thinking: Tools for taking charge of your professional and personal life. Upper Saddle River, NJ: Prentice-Hall.

Reid, W. J. (2001). The role of science in social work: The perennial debate. Journal of Social Work, 1, 273-293.

Rosen, A., & Proctor, E. K. (2002). Standards for evidence-based social work practice. In A. R. Roberts & G. J. Greene (Eds.), The social worker's desk reference (pp. 743-747). New York: Oxford University Press.

Sackett, D. L., Richardson, W. S., Rosenberg, W., & Haynes, R. B. (1997). Evidence-based medicine: How to practice and teach EBM. New York: Churchill Livingstone.

Sackett, D. L., Rosenberg, W. M. C., Gray, J. A. M., Haynes, R. B., & Richardson, W. S. (1996). Evidence-based medicine: What it is and what it isn't. British Medical Journal, 312, 71-72.

Sackett, D. L., Straus, S. E., Richardson, W. S., Rosenberg, W., & Haynes, R. B. (2000). Evidence-based medicine: How to practice and teach EBM (2nd ed.). New York: Churchill Livingstone.

Schulz, K. F., Chalmers, I., Hayes, R. J., & Altman, D. G. (1995). Empirical evidence of bias: Dimensions of methodological quality associated with estimates of treatment effects in controlled clinical trials. Journal of the American Medical Association, 273, 408-412.

Straus, S. E., Ball, C., Balcombe, N., Sheldon, J., & McAlister, F. A. (2005). Teaching evidence-based medicine skills can change practice in a community hospital. Journal of General Internal Medicine, 20, 340-343.

Straus, S. E., Richardson, W. S., Glasziou, P., & Haynes, R. B. (2005). Evidence-based medicine: How to practice and teach EBM (3rd ed.). New York: Churchill Livingstone.

Thomson O'Brien, M. A., Freemantle, N., Oxman, A. D., Wolf, F., Davis, D. A., & Herrin, J. (2003). Continuing education meetings and workshops: Effects on professional practice and health care outcomes [Cochrane review]. Cochrane Database of Systematic Reviews, Issue 1.

Thyer, B. A., & Kazi, M. A. F. (Eds.). (2004). International perspectives on evidence-based practice in social work. Birmingham, England: Venture Press.

Weisburd, D., Lum, C. M., & Yang, S.-M. (2003). When can we conclude that treatments or programs "don't work"? Annals of the American Academy of Political and Social Science, 587, 31-48.

Westen, D., Novotny, C. M., & Thompson-Brenner, H. (2005). EBP ≠ EST: Reply to Crits-Christoph et al. (2005) and Weisz et al. (2005). Psychological Bulletin, 131, 427-433.

Wilson, D., & Alexandra, L. (2005). Guide for child welfare administrators on evidence-based practice. Washington, DC: National Association of Public Child Welfare Administrators, American Public Human Services Association.

Eileen Gambrill

University of California, Berkeley

Eileen Gambrill is Hutto Patterson Professor of Child and Family Studies, School of Social Welfare, University of California, Berkeley.

Address correspondence to Eileen Gambrill at School of Social Welfare, University of California, Berkeley, 120 Haviland Hall, Berkeley, CA 94720; e-mail: gambrill@berkeley.edu.

TABLE 1. Examples of Competencies Related to EBP

1. Identify information needs related to decisions.

2. Pose well-structured questions related to information needs.

3. Efficiently and effectively track down research findings related to information needs.

4. Critically appraise different kinds of research reports.

5. Accurately determine the extent to which research findings apply to a particular client (at whatever system level).

6. Consider the values and expectations of clients when making decisions.

7. Involve clients as informed participants, including an accurate description of services to be provided and their risks and benefits as well as alternatives and their risks and benefits.

8. Accurately describe the evidentiary status of recommended services.

9. Use services that have been critically tested and found to maximize the likelihood of attaining hoped-for outcomes when ethical. (See number 6.)

10. Select service providers who use evidence-informed services (i.e., they have been found via rigorous appraisal to help clients attain outcomes sought).

11. Evaluate progress in a timely, valid manner.

12. Use valid assessment methods that maximize the likelihood of choosing services likely to result in hoped-for outcomes.

13. Avoid making inflated claims about the effectiveness of services (e.g., there is no evidence to support such claims).

14. Identify human service propaganda.

15. Readily acknowledge deficiencies in current background knowledge.

16. Involve concerned others to alter agency practices that pose an obstacle to high-quality services.

TABLE 2. Contributions of Evidence-Based Practice to Honoring Ethical Obligations (each entry lists the ethical obligation, followed by the contribution of EBP)

Professional helpers

Help clients and avoid harm: Encourage use of and facilitate access to practice- and policy-related research findings to maximize the likelihood of success and minimize the likelihood of harm.

Maximize autonomy/self-determination: Involve clients as informed participants regarding risks and benefits of both recommended methods and alternatives. Accurately describe the evidentiary status of recommended methods and alternatives.

Respect and integrity: Consider client values and preferences. (See above also.)

Competence: Have and use the knowledge and skills required to provide services that maximize success; keep up to date with practice-related research findings.

Accountability: Arrange for accurate ongoing feedback about progress.

Promotion of social justice: Advocate for changes in social conditions that contribute to personal problems.

Lifelong learning: Develop and use tools and processes, such as problem-based learning, that help practitioners become lifelong learners who keep up to date with practice- and policy-related research and share this with clients.

Researchers

Accurately describe research findings: Be honest brokers of knowledge and ignorance.

Use research methods that critically test questions posed: Encourage a good match between the research methods used and the questions pursued.

Involve clients in the design and critique of research: Attend to outcomes of value to clients.

Educators

Help students to become lifelong learners: Use educational formats that create lifelong learners.

Be honest, up-to-date brokers of knowledge and ignorance: Accurately inform students regarding both preferred and alternative well-argued views and related research findings.

Involve students as informed participants: Accurately describe biases, special interests, and scope of knowledge.

Competence: Possess knowledge of and effectively transmit up-to-date research findings regarding vital practice and policy questions.

Help students to acquire decision-making skills that maximize help and minimize harm to clients: Honor social work values of service and avoiding harm.
