INTEGRATING EVIDENCE-BASED PRACTICE AND SOCIAL WORK FIELD EDUCATION

Tonya Edmond Washington University

Deborah Megivern Washington University

Cynthia Williams Washington University

Estelle Rochman Washington University

Matthew Howard University of Michigan-Ann Arbor

The social work academic community is currently considering and critiquing the idea of evidence-based practice (EBP). Given the vital part that practicum education plays in the social work profession, understanding the views of field instructors on this subject is essential. The George Warren Brown School of Social Work at Washington University surveyed 283 field instructors within 180 agencies and found that the majority (87%, N=235) viewed EBP as a useful practice idea. However, most of the indicators employed to assess the use of scientific evidence in social work practice revealed that such use occurs infrequently. A lack of time was reported as the greatest obstacle.

RECENTLY, the social work academic community has been considering and critiquing the idea of evidence-based practice (EBP), an important paradigm shift designed to promote the consistent use of scientifically validated information and effective interventions in social work practice (Cournoyer & Powers, 2002; Gambrill, 2003; Gilgun, 2005; McNeece & Thyer, 2004; Mullen & Streiner, 2004; Rosen, 2003; Thyer, 2002). Evidence-based practice may be thought of as a process undertaken by professionals wherein the scientific status of potential interventions is investigated and a thorough explication of the results is shared with clients, so that practitioner and client together can select the most appropriate steps for addressing a specific problem (Franklin & Hopson, 2004; Gambrill, 1999; Kessler, Gira, & Poertner, 2005).

Journal of Social Work Education, Vol. 42, No. 2 (Spring/Summer 2006). © Copyright 2006, Council on Social Work Education, Inc. All rights reserved.

First introduced in medicine and allied health professions, EBP has been advocated in social work as an alternative to "authority-based practice," or practice based solely on the expertise and experience of practitioners (Gambrill, 1999, 2003; Gibbs & Gambrill, 2002; Upshur & Tracy, 2004). Preliminary research suggests that EBP-trained medical professionals provide higher-quality and more effective services than those who rely on traditional, expertise-based methods (Choudry, Fletcher, & Soumerai, 2005; Norman & Eva, 2005). For example, research has shown that practitioners do not automatically learn from experience and may be prone to relying on obsolete or ineffective interventions without the introduction of strategies to advance professional knowledge and skill development (Batalden, 2001; Bickman, 1999, 2002; Norman & Eva, 2005; Wakefield & Stuart, 1996). Training that emphasizes EBP offers practitioners a set of skills that supports lifelong knowledge development, while more traditional training (e.g., case consultation with supervisors, colleagues, or faculty) is more likely to teach theory and skills that become outdated over time (Batalden, 2001; Coomarasamy & Khan, 2004; Eddy, 2005; Gibbs & Gambrill, 2002; Zlotnik & Galambos, 2004).

The push toward scientifically supported interventions, and away from practices based primarily on practitioners' ideology or preferences, has been driven by internal professional and ethical concerns about the effectiveness of social work practice (Gilgun, 2005; Perez, 1999; Powell, 2003); external pressures, such as demands for service accountability from government (Goldman & Azrin, 2003; Petrosino, Boruch, Soydan, Duggan, & Sanchez-Meca, 2001; Raines, 2004); and funding sources (e.g., requiring that treatments have a demonstrated evidence base for reimbursement) (Fox, 2005; Steinberg & Luce, 2005). Numerous observers have concluded that EBP has become institutionalized throughout health, education, and social services as ever-stronger infrastructure is developed to support it (Kessler et al., 2005; Petrosino et al., 2001; Steinberg & Luce, 2005).

Barriers to EBP

The movement to inform social work practice using scientific research and evaluation is not new (Fischer, 1973; Rosen, 1996). Prior to the development of EBP within social work, many proclaimed the need for practitioners to use scientific methods to evaluate their practice, while keeping current with the latest innovations from research (Kirk, 1999; Thyer, 1996; Whittaker, 2002). Some have suggested that EBP is the natural evolution of thinking about the scientific practitioner (Steinberg & Luce, 2005; Thyer, 2002). However, while few social workers would discount the importance of research innovations, the actual utilization of scientific research in everyday practice faces many barriers (see Mullen, Shlonsky, Bledsoe, & Bellamy, 2005, for a full review of barriers to the implementation of EBP). Just a few examples of such barriers include lack of available evidence, uneven access to research, practitioner resistance, and constraints on providers' time (Gibbs & Gambrill, 2002; Gira, Kessler, & Poertner, 2004; Raines, 2004; Rosen, 2003; Wambach, Haynes, & White, 1999).

Putting research into practice appears to be difficult for professionals across many disciplines (Cabana et al., 1999; Glasgow, Lichtenstein, & Marcus, 2003; Humphris, Littlejohns, Victor, O'Halloran, & Peacock, 2000; National Institute of Medicine, 2001; Persons, 1995; Rycroft-Malone et al., 2002). When the medical field began emphasizing EBP in the early 1990s, analysts speculated that less than half of medical practice was based on scientific evidence (Eddy, 2005; Hunt, 2001). Likewise, Beutler (2000) estimated that most interventions in clinical psychology have not been based on solid scientific evidence. Social work interventions are even less likely to be based on a review of evidence than those from medicine or psychology (Gambrill, 2001; Proctor & Rosen, 2004). Furthermore, researchers have found that most social workers do not consistently use research to inform their practice (Mullen et al., 2005).

The challenges of translating research into practice have created several ongoing tensions between researchers and practitioners over: (1) the definition of evidence (Crisp, 2004; Raines, 2004; Shlonsky & Gibbs, 2004; Witkin & Harrison, 2001); (2) the implementation of evidence as "best practices" (Ferlie, Fitzgerald, & Wood, 2000; Gonzales, Ringeisen, & Chambers, 2002; Hoagwood, 2002; Hoge, Huey, & O'Connell, 2004); and (3) the development of empirically based practice guidelines (Howard & Jenson, 1999; Jackson, 1999; Kirk, 1999; Nathan, 1998; Richey & Roffman, 1999; Williams & Lanigan, 1999).

Over the past decade, researchers in the EBP "revolution" (Cournoyer & Powers, 2002, p. 798) have tried to systematically resolve some of the barriers that limit broader EBP implementation (Addis, 2002; Addis, Wade, & Hatgis, 1999; Gellis & Reid, 2004; Grol & Grimshaw, 1999; Haynes & Haines, 1998; Mullen et al., 2005). The underutilization of research findings by practitioners in their everyday practice has kindled extensive efforts to increase EBP across the helping professions (Eddy, 2005; Gilgun, 2005; Gira, Kessler, & Poertner, 2004; Shlonsky & Gibbs, 2004). Most of the major professional organizations and federal research funding agencies, such as the National Institutes of Health, joined the EBP movement by endorsing improved translation of research into practice to reduce the gap between evidence-based "best practices" and usual treatment (Gonzales et al., 2002; Rycroft-Malone et al., 2002; Thyer, 2002). For example, a 1998 National Institute of Mental Health report spurred the investigation of the "best practices" for translating research into practice.

One challenge for EBP is how to organize and disseminate new information from research findings into manageable, user-friendly summaries. Medical researchers responded to this challenge by developing the Cochrane Collaboration, an Internet-based library of rigorously conducted "systematic reviews" of research evidence on specific medical topics (Guyatt, Sinclair, Cook, & Glasziou, 1999). Social scientists have responded in turn with the Campbell Collaboration, intended to be an Internet library of systematic reviews of existing evidence on social and educational interventions (Petrosino et al., 2001). However, at this point, the Campbell Collaboration is producing more plans for systematic reviews than actual completed products (Goldman & Azrin, 2003; Mullen et al., 2005; Petrosino et al., 2001).

Encouraging social work practitioners to rely on evidence to guide their practice is made difficult by the current paucity of scientific research underpinning many social work interventions (Crisp, 2004; Grayson & Gomersall, 2003; Kessler et al., 2005; MacDonald, 1998; Rosen, Proctor, & Staudt, 2003). Furthermore, EBP places a premium on randomized clinical trials (RCTs) to validate practices and demonstrate their efficacy, but feasibility issues and ethical constraints limit how often social work interventions have been investigated using this experimental method (Fraser, 2003; Gilgun, 2005). While RCTs continue to be the "gold standard" for scientific evidence, other types of research are being given more consideration to validate the effectiveness of interventions in the field (Crisp, 2004; Kessler et al., 2005; Upshur & Tracy, 2004; Victora, Habicht, & Bryce, 2004). Burgeoning interest in EBP has produced plans for new lines of research in previously understudied fields that will provide scientific evidence in the future (Goldman & Azrin, 2003; Zlotnik & Galambos, 2004).

Practitioners in health, education, and social welfare have expressed some reluctance to adopt EBP for fear it would lead to mechanistic "cookbook-style" interventions that fail to appreciate the tacit knowledge developed through "practice wisdom" (Addis et al., 1999; Klein & Bloom, 1995; Timmermans & Mauck, 2005). Proponents of EBP point out that practice wisdom, or the accumulated experience of practitioners, is not disregarded by evidence-based practitioners (Goldman & Azrin, 2003; Zayas, Gonzales, & Hanson, 2003), but instead is greatly valued in the difficult tasks of matching suitable interventions with the idiosyncratic circumstances of individual clients and in evaluating the effectiveness of a specific implementation of an intervention (Eddy, 2005; Gibbs & Gambrill, 2002; Raines, 2004; Rosen, 2003).

Education and Training for EBP

Education and training programs across the helping professions have been identified as lagging behind the pace of developments in the field, including in responding to the call for increased EBP training (Hoge, Huey, & O'Connell, 2004; Raines, 2004). If the social work profession is to adopt EBP as the guiding methodology for intervention, the education of social workers will need to be strategically changed so that all aspects of coursework, field practicum, and professional development include training in the steps of EBP, including defining specific practice questions, locating relevant scientific information, critical appraisal of the evidence, and evaluation of practice (Howard, McMillen, & Pollio, 2003). Experts in adult education argue that experiential learning provides the most influential long-term knowledge and skill development (Knight, 2001; Miller, Kovacs, Wright, Corcoran, & Rosenblum, 2005). In particular, field education has been identified by master of social work graduates as the most memorable part of formal training for "the development of practice-based skills and for socializing students into the professional role" (Bogo, Regehr, Hughes, Power, & Globerman, 2004, p. 417).

Almost a decade ago, Schneck (1995) wrote, "Field education must also be viewed in the larger context of advancing the quality of social work practice itself" (p. 8). Field experience provides students with the opportunity to apply what they learn in classroom instruction, including the critical EBP-related ability to integrate theory and practice (Berg-Weger & Birkenmaier, 2000; Bogo & Globerman, 1999; Bogo & Vayda, 1998; Knight, 2001; Mishna & Rasmussen, 2001; Power & Bogo, 2002). Students often struggle to apply theories from class when working with actual clients. They highly value observing experienced social workers model their practice and decision-making skills, and then having a chance to test their own skills with constructive feedback (Knight, 2001; Fortune, McCarthy, & Abramson, 2001).

There is a pervasive sense among field education experts that this aspect of social work training is undervalued (Knight, 2001; Reisch & Jarman-Rohde, 2000). In a review of research articles, Lager and Robbins (2004) found that less than 1% of social work articles were dedicated to field education. Yet, researchers studying the best methods for implementing EBP have identified field education as essential for students to learn how to apply EBP skills in real-world settings (Hatala & Guyatt, 2002). Thus, understanding how evidence-based practice is understood and taught by social work field instructors is vitally important for assuring that students will receive EBP training in classroom and field settings.

In May of 2001, the George Warren Brown (GWB) School of Social Work became the first school within the profession to adopt evidence-based practice as a guiding focus of its curriculum. As explicated by Howard et al. (2003), "Curriculum modifications were instituted such that formal instruction in the methods critical to evidence-based practice were integrated throughout the foundation and concentration-level coursework" (p. 6). In that article, Howard et al. (2003) define evidence-based practice, describe GWB's process of adopting and implementing evidence-based practice instruction, and discuss potential limitations associated with adopting EBP. Among the concerns noted was the potential adverse effect it might have on our relationships with members of the practice community and adjunct faculty. They cautioned that

schools of social work developing evidence-based practice curricula will need to carefully consider service issues relating to field education.... Schools of social work have typically, and justifiably, regarded their field instructors as practice experts. To move away from that assumption may compromise relationships with agencies that are valued by the school and its students. (pp. 21-22)

This concern about maintaining positive relationships between schools and field sites merits further consideration, because field instructors and agencies provide instruction for students voluntarily, and typically without work release time or compensation (Globerman & Bogo, 2002, 2003). Meanwhile, potential field instructors are dealing with greater time and resource constraints precipitated by recent trends in social service delivery (Lager & Robbins, 2004; Reisch & Jarman-Rohde, 2000). Demands on social workers have become so significant that some have questioned how effectively they can be expected to serve as field educators (Dettlaff & Wallace, 2002; Globerman & Bogo, 2003). These circumstances have led Reisch and Jarman-Rohde (2000) to conclude that future students will be expected to learn more independently, even as their cases become more complicated. Of course, there are benefits to individuals and organizations for accepting students, such as a ready-made pool of trained potential employees, extra workers who can take clients from bloated caseloads, and innovations brought from the university (Globerman & Bogo, 2003).

Bogo and Globerman (1999) have studied interorganizational relationships between field agencies and social work schools, and they have identified four primary issues affecting social workers' willingness to become field educators: (1) commitment to education, (2) organizational resources/support, (3) effective interpersonal relationships, and (4) the nature of collaborative relationships (reciprocal benefits). Historically, field educators reported being motivated to accept students primarily by their personal valuation of generative activities, but more recent research has found that external factors are more commonly reported, including organizational valuation of education, expectations of employing agencies, and recognition from the university and agency (Globerman & Bogo, 2003). Social workers who serve as field educators feel they have something unique to offer students, and they are motivated by the value they feel from schools of social work (Globerman & Bogo, 2003), so EBP must not be seen as a challenge to expertise or as a harsh critique of current practice methods.

Not only are the relationships between the school and field agencies potentially at risk, but so is the relationship between students and field instructors. This relationship has been identified by researchers, students, and alumni as the key component of field education, so potential sources of conflict require diligent action (Bogo, 1993). Research on the sources of field instructor-student conflict is rare, but one study found that differences in beliefs about effective interventions were a significant source of conflict (Giddings, Vodde, & Cleveland, 2003).

Students could potentially be receiving contradictory messages from the school and their field instructor about the relevance of EBP to practicum activities (Mishna & Rasmussen, 2001; Savaya, Peleg-Oren, Stange, & Geron, 2003). If the school does a thorough job of teaching students EBP methods in their fields of interest, they could conceivably be more knowledgeable about such methods than their field instructors. Consequently, field instructors could feel uncomfortable, inadequate, or embarrassed about not knowing the latest and best EBP methods in their practice area, and feel that their credibility and authority have been undermined. And while significant advances have occurred with regard to EBP, especially in areas such as substance abuse (Howard, 2002; Walker, Howard, Walker, Lambert, & Suchinsky, 1995), given the breadth of social work practice issues and client populations, there remain large gaps that might reinforce the notion that practice decisions should be based on tradition and authority. Clearly, strategies for integrating EBP into field education are needed.

Initially, as GWB began implementing EBP into the curriculum, we turned to community advisory boards composed of practitioners, faculty, and students for recommendations about how to integrate EBP into our practicum sites. Our field education program consulted the practicum-advisory committee, which recommended that we begin by conducting a survey of practicum sites and field instructors in an effort to deepen our understanding of their views and attitudes toward EBP, and the degree to which EBP appears to be currently in use within these sites. Thus, we undertook a survey in the fall of 2002 to gather information about the degree to which field instructors supported and used EBP, and had access to and used professional resources to update and strengthen their practice.

Method

Design and Instrument

To survey MSW social work practicum site supervisors, known as "field instructors" in our system, the authors constructed a 25-item self-administered questionnaire that contained both open- and closed-ended questions designed to collect information regarding EBP, resource utilization, professional title, credentials, and practice area. The closed-ended questions were a combination of dichotomous responses (i.e., yes/no), 5-point Likert-type scales ranging from strongly agree to strongly disagree, and 4-point scales ranging from always to never.

The definition of EBP was drawn from in-house documents written by GWB faculty and summarized for the respondents to enhance consistency in interpretation of the term. The two faculty members who wrote these documents reviewed drafts of the questionnaire prior to its administration. The questionnaire was pilot-tested with four practicum field instructors, reviewed by the GWB practicum-advisory committee, and revised based on their critique. The survey was confidential rather than anonymous, and this study was reviewed and exempted by our university's Human Subjects Review committee.

Sample

The sampling frame for the survey consisted of a list of 761 affiliated field instructors located within 418 local, national, and international agencies that had previously been approved as practicum sites. Questionnaires were mailed to everyone listed in the sampling frame, along with a cover letter explaining the purpose of the survey and a self-addressed stamped return envelope. Of the 761 questionnaires mailed, 161 (21%) were returned as undeliverable or addressee unknown, or the agency indicated that the identified field instructor was no longer there or was unable to provide practicum opportunities. Consequently, the list of potential respondents was reduced to 600 field instructors within 399 agencies. The initial mailing yielded only a 13% response rate (78 returned questionnaires). The authors employed several follow-up measures in an effort to obtain a better response rate, which increased the response rate of field instructors to 47% and included data from 180 different agencies (45%).
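The response-rate arithmetic reported above can be checked directly. This minimal sketch uses only the counts given in the text; the final respondent total (283) is taken from the abstract rather than from this paragraph:

```python
# Survey response-rate arithmetic, using counts reported in the article.
mailed = 761            # questionnaires sent to the sampling frame
undeliverable = 161     # returned undeliverable or otherwise ineligible
eligible = mailed - undeliverable      # 600 potential respondents

initial_returns = 78    # returns from the initial mailing
final_returns = 283     # total respondents (from the abstract)

print(f"Initial response rate: {initial_returns / eligible:.0%}")  # 13%
print(f"Final response rate:   {final_returns / eligible:.0%}")    # 47%

# Agency-level participation: 180 of the 399 remaining agencies responded.
print(f"Agency response rate:  {180 / 399:.0%}")                   # 45%
```

These reproduce the 13%, 47%, and 45% figures reported in the text.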

Data Analysis

As this is a descriptive study, the statistical analysis consisted mainly of descriptive statistics, frequencies, and chi-square tests. In addition, t tests and ANOVAs were run, and post hoc analyses were conducted when appropriate to determine which means differed significantly.

Results

Description of Respondents

The descriptive characteristics of interest in this project were professional title, credentials, practice area, and attendance at a field education conference on EBP. A third of the respondents identified themselves as administrators/managers and nearly a quarter as clinical social workers. The other respondents identified as medical social workers (9%, n=25), school social workers (8%, n=21), case managers (4%, n=11), researchers (1%, n=4), policy analysts (1%, n=3), or some other type of social worker or professional from a different discipline (20%, n=55). Although this last category represents one fifth of the sample, the responses were unfortunately too varied to analyze separately or to make comparisons across disciplines. For example, it included seven LPC/counselors, five attorneys, four community workers, two gerontologists, one psychologist, one psychiatrist, and one medical doctor.

The majority of the respondents (58%, n=161) were Licensed Clinical Social Workers (LCSW); 15% (n=42) had obtained Academy of Certified Social Workers (ACSW) certification; 29% (n=81) classified themselves as "other," which included credentials outside of the social work profession; 1.5% (n=4) were certified as a Diplomate in Clinical Social Work (DCSW) by NASW; and 1% (n=3) were Qualified Clinical Social Workers (QCSW). The total exceeds 100% since it is possible to hold more than one of these credentials. Of the field instructors who responded to the question regarding credentials (n=271), 18% (n=49) indicated that they had none of those listed above. All of the credential categories were collapsed into dichotomous yes/no categories to allow for the use of chi-square tests to examine whether having or not having credentials was associated with specific questionnaire responses.
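The analysis strategy described above — collapsing a credential into a yes/no indicator and cross-tabulating it against a yes/no questionnaire item — can be sketched as follows. The counts in this example are hypothetical, not the study's data, and the Pearson chi-square statistic is computed by hand rather than with a statistics package:

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table [[a, b], [c, d]]."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i in range(2):
        for j in range(2):
            # Expected count under independence of rows and columns.
            expected = row_totals[i] * col_totals[j] / grand
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

# Hypothetical cross-tabulation:
# rows = holds LCSW (yes/no); columns = answered "yes"/"no" to a survey item.
table = [[40, 20],   # LCSW holders
         [25, 35]]   # non-holders
print(round(chi_square_2x2(table), 2))  # → 7.55
```

The resulting statistic would then be compared against the chi-square distribution with one degree of freedom to judge whether credential status and the response are associated.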

In an effort to be comprehensive, 35 different practice areas were listed, and respondents were asked to select their primary practice area. There was also the option of selecting "other," which generated the largest response, with 25% (n=71) of respondents selecting this option. Given the wide range of practice areas selected, it was not possible to meaningfully collapse categories and analyze differences in responses based on area of practice. A list of respondents' practice areas has been summarized in Table 1.

In the fall of 2001, after adopting EBP as a teaching theme, GWB held a conference entitled "Practice Makes Perfect: The Evidence-Based Route to Your Best Social Work Outcomes?" as an avenue to begin working with our practicum sites around the idea of employing EBP. Just 16% (n=44) of the respondents had attended that conference, which means that, in all likelihood, at the time of the survey only a small percentage of the respondents were aware of the fact that GWB had adopted EBP as a teaching theme. There were no significant differences between those who attended the conference and those who did not in terms of their views on the usefulness of EBP, or in their current level of implementing it in their practice.

Evidence-Based Practice

The primary interest in this survey was in the degree of support for and use of EBP by GWB practicum field instructors, the assessment of which seemed crucial given the influence that practicum instructors have on MSW students. A definition for EBP was constructed by summarizing information provided by Howard et al. (2003), and in the questionnaire read as follows:

Evidence-Based Practice is the conscientious and judicious use of current best practice in decision-making about interventions at all system levels. Conscientious includes both consistently applying evidence, and continuing to learn as new evidence becomes available. Judicious includes balancing client characteristics, preferences, and life circumstances against relevant research/practice guidelines (expert consensus, research-based treatment recommendations).

Evidence-Based Practice involves four steps: (1) formulating specific answerable questions regarding practice situations and identifying practice information needed, (2) finding and critically appraising the best scientific evidence, (3) applying the practice-relevant scientific evidence in the treatment process, and (4) evaluating the effectiveness of the resulting practice.
