Acknowledgments: The activities described in this article were primarily supported by funding from the Edna McConnell Clark Foundation. The authors are also grateful to Sister Paulette LoMonaco and EBP workgroup members Rachel Forsyth, Kathy Gordon, Ellen O'Hara-Cicero, Lina Pasquale, Valerie Segal, Kerrie Thompson, Diana Torres, and Jennifer Zanger for their contributions to the process described in this article. We thank Elise Cappella, Elizabeth DiLuzio, and Kathy Gordon for their comments on drafts of this article.

Conversations about how to promote the greater use of research and empirically supported work within social service organizations are often missing a critical voice: the staff working within these organizations. Supporting this view, a recent report from the William T. Grant Foundation, marking its sixth year of promoting investigations into the use of research evidence, highlighted gaps in the understanding of user perspectives and contexts (Maciolek, 2015). Additionally, Tseng (2012) has observed that much of the current effort to increase research utilization is dominated by a "producer-push" orientation.¹ Within this orientation, strategies to increase the use of research evidence focus on producing rigorously evaluated models, increasing communication about these models (e.g., through online clearinghouses), and defining research-driven steps to implementation, with less attention paid to the processes within a social service organization that can promote or hinder research uptake.

The value of examining research utilization from the perspective of staff within social service organizations is further supported by recent articles on evidence-based practice (EBP) that emphasize the impact of organizational context and culture on successful utilization (Ehrhart, Aarons, & Farahnak, 2014; Gray, Joy, Plath, & Webb, 2015; Johnson-Motoyama & Austin, 2008). For example, Aarons and Palinkas (2007) emphasize the influence of organizational factors such as leadership, resource allocation, and training on the successful implementation of evidence-based models. In their article on how organizational contexts can support practitioners in being research-minded, McBeath and Austin (2015) also identify institutional, cultural, leadership, and professional development factors as key. They underscore the importance of framing research utilization within an organization as mission-relevant and as connected to growth, learning, and innovation.

With the goal of furthering the understanding of the role organizational culture and context play in research utilization, this article focuses on our efforts to integrate EBP into decision-making processes for program planning and evaluation. We describe the development of an agency-wide approach to promote the consistent consideration of research and data. Drawing from the research utilization, EBP, and implementation science literatures, our efforts entailed clarifying the meaning of EBP, demystifying terminology, and developing tools for incorporating research knowledge and models into program designs.

Organizational Context

Serving as the case study for this article, Good Shepherd Services (GSS) is a multi-service organization that strives to connect children, youth, and families living in under-resourced neighborhoods with opportunities for success. Based in New York City, GSS provides a network of youth and family development, education, and child welfare services.
In fiscal year 2016, GSS served 30,365 people across 87 programs, with a budget of $88.7 million and a staff of 703 full-time and 479 part-time employees.

The effort to define an agency-wide EBP approach began in 2011 with a confluence of internal and external factors that increased staff focus on data and research. Internally, GSS had already embraced the identity of a Learning Organization, with a commitment to critical reflection, knowledge sharing, and ongoing improvement from a strengths-based perspective (Senge, 1990). …
[1] W. C. Barrett. Solution-Based Casework, 2020.
[2] B. McBeath, et al. The Organizational Context of Research-Minded Practitioners, 2015.
[3] Stephen A. Webb, et al. What Supports and Impedes Evidence-Based Practice Implementation? A Survey of Australian Social Workers, 2015.
[4] Gregory A. Aarons, et al. Assessing the Organizational Context for EBP Implementation: The Development and Validity Testing of the Implementation Climate Scale (ICS), 2014, Implementation Science.
[5] Abraham Wandersman, et al. Toward an Evidence-Based System for Innovation Support for Implementing Innovations with Quality: Tools, Training, Technical Assistance, and Quality Assurance/Quality Improvement, 2012, American Journal of Community Psychology.
[6] V. Tseng. The Uses of Research in Policy and Practice and commentaries, 2012.
[7] B. Thyer, et al. The Quest for Evidence-Based Practice: A View from the United States, 2011.
[8] Ronald W. Thompson, et al. The Transition Status of Youth Departing Residential Care, 2010.
[9] Bonnie Spring, et al. Toward a Transdisciplinary Model of Evidence-Based Practice, 2009, The Milbank Quarterly.
[10] M. Austin, et al. Evidence-Based Practice in the Social Services, 2006, Journal of Evidence-Based Social Work.
[11] D. DeGarmo, et al. Early Development of Delinquency within Divorced Families: Evaluating a Randomized Preventive Intervention Trial, 2005, Developmental Science.
[12] Abraham Wandersman, et al. Getting to Outcomes: A Results-Based Approach to Accountability, 2000.
[13] D. Sackett, et al. Evidence Based Medicine: What It Is and What It Isn't, 1996, BMJ.
[14] P. M. Senge. The Fifth Discipline: The Art & Practice of the Learning Organization, 1990.
[15] S. Bloom. Creating Sanctuary: Healing from Systematic Abuses of Power, 2003.