Research on Social Work Practice


There is a growing movement in social work toward a more empirical selection of therapies and interventions because, to be effective, you have to know what works. As the community of practitioners, scholars and students interested in applying scientific methods of analysis to social work problems continues to grow, the need for a publication dedicated to social work practice outcomes has never been greater. Research on Social Work Practice is the first professional social work journal to focus on evaluation research and on validating methods of assessment in social work practice.

Vital Information

Research on Social Work Practice is a disciplinary journal devoted to the publication of empirical research concerning the assessment methods and outcomes of social work practice. Social work practice is broadly interpreted to refer to the application of intentionally designed social work intervention programs to problems of societal or interpersonal importance. Interventions include behavior analysis and therapy; psychotherapy or counseling with individuals; case management; education; supervision; practice involving couples, families, or small groups; advocacy; community practice; organizational management; and the evaluation of social policies.

The journal primarily serves as an outlet for the publication of:

  • Original reports of evidence-based evaluation studies on the outcomes of social work practice.
  • Original reports of empirical studies on the development and validation of social work assessment methods.
  • Original evidence-based reviews of the practice-research literature that convey direct applications (not simply implications) to social work practice.  The two types of review articles considered for publication are: 1) reviews of the evidence-based status of a particular psychosocial intervention; and 2) reviews of evidence-based interventions applicable to a particular psychosocial problem.

Comprehensive Coverage

Each issue of Research on Social Work Practice brings you the latest scholarship to help bridge the gap between research and practice. Regular features include:

  • Outcome Studies
  • New Methods of Assessment
  • Scholarly Reviews
  • Invited Essays
  • Book Reviews

In-Depth Special Issues

Research on Social Work Practice frequently supplements its broad coverage with in-depth studies of topics of particular concern through Special Issues or Special Sections. Previous examples include:

  • Research on Social Work Practice in Chinese Communities (Vol.12, n.4)
  • Honoring Walter W. Hudson (Vol.12, n.1)
  • Flexner Revisited (Vol.11, n.2)
  • Research on Social Work Practice in Ireland (Vol.10, n.6)
  • Technology and Social Work (Vol.10, n.4)
  • Australian Social Work Research (Vol.10, n.2)

By connecting practice and research in an artful and readable fashion, RSWP has provided a synergy for the helping professions — the vital recognition that without research, practice is blind; and without practice, research is mute.
— Martin Bloom, Professor, School of Social Work, University of Connecticut

In the relatively few years since its inception, Research on Social Work Practice has become one of the most highly respected and frequently cited journals in our field. Researchers, practitioners, and students have all found its contents to be invaluable in their work.
— Dianne Harrison Montgomery, Dean and Professor, School of Social Work, Florida State University

The unique manner in which the editors cover the broad spectrum of research on social work practice is destined to make the journal a classic in the field. This is must reading for all engaged in any level of practice research.
— Moses Newsome, Jr., Dean, School of Social Work, Norfolk State University; Past-President, Council on Social Work Education

This journal is a member of the Committee on Publication Ethics (COPE).

Research on Social Work Practice, sponsored by the Society for Social Work and Research, is a disciplinary journal devoted to the publication of empirical research concerning the methods and outcomes of social work practice. Social work practice is broadly interpreted to refer to the application of intentionally designed social work intervention programs to problems of societal and/or interpersonal importance, including behavior analysis or psychotherapy involving individuals; case management; practice involving couples, families, and small groups; community practice; education; and the development, implementation, and evaluation of social policies.

Editorial Board

Florida State University, USA
Hong Kong Baptist University, Hong Kong
Southern Connecticut State University, USA
Wayne State University, USA
Keimyung University, The Republic of Korea
Hunter College, USA
University of North Carolina at Chapel Hill, USA
Florida State University, USA
Temple University, USA
University of Hong Kong
California State University - San Bernardino, USA
East China University of Science & Technology, China
Florida State University, USA
The University of North Carolina at Greensboro, USA
Dartmouth College, USA
University of Scranton, USA
University of Alabama, USA
Utica College, USA
Assiut University, Egypt
Abilene Christian University, USA
University of Wolverhampton, UK
University at Fredonia, SUNY, USA
Wright State University, USA
Virginia Commonwealth University, USA
University of South Dakota, USA
Howard University, USA
University of Texas at Arlington, USA
Icahn School of Medicine at Mount Sinai, USA
University of Edinburgh, UK
Arizona State University, USA
University of Utah, USA
University of Cincinnati, USA
University of Greenwich, UK
Monash University, Australia
University of New South Wales, Australia
University of Birmingham, UK
The Chinese University of Hong Kong
Texas State University, USA
Hong Kong Polytechnic University, Hong Kong
Troy University at Dothan, USA
Columbia University, USA
University of Alabama, USA
University of Connecticut, USA
North Carolina Central University, USA
University of North Carolina at Chapel Hill, USA
Beijing Normal University, China
Hong Kong Baptist University, Hong Kong
Abstracting / Indexing

  • Applied Social Sciences Index & Abstracts (ASSIA)
  • Asia Pacific Database
  • Central Asia: Abstracts & Index
  • Clarivate Analytics: Current Contents - Physical, Chemical & Earth Sciences
  • Corporate ResourceNET - Ebsco
  • Current Citations Express
  • EBSCO: Vocational & Career Collection
  • MasterFILE - Ebsco
  • Middle East: Abstracts & Index
  • North Africa: Abstracts & Index
  • OmniFile: Full Text Mega Edition (H.W. Wilson)
  • ProQuest: CSA Sociological Abstracts
  • Psychological Abstracts
  • Social Care Online
  • Social SciSearch
  • Social Sciences Citation Index (Web of Science)
  • Social Services Abstracts
  • Social Work Abstracts
  • Southeast Asia: Abstracts & Index
  • Standard Periodical Directory (SPD)
  • TOPICsearch - Ebsco
  • Wilson Social Sciences Index Retrospective

Guidelines for Authors

Research on Social Work Practice (RSWP) is a peer-reviewed disciplinary journal devoted to the publication of empirical research concerning the outcomes of social work practice. Social work practice is broadly interpreted to refer to the application of intentionally designed social work intervention programs to problems of societal and/or interpersonal importance. Interventions include, but are not limited to, behavior analysis and therapy, psychotherapy or counseling with individuals, cognitive therapy, case management/care coordination, education, supervision, practice involving couples, families, or small groups, advocacy, community practice, organizational management, and the evaluation of social policies. At least one author of a submitted article must be a professional social worker, and/or the interventions evaluated must have been provided by professional social workers.

The journal will primarily serve as an outlet for the publication of:

1. Original reports of empirically-based evaluation studies on the outcomes of social work practice;

2. Systematic reviews or meta-analyses of the practice-research literature that convey direct applications (not simply implications) to social work practice. The only two types of systematic reviews considered for publication are:

A. Systematic reviews of the evidence-based status of a particular psychosocial intervention or assessment method, or

B. Systematic reviews of different psychosocial interventions applicable to clients with a particular psychosocial problem.

The journal welcomes empirical research appropriately derived from a variety of etiological and intervention theories, as well as studies that focus on evaluations not based upon formal theoretical frameworks. Studies using diverse methodologies, such as group or single-system research designs, qualitative approaches, and mixed-methods approaches, are welcome, as are interdisciplinary works. Replication studies are welcome, as are well-designed studies with negative findings or reports of treatment failures. Authors are encouraged to submit only articles of the highest quality for editorial review and possible publication; the submission of seriously flawed or marginal studies is discouraged. Reports of inferential statistics involving significant differences must be accompanied by suitable measures of effect size and their appropriate confidence intervals, and must include a discussion of the practical impact indicated by these effects.

Articles reporting original research involving data collection from human beings must include a statement indicating the source of Institutional Review Board Approval (blinded in the original submission) or a clear statement addressing why IRB review was not necessary.

Manuscripts which do not fit into one of the above two categories should not be submitted, and if received will be promptly returned to the author un-reviewed. Occasionally other types of submissions are published in the journal (e.g., guest editorials, conference proceedings, research center descriptions), but these are usually invited and accepted at the discretion of the Editor.

Inappropriate Submissions: The journal does not usually publish narrative case studies, surveys, program descriptions, theoretical, philosophical, or conceptual works, correlational investigations, historical reviews, retrospective predictor studies, purely methodological articles, descriptive studies, or needs assessments. The journal no longer accepts for review psychometric studies, that is, reports of the development and validation testing of measurement methods useful for research or practice. Authors are urged to submit such studies to the many other social work journals that do not have the intervention-research focus of Research on Social Work Practice. The journal publishes occasional special issues devoted to a particular topic; readers interested in proposing a topic for a special issue and serving as its Guest Editor are welcome to contact the Editor.

Authors are encouraged to make pre-publication use of a data repository (http://www.nature.com/sdata/policies/repositories#general) to ensure post-publication access to their data, and to indicate this in the submitted manuscript. At a minimum, reports of original data-based research should include a statement indicating where qualified researchers may obtain a copy of the data and the data-coding manual (usually from the corresponding author). This stipulation is intended to encourage transparency in the reporting process and to promote re-analysis and replication efforts by independent scholars.

Authors whose native language is not English are encouraged to have their submission carefully edited by English-language experts prior to submission. Sage Publications Inc. offers such a service at: http://languageservices.sagepub.com/en/

Authors not familiar with current APA style are encouraged to review the free online style guides provided by the American Psychological Association at http://www.apastyle.org/. Submissions out of compliance with APA style will be returned un-reviewed.

As part of our commitment to ensuring an ethical, transparent, and fair peer review process, Sage is a supporting member of ORCID, the Open Researcher and Contributor ID.

ORCID provides a unique and persistent digital identifier that distinguishes each researcher from every other researcher, even those who share the same name. Through integration in key research workflows such as manuscript and grant submission, it supports automated linkages between researchers and their professional activities, ensuring that their work is recognized.

We encourage all authors and co-authors to link their ORCIDs to their accounts in our online peer review platforms. It takes seconds to do: click the link when prompted, sign into your ORCID account and our systems are automatically updated. We collect ORCID iDs during the manuscript submission process and your ORCID iD then becomes part of your accepted publication’s metadata, making your work attributable to you and only you. Your ORCID iD is published with your article so that fellow researchers reading your work can link to your ORCID profile and from there link to your other publications.

If you do not already have an ORCID iD please follow this link to create one or visit our ORCID homepage to learn more.

Research on Social Work Practice (RSWP) may accept submissions of papers that have been posted on preprint servers; please alert the Editorial Office when submitting and include the DOI for the preprint in the designated field in the manuscript submission system. Authors should not post an updated version of their paper on the preprint server while it is being peer reviewed for possible publication in the journal. If the article is accepted for publication, the author may re-use their work according to the journal's author archiving policy.

If your paper is accepted, you must include a link on your preprint to the final version of your paper.

Visit the Sage Journals and Preprints page for more details about preprints.

Guidelines for Preparing Quantitative Outcome Studies

The journal requires that accepted quantitative manuscripts be formatted in compliance with the Journal Article Reporting Standards (JARS) found in the sixth edition of the APA Publication Manual. Note that apart from the general guidelines, there are separate additional guidelines for reporting quasi-experimental and experimental studies, as well as for meta-analyses. There are also guidelines for reporting a study participant flow chart, which should be included in nomothetic outcome studies. Mixed-methods papers that include quantitative analyses should have those elements of the article compliant with these guidelines. Causal inferences, if any, should be made conservatively and should not go beyond the limits imposed by the presented methods and data.

Single-case research studies, which build upon traditional case narrative reports by adding the systematic and empirical measurement of clinically relevant variables (e.g., a client's problems or strengths) before, during, and after treatment, are welcome submissions. Outcome measures must have acceptable levels of reliability and validity, the intervention must be well described, and any causal inferences drawn must not go beyond those legitimately derived from the data. Data must be presented in the form of line graphs. The guidelines by Kratochwill et al. (2010) are recommended in this regard.

Articles reporting the results of a quasi-experimental outcome study must follow the standards found in the Transparent Reporting of Evaluation Studies using Nonrandomized Designs (TREND) checklist. Include a completed TREND Checklist as an appendix to your paper. See http://www.cdc.gov/trendstatement/ .

Articles reporting a randomized controlled trial must follow the Consolidated Standards of Reporting Trials (CONSORT) and include a completed CONSORT Checklist. See http://www.consort-statement.org/consort-statement/. Authors of outcome studies evaluating non-pharmacological interventions (e.g., psychosocial treatments) are urged to familiarize themselves with the relevant guidelines for reporting such studies. Grant et al. (2013) is a recommended resource, as is Boutron, Ravaud, and Moher (2012).

Authors submitting a randomized clinical trial (RCT) or quasi-experimental outcome study for review and publication are strongly encouraged to have pre-registered their study protocol in a suitable clinical trials registry, such as clinicaltrials.gov. The article by Harrison and Mayo-Wilson (2014) provides guidance regarding the rationale for and process of pre-registering a protocol. The submitted article should include a statement citing the clinical trials registry to which the protocol was submitted.

Guidelines for Preparing Systematic Reviews and Meta-Analyses

RSWP welcomes well-crafted, empirically based reviews of the treatment literature. Such manuscripts should present the evidence regarding a particular psychosocial intervention, the evidence for various interventions for a particular psychosocial problem, or a critical review of treatment studies focused on a particular disorder, problem, or condition. Review articles should have a clear social work focus and cite the relevant social work literature, if any exists, in addition to pertinent findings from the broader behavioral and social sciences. Manuscripts of this type should provide the reader with clear and compelling applications to practice, not untested implications.

Articles claiming to be a Systematic Review should adhere to the guidelines for preparing systematic reviews developed by the Cochrane Collaboration (Higgins & Green, 2009) or the Campbell Collaboration (2014). In addition, the authors of systematic reviews and meta-analyses must follow the guidelines found in the PRISMA Statement (Preferred Reporting Items for Systematic Reviews and Meta-Analyses), found at: http://www.prisma-statement.org/.

If the article does not follow these standards, the paper should be titled as a Narrative Review, or simply a Review, and the specific term Systematic Review should be avoided.

Authors submitting a systematic review for review and publication are strongly encouraged to have pre-registered the review protocol in a suitable registry, such as PROSPERO (www.crd.york.ac.uk/PROSPERO). The article by Stewart, Moher, and Shekelle (2012) provides guidance regarding the rationale for and process of pre-registering systematic review protocols. The submitted article should include a statement citing the registry in which the protocol is published.

The EQUATOR Network (Enhancing the QUAlity and Transparency of Health Research) is a recommended resource for authors preparing studies for submission to RSWP which deal with the general topic of health care. See http://www.equator-network.org/ .

Completed copies of relevant TREND, CONSORT or PRISMA checklists should be included as a separate supplemental file when submitting the manuscript online.

Guidelines for Preparing Qualitative Studies

RSWP welcomes well-written, rigorous qualitative outcome studies. Studies of the processes of an intervention, absent credible evidence that the intervention actually produces positive effects, are not invited for submission. Authors are encouraged to judiciously take advantage of the journal's lack of a page limitation and craft a manuscript that details the context and methods with full transparency. The qualitative methodology used must be consistent throughout the study, and the sampling, data collection, and analysis should make sense in light of the chosen research question and method. Authors should describe the strategies employed to ensure the trustworthiness and credibility of the study, and provide a replicable audit trail. Qualitative data analysis software may appropriately be used in the analysis, but is not required. For suggestions on creating a well-written qualitative article, consult Fawcett et al. (2014), Staller and Krumer-Nevo (2013), and Pratt (2009).

How to submit a manuscript: The journal requires authors to use the MANUSCRIPT CENTRAL web-based portal to submit their manuscripts. The submission portal is available via http://mc.manuscriptcentral.com/rswp

Use of the Journal Article Reporting Standards: All submissions must be prepared using the formatting standards found in the 6th edition (2010) of the APA Publication Manual. Authors of data-based papers are specifically asked to adhere to the relevant Journal Article Reporting Standards (JARS). The Editor is available to consult with you about any questions regarding compliance with these standards. They have been adopted to promote consistency in research reporting, to further elevate the standards of work appearing in Research on Social Work Practice, and ultimately to improve the credibility of research findings available to the profession and the public.

The abstracts of research articles must include the following headings: Purpose, Methods, Results, and Conclusions. Manuscripts not adhering to current APA style conventions will be returned to the authors un-reviewed, with a request to revise and resubmit. A very common error is to inappropriately include the issue number after the volume number in citations to articles appearing in journals paginated by year; see the APA manual if you are not sure when issue numbers should and should not be included. Some bibliographic software programs automatically include issue numbers, and these should be manually deleted if necessary.

All manuscripts should include an abstract on a separate page containing no more than 150 words, and a separate title page (designated as Title Page) that includes: 1) the title of the article; 2) the corresponding author's full name, current position, affiliation, institutional and email addresses, and telephone and fax numbers; 3) each co-author's full name and affiliation; and 4) up to five key words as they should appear if published. Manuscripts will not be considered for submission if they do not include these elements. Tables and/or figures are to be included when necessary to depict the results. There is no specific limit on the total number of pages, tables, or figures.

Authors submitting manuscripts are protected by common law against the unauthorized use of their unpublished work. Specifically, an unpublished manuscript is considered to be a confidential or privileged paper. All reviewers will be asked to destroy or return the manuscript after their review is completed; in addition, reviewers will be asked not to circulate, quote, cite, or refer to the unpublished work in any way unless specific permission is granted by the author.

Artwork Submissions

High-resolution figures should be uploaded as separate electronic files, with callouts for each in the text. Figure legends should include full explanations of the figures and be typewritten double-spaced, with numbers corresponding to those on the figure files themselves. All figures must be specifically referred to in the text and numbered in order of appearance. Acceptable file formats for figures include TIFF, EPS, and JPEG; PDF and Microsoft application files are acceptable for vector art (line art). Permission for use of copyrighted material is the responsibility of the author. All artwork must be camera-ready.

Tables should be numbered consecutively corresponding to in-text citation. Each table should be prepared on a separate page at the end of the text document and preferably should be no larger than a single page. Include a brief descriptive title of the table and a footnote with explanation of any abbreviations. All tables must be specifically referred to in the text for placement and numbered in order of appearance in the text. Elements in tables should be separated by tabs, not cells or lines.

Conflict of Interest

Authors are required to disclose any commercial, financial, or other associations that could pose a conflict of interest in connection with their submitted article and these must be disclosed on the title page at the time of submission.

Financial Disclosure/Funding

Authors should list all funding sources (and ID numbers, as appropriate) related to the study and to the article preparation.

Once a manuscript is accepted for publication, the corresponding author will be required to complete an electronic copyright transfer form. From the SageTRACK website's "Corresponding Author Center," choose the correct manuscript under "Manuscripts with Decisions," and from the ACTION box on the far right, choose "Contributor Form." After reading the form and completing the appropriate boxes, clicking "I accept" will confirm the copyright transfer.

Authors are required to submit written permission from the original publisher to reprint copyright-protected material, including quoted material of 300 words or more from a single source (journal article or book).

Submission of a manuscript implies commitment to publish in this journal. Authors submitting manuscripts to the journal must not simultaneously submit them to another journal, nor should manuscripts have been published elsewhere in substantially similar content. All authors of a submitted manuscript must be made aware of and consent to the submission.

Publish Ahead of Print With OnlineFirst

OnlineFirst is a feature in which completed articles are published online prior to their inclusion in a print issue, offering authors the advantage of making their research accessible to the public in a more timely manner. Only online subscribers can view these PDFs, but abstracts are available to the public to view for free. Each OnlineFirst manuscript is citable by the publication date of the manuscript’s first online posting and the Digital Object Identifier (DOI), providing a persistent, permanent way to identify manuscripts published in the online environment. You can cite OnlineFirst articles as follows:

Author's last name, first initials. Article title. Journal title. Prepublished month day, year. DOI: 10.1177/0123456789123456

Once your article has completed the production process and before it is published in a print issue, it will be posted online. You can access RSWP OnlineFirst articles on the Web at http://rswp.sagepub.com/pap.dtl . Once posted online, articles may not be retracted or edited. If your article is not completed prior to its publication date, it will not go on OnlineFirst but will be posted online with the issue in which it is published.

The journal uses a blind peer review system to evaluate manuscripts, and the expertise of the Editorial Board members is augmented by the extensive use of guest reviewers. Most authors receive an initial editorial decision within two months of submission, accompanied by constructive peer commentary. Most articles eventually accepted for publication undergo extensive author-completed revisions, based on peer-review commentary, prior to acceptance. The journal has a modest backlog of accepted manuscripts, so authors of accepted manuscripts can expect a lag of about 12 months or less from final acceptance to print publication. However, through the journal's publish-ahead-of-print service, the final, corrected, and accepted version of a paper is published electronically on the journal's website with a DOI. This permits ready access by the community of scholars, students, and practitioners months ahead of print publication. These articles are both citable and downloadable. Articles are published in the general order of their acceptance.

Boutron, I., Ravaud, P. & Moher, D. (2012). Randomized clinical trials of nonpharmacological treatments. New York: CRC Press.

Campbell Collaboration. (2014). Campbell Collaboration systematic review: Policies and guidelines. The Campbell Collaboration. Available from www.campbellcollaboration.org

Fawcett, S. E., Waller, M. A., Miller, J. W., Schwieterman, M. A., Hazen, B. T., & Overstreet, R. E. (2014). A trail guide to publishing success: Tips on writing influential conceptual, qualitative, and survey research. Journal of Business Logistics, 35(1), 1-16.

Grant, S., Montgomery, P., Hopewell, S., Macdonald, G., Moher, D., & Mayo-Wilson, E. (2013). Developing a reporting guideline for social and psychological intervention trials. Research on Social Work Practice, 23, 595-602.

Harrison, B. A., & Mayo-Wilson, E. (2014). Trial registration: Understanding and preventing bias in social work research. Research on Social Work Practice, 24, 372-376.

Kratochwill, T. R., Hitchcock, J., Horner, R. H., Levin, J. R., Odom, S. L., Rindskopf, D. M., & Shadish, W. R. (2010). Single-case designs technical documentation. Retrieved from What Works Clearinghouse website: http://ies.ed.gov/ncee/wwc/Document/229

Pratt, M. G. (2009). From the editors: For the lack of a boilerplate: Tips on writing up (and reviewing) qualitative research. Academy of Management Journal , 52 , 856-862.

Staller, K. M., & Krumer-Nevo, M. (2013). Successful qualitative articles: A tentative list of cautionary advice. Qualitative Social Work , 12 , 247-253.

Stewart, L., Moher, D., & Shekelle, P. (2012). Why prospective registration of systematic reviews makes sense. Systematic Reviews, 1:7. doi:10.1186/2046-4053-1-7


The Pursuit of Quality for Social Work Practice: Three Generations and Counting

Enola Proctor

Shanti K. Khinduka Distinguished Professor and Director of the Center for Mental Health Services Research at Washington University in St. Louis

Social work addresses some of the most complex and intractable human and social problems: poverty, mental illness, addiction, homelessness, and child abuse. Our field may be distinct among professions for its efforts to ameliorate the toughest societal problems, experienced by society’s most vulnerable, while working from under-resourced institutions and settings. Members of our profession are underpaid, and most of our agencies lack the data infrastructure required for rigorous assessment and evaluation.

Moreover, social work confronts these challenges while being ethically bound to deliver high-quality services. Policy and regulatory requirements increasingly demand that social work deliver and document the effectiveness of the highest quality interventions and restrict reimbursement to those services that are documented as evidence based. Social work’s future, its very survival, depends on our ability to deliver services with a solid base of evidence and to document their effectiveness. In the words of the American Academy of Social Work and Social Welfare ( AASWSW, n.d. ), social work seeks to “champion social progress powered by science.” The research community needs to support practice through innovative and rigorous science that advances the evidence for interventions to address social work’s grand challenges.

My work seeks to improve the quality of social work practice by pursuing answers to three questions:

  • What interventions and services are most effective and thus should be delivered in social work practice?
  • How do we measure the impact of those interventions and services? (That is, what outcomes do our interventions achieve?)
  • How do we implement the highest quality interventions?

This paper describes this work, demonstrates the substantive and methodological progression across the three questions, assesses what we have learned, and forecasts a research agenda for what we still need to learn. Given Aaron Rosen’s role as my PhD mentor and our many years of collaboration, the paper also addresses the role of research mentoring in advancing our profession’s knowledge base.

What Interventions and Services Are Most Effective?

Answering the question “What services are effective?” requires rigorous testing of clearly specified interventions. The first paper I coauthored with Aaron Rosen—“Specifying the Treatment Process: The Basis for Effectiveness Research” ( Rosen & Proctor, 1978 )—provided a framework for evaluating intervention effectiveness. At that time, process and outcomes were jumbled and intertwined concepts. Social work interventions were rarely specified beyond theoretical orientation or level of focus: casework (or direct practice); group work; and macro practice, which included community, agency-level, and policy-focused practice. Moreover, interventions were not named, nor were their components clearly identified. We recognized that gross descriptions of interventions obstruct professional training, preclude fidelity assessment, and prevent accurate tests of effectiveness. Thus, in a series of papers, Rosen and I advocated that social work interventions be specified, clearly labeled, and operationally defined, measured, and tested.

Specifying Interventions

Such specification of interventions is essential to two professional responsibilities: professional education and demonstrating the effectiveness of the field’s interventions. Without specification, interventions cannot be taught. Social work education is all about equipping students with skills to deliver interventions, programs, services, administrative practices, and policies. Teaching interventions requires the ability to name and define them, see them in action, measure their presence (or absence), assess the fidelity with which they are delivered, and give students feedback on how to increase or refine the associated skills.

To advance testing the effectiveness of social work interventions, we drew distinctions between interventions and outcomes and proposed these two constructs as the foci for effectiveness research. We defined interventions as practitioner behaviors that can be volitionally manipulated by practitioners (used or not, varied in intensity and timing), that are defined in detail, can be reliably measured, and can be linked to specific identified outcomes ( Rosen & Proctor, 1978 ; Rosen & Proctor, 1981 ). This definition foreshadowed the development of treatment manuals, lists of specific evidence-based practices, and calls for monitoring intervention fidelity. Recognizing the variety of intervention types, and to advance their more precise definition and measurement, we proposed that interventions be distinguished in terms of their complexity. Interventive responses comprise discrete or single responses, such as affirmation, expression of empathy, or positive reinforcement. Interventive strategies comprise several different actions that are, together, linked to a designated outcome, such as motivational interviewing. Most complex are interventive programs, which organize and integrate a variety of intervention actions as a total treatment package; collaborative care for depression and assertive community treatment are examples. To strengthen the professional knowledge base, we also called for social work effectiveness research to begin testing the optimal dose and sequencing of intervention components in relation to attainment of desired outcomes.

Advancing Intervention Effectiveness Research

Our “specifying paper” also was motivated by the paucity of literature at that time on actual social work interventions. Our literature review of 13 major social work journals over 5 years of published research revealed that only 15% of published social work research addressed interventions. About a third of studies described social problems, and about half explored factors associated with the problem ( Rosen, Proctor, & Staudt, 2003 ). Most troubling was our finding that only 3% of articles described the intervention or its components in sufficient detail for replication in either research or practice. Later, Fraser (2004) found intervention research to comprise only about one fourth of empirical studies in social work. Fortunately, our situation has improved. Intervention research is more frequent in social work publications, thanks largely to the publication policies of the Journal of the Society for Social Work and Research and Research on Social Work Practice .

Research Priorities

Social work faces important and formidable challenges as it advances research on intervention effectiveness. The practitioner who searches the literature or various intervention lists can find more than 500 named practices, or practices shown in rigorous trials to have evidence sufficient to qualify as evidence-based practices. However, our profession still lacks any organized compendium or taxonomy of interventions that are employed in or found to be effective for social work practice. Existing lists of evidence-based practices, although necessary, are insufficient for social work for several reasons. First, as a 2015 National Academies Institute of Medicine (IOM) report—“Psychosocial Interventions for Mental and Substance Use Disorders: A Framework for Establishing Evidence-Based Standards” ( IOM, 2015 )—concluded, too few evidence-based practices have been found to be appropriate for low-resource settings or acceptable to minority groups. Second, existing interventions do not adequately reflect the breadth of social work practice. We have too few evidence-based interventions that can inform effective community organization, case management, referral practice, resource development, administrative practice, or policy. Noting that there is far less literature on evidence-based practices relevant to organizational, community, and policy practice, a social work task force responding to the 2015 IOM report recommended that this gap be a target of our educational and research efforts ( National Task Force on Evidence-Based Practice in Social Work, 2016 ). And finally, our field—along with other professions that deliver psychosocial interventions—lacks the kinds of procedure codes that can identify the specific interventions we deliver. Documenting social work activities in agency records is increasingly essential for quality assurance and third-party reimbursement.

Future Directions: Research to Advance Evidence on Interventions

Social work has critically important research needs. Our field needs to advance the evidence base on what interventions work for social work populations, practices, and settings. Responding to the 2015 IOM report, the National Task Force on Evidence-Based Practice in Social Work (2016) identified as a social work priority the development and testing of evidence-based practices relevant to organizational, community, and policy practice. As we advance our intervention effectiveness research, we must respond to the challenge of determining the key mechanisms of change ( National Institute of Mental Health, 2016 ) and identify key modifiable components of packaged interventions ( Rosen & Proctor, 1978 ). We need to explore the optimal dosage, ordering, or adapted bundling of intervention elements and advance robust, feasible ways to measure and increase fidelity ( Jaccard, 2016 ). We also need to conduct research on which interventions are most appropriate, acceptable, and effective with various client groups ( Zayas, 2003 ; Videka, 2003 ).

Documenting the Impact of Interventions: Specifying and Measuring Outcomes

Outcomes are key to documenting the impact of social work interventions. My 1978 “specifying” paper with Rosen emphasized that the effectiveness of social work practice could not be adequately evaluated without clear specification and measurement of various types of outcomes. In that paper, we argued that the profession cannot rely only on an assertion of effectiveness. The field must also calibrate, calculate, and communicate its impact.

The nursing profession’s highly successful campaign, based on outcomes research, positioned that field to claim that “nurses save lives.” Nurse staffing ratios were associated with in-hospital and 30-day mortality, independent of patient characteristics, hospital characteristics, or medical treatment ( Person et al., 2004 ). In contrast, social work has often described—sometimes advertised—itself as the low-cost profession. The claim of “cheapest service” may have some strategic advantage in turf competition with other professions. But social work can do better. Our research base can and should demonstrate the value of our work by naming and quantifying the outcomes—the added value of social work interventions.

As a start to this work—a beginning step in compiling evidence about the impact of social work interventions—our team set out to identify the outcomes associated with social work practice. We felt that identifying and naming outcomes is essential for conveying what social work is about. Moreover, outcomes should serve as the focus for evaluating the effectiveness of social work interventions.

We produced two taxonomies of outcomes reflected in published evaluations of social work interventions ( Proctor, Rosen, & Rhee, 2002 ; Rosen, Proctor, & Staudt, 2003 ). They included such outcomes as change in clients’ social functioning, resource procurement, problem or symptom reduction, and safety. They exemplify the importance of naming and measuring what our profession can contribute to society. Although social work’s growing body of effectiveness research typically reports outcomes of the interventions being tested, the literature has not, in the intervening 20 years, addressed the collective set of outcomes for our field.

Fortunately, the Grand Challenges for Social Work (AASWSW, n.d.) now provide a framework for communicating social work’s goals. They reflect social work’s added value: improving individual and family well-being, strengthening social fabric, and helping to create a more just society. The Grand Challenges for Social Work include ensuring healthy development for all youth, closing the health gap, stopping family violence, advancing long and productive lives, eradicating social isolation, ending homelessness, creating social responses to a changing environment, harnessing technology for social good, promoting smart decarceration, reducing extreme economic inequality, building financial capability for all, and achieving equal opportunity and justice ( AASWSW, n.d. ).

These important goals appropriately reflect much of what we are all about in social work, and our entire field has been galvanized—energized by the power of these grand challenges. However, the grand challenges require setting specific benchmarks—targets that reflect how far our professional actions can expect to take us, or in some areas, how far we have come in meeting the challenge.

For the past decade, care delivery systems and payment reforms have required measures for tracking performance. Quality measures have become critical tools for all service providers and organizations ( IOM, 2015 ). The IOM defines quality of care as “the degree to which … services for individuals and populations increase the likelihood of desired … outcomes and are consistent with current professional knowledge” ( Lohr, 1990 , p. 21). Quality measures are important at multiple levels of service delivery: at the client level, at the practitioner level, at the organization level, and at the policy level. The National Quality Forum has established five criteria for quality measures: They should address (a) the most important, (b) the most scientifically valid, (c) the most feasible or least burdensome, (d) the most usable, and (e) the most harmonious set of measures ( IOM, 2015 ). Quality measures have been advanced by accrediting groups (e.g., the Joint Commission and the National Committee for Quality Assurance), professional societies, and federal agencies, including the U.S. Department of Health and Human Services. However, quality measures are lacking for key areas of social work practice, including mental health and substance-use treatment. Of the 55 nationally endorsed measures related to mental health and substance use, only two address a psychosocial intervention. Measures used for accreditation and certification purposes often reflect structural capabilities of organizations and their resource use, not the infrastructure required to deliver high-quality services ( IOM, 2015 ). I am not aware of any quality measure developed by our own professional societies or agreed upon across our field.

Future Directions: Research on Quality Monitoring and Measure Development

Although social work as a field lacks a strong tradition of measuring and assessing quality ( Megivern et al., 2007 ; McMillen et al., 2005 ; Proctor, Powell, & McMillen, 2012 ), social work’s role in the quality workforce is becoming better understood ( McMillen & Raffol, 2016 ). The small number of established and endorsed quality measures reflects both limitations in the evidence for effective interventions and challenges in obtaining the detailed information necessary to support quality measurement ( IOM, 2015 ). According to the National Task Force on Evidence-Based Practice in Social Work (2016) , developing quality measures to capture use of evidence-based interventions is essential for the survival of social work practice in many settings. The task force recommends that social work organizations develop relevant and viable quality measures and that social workers actively influence the implementation of quality measures in their practice settings.

How to Implement Evidence-Based Care

A third and more recent focus of my work addresses this question: How do we implement evidence-based care in agencies and communities? Despite our progress in developing proven interventions, most clients—whether served by social workers or other providers—do not receive evidence-based care. A growing number of studies are assessing the extent to which clients—in specific settings or communities—receive evidence-based interventions. Kohl, Schurer, and Bellamy (2009) examined quality in a core area of social work: training for parents at risk for child maltreatment. The team examined the parent services and their level of empirical support in community agencies, staffed largely by master’s-level social workers. Of 35 identified treatment programs offered to families, only 11% were “well-established empirically supported interventions,” with another 20% containing some hallmarks of empirically supported interventions ( Kohl et al., 2009 ). This study reveals a sizable implementation gap, with most of the programs delivered lacking scientific validation.

Similar quality gaps are apparent in other settings where social workers deliver services. Studies show that only 19.3% of school mental health professionals and 36.8% of community mental health professionals working in Virginia’s schools and community mental health centers report using any evidence-based substance-abuse prevention programs ( Evans, Koch, Brady, Meszaros, & Sadler, 2013 ). In mental health, where social workers have long delivered the bulk of services, only 40% to 50% of people with mental disorders receive any treatment ( Kessler, Chiu, Demler, Merikangas, & Walters, 2005 ; Merikangas et al., 2011 ), and of those receiving treatment, a fraction receive what could be considered “quality” treatment ( Wang, Demler, & Kessler, 2002 ; Wang et al., 2005 ). These and other studies indicate that, despite progress in developing proven interventions, most clients do not receive evidence-based care. In light of the growth of evidence-based practice, this fact is troubling evidence that testing interventions and publishing the findings is not sufficient to improve quality.

So, how do we get these interventions in place? What is needed to enable social workers to deliver, and clients to receive, high-quality care? In addition to developing and testing evidence-based interventions, what else is needed to improve the quality of social work practice? My work has focused on advancing quality of services through two paths.

Making Effective Interventions Accessible to Providers: Intervention Reviews and Taxonomies

First, we have advocated that research evidence be synthesized and made available to front-line practitioners. In a research-active field where new knowledge is constantly produced, practitioners should not be expected to rely on journal publications alone for information about effective approaches to achieve desired outcomes. Mastering a rapidly expanding professional evidence base has been characterized as a nearly unachievable challenge for practitioners ( Greenfield, 2017 ). Reviews should critique and clarify the intervention’s effectiveness as tested in specific settings, populations, and contexts, answering the question, “What works where, and with whom?” Even more valuable are studies of comparative effectiveness—those that answer, “Which intervention approach works better, where, and when?”

Taxonomies of clearly and consistently labeled interventions will enhance their accessibility and the usefulness of research reports and systematic reviews. A prerequisite is the consistent naming of interventions; a persistent challenge is the wide variation in names or labels for interventive procedures and programs. Our professional activities are the basis for our societal sanction, and they must be capable of being accurately labeled and documented if we are to describe what our profession “does” to advance social welfare. Increasingly, and in short order, that documentation will be in electronic records that are scrutinized by third parties for purposes of reimbursement and assessment of value toward outcome attainment.

How should intervention research and reviews be organized? Currently, several websites provide lists of evidence-based practices, some with links, citations, or information about dissemination and implementation organizations that provide training and facilitation to adopters. Practitioners and administrators find such lists helpful but often note the challenge of determining which are most appropriate for their needs. In the words of one agency leader, “The drug companies are great at presenting [intervention information] in a very easy form to use. We don’t have people coming and saying, ‘Ah, let me tell you about the best evidence-based practice for cognitive behavioral therapy for depression’” ( Proctor et al., 2007 , p. 483). We have called for the field to devise decision aids for practitioners to enhance access to the best available empirical knowledge about interventions ( Proctor et al., 2002 ; Proctor & Rosen, 2008 ; Rosen et al., 2003 ). We proposed that intervention taxonomies be organized around outcomes pursued in social work practice, and we developed such a taxonomy based on eight domains of outcomes—those most frequently tested in social work journals. Given the field’s progress in identifying its grand challenges, their associated outcomes could well serve as the organizing focus, with research-tested interventions listed for each challenge. Compiling the interventions, programs, and services that are shown—through research—to help achieve one of the challenges would surely advance our field.

We further urged profession-wide efforts to develop social work practice guidelines from intervention taxonomies ( Rosen et al., 2003 ). Practice guidelines are systematically compiled, critiqued, and organized statements about the effectiveness of interventions, arranged to help practitioners select and use the most effective and appropriate approaches for addressing client problems and pursuing desired outcomes.

At that time, we proposed that our published taxonomy of social work interventions could provide a beginning architecture for social work guidelines ( Rosen et al., 2003 ). In 2000, we organized a conference for thought leaders in social work practice. This talented group wrestled with and formulated recommendations for tackling the professional, research, and training requisites to developing social work practice guidelines that would enable practitioners to access and apply the best available knowledge about interventions ( Rosen et al., 2003 ). Fifteen years later, however, the need remains for social work to synthesize its intervention research. Psychology and psychiatry, along with most fields of medical practice, have developed practice guidelines. Although their acceptance and adherence are fraught with challenges, guidelines make evidence more accessible and enable quality monitoring. Yet, guidelines still do not exist for social work.

The 2015 IOM report, “Psychosocial Interventions for Mental and Substance Use Disorders: A Framework for Establishing Evidence-Based Standards,” includes a conclusion that information on the effectiveness of psychosocial interventions is not routinely available to service consumers, providers, and payers, nor is it synthesized. That 2015 IOM report called for systematic reviews to inform clinical guidelines for psychosocial interventions. This report defined psychosocial interventions broadly, encompassing “interpersonal or informational activities, techniques, or strategies that target biological, behavioral, cognitive, emotional, interpersonal, social, or environmental factors with the aim of reducing symptoms and improving functioning or well-being” ( IOM, 2015 , p. 5). These interventions are social work’s domain; they are delivered in the very settings where social workers dominate (behavioral health, schools, criminal justice, child welfare, and immigrant services); and they encompass populations across the entire lifespan within all sociodemographic groups and vulnerable populations. Accordingly, the National Task Force on Evidence-Based Practice in Social Work (2016) has recommended the conduct of more systematic reviews of the evidence supporting social work interventions.

If systematic reviews are to lead to guidelines for evidence-based psychosocial interventions, social work needs to be at the table, and social work research must provide the foundation. Whether social work develops its own guidelines or helps lead the development of profession-independent guidelines as recommended by the IOM committee, guidelines need to be detailed enough to guide practice. That is, they need to be accompanied by treatment manuals and informed by research that details the effect of moderator variables and contextual factors reflecting diverse clientele, social determinants of health, and setting resource challenges. The IOM report “Clinical Practice Guidelines We Can Trust” sets criteria for guideline development processes ( IOM, 2011 ). Moreover, social work systematic reviews of research and any associated evidence-based guidelines need to be organized around meaningful taxonomies.

Advancing the Science of Implementation

As a second path to ensuring the delivery of high-quality care, my research has focused on advancing the science of implementation. Implementation research seeks to inform how to deliver evidence-based interventions, programs, and policies in real-world settings so their benefits can be realized and sustained. The ultimate aim of implementation research is building a base of evidence about the most effective processes and strategies for improving service delivery. Implementation research builds upon effectiveness research and then seeks to discover how to use specific implementation strategies to move those interventions into specific settings, extending their availability, reach, and benefits to clients and communities. Accordingly, implementation strategies must address the challenges of the service system (e.g., specialty mental health, schools, criminal justice system, health settings), the practice setting (e.g., community agency, national employee assistance programs, office-based practice), and the human capital challenge of staff training and support.

In an approach that echoes themes of that early paper, “Specifying the Treatment Process: The Basis for Effectiveness Research” ( Rosen & Proctor, 1978 ), my work once again tackled the challenge of specifying a heretofore vague process—this time, not the intervention process, but the implementation process. As a first step, our team developed a taxonomy of implementation outcomes ( Proctor et al., 2011 ), which enables a direct test of whether a given intervention is adopted and delivered. Although this distinct type of outcome is overlooked in other types of research, it is the focus of implementation science. Explicit examination of implementation outcomes is key to an important research distinction. Often, evaluations yield disappointing results about an intervention, showing that the expected and desired outcomes are not attained. This might mean that the intervention was not effective. However, just as likely, it could mean that the intervention was not actually delivered, or was not delivered with fidelity. Implementation outcomes help identify the roadblocks on the way to intervention adoption and delivery.

Our 2011 taxonomy of implementation outcomes ( Proctor et al., 2011 ) became the framework for two national repositories of measures for implementation research: the Seattle Implementation Research Collaborative ( Lewis et al., 2015 ) and the National Institutes of Health GEM measures database ( Rabin et al., 2012 ). These repositories of implementation outcome measures seek to harmonize and increase the rigor of measurement in implementation science.

We also have developed taxonomies of implementation strategies ( Powell et al., 2012 ; Powell et al., 2015 ; Waltz et al., 2014 , 2015 ). Implementation strategies are interventions for system change—how organizations, communities, and providers can learn to deliver new and more effective practices ( Powell et al., 2012 ).

A conversation with a key practice leader stimulated my interest in implementation strategies. Shortly after our school endorsed an MSW curriculum emphasizing evidence-based practices, a pioneering CEO of a major social service agency in St. Louis met with me and asked,

Enola Proctor, I get the importance of delivering evidence based practices. My organization delivers over 20 programs and interventions, and I believe only a handful of them are really evidence based. I want to decrease our provision of ineffective care, and increase our delivery of evidence-based practices. But how? What are the evidence-based ways I, as an agency director, can transform my agency so that we can deliver evidence-based practices?

That agency director was asking a question of how . He was asking for evidence-based implementation strategies. Moving effective programs and practices into routine care settings requires the skillful use of implementation strategies, defined as systematic “methods or techniques used to enhance the adoption, implementation, and sustainability of a clinical program or practice into routine service” ( Proctor et al., 2013 , p. 2).

This question has shaped my work for the past 15 years, as well as the research priorities of several funding agencies, including the National Institutes of Health, the Agency for Healthcare Research and Quality, the Patient-Centered Outcomes Research Institute, and the World Health Organization. Indeed, a National Institutes of Health program announcement—Dissemination and Implementation Research in Health ( National Institutes of Health, 2016 )—identified the discovery of effective implementation strategies as a primary purpose of implementation science. To date, the implementation science literature cannot yet answer that important question, but we are making progress.

To identify implementation strategies, our teams first turned to the literature—a literature that we found to be scattered across a wide range of journals and disciplines. Most articles were not empirical, and most used widely differing terms to characterize implementation strategies. We conducted a structured literature review to generate common nomenclature and a taxonomy of implementation strategies. That review yielded 63 distinct implementation strategies, which fell into six groupings: planning, educating, financing, restructuring, managing quality, and attending to policy context ( Powell et al., 2012 ).

Our team refined that compilation, using Delphi techniques and concept mapping to develop conceptually distinct categories of implementation strategies ( Powell et al., 2015 ; Waltz et al., 2014 ). The refined compilation of 73 discrete implementation strategies was then further organized into nine clusters:

  • changing agency infrastructure,
  • using financial strategies,
  • supporting clinicians,
  • providing interactive assistance,
  • training and educating stakeholders,
  • adapting and tailoring interventions to context,
  • developing stakeholder relationships,
  • using evaluative and iterative strategies, and
  • engaging consumers.

These taxonomies of implementation strategies position the field for more robust research on implementation processes. The language used to describe implementation strategies has not yet “gelled” and has been described as a “Tower of Babel” ( McKibbon et al., 2010 ). Therefore, we also developed guidelines for reporting the components of strategies ( Proctor et al., 2013 ) so researchers and implementers would have more behaviorally specific information about what a strategy is, who does it, when, and for how long. The value of such reporting guidelines is illustrated in the work of Gold and colleagues (2016) .

What have we learned, through our own program of research on implementation strategies—the “how to” of improving practice? First, we have been able to identify from practice-based evidence the implementation strategies used most often. Using novel activity logs to track implementation strategies, Bunger and colleagues (2017) found that strategies such as quality improvement tools, using data experts, providing supervision, and sending clinical reminders were frequently used to facilitate delivery of behavioral health interventions within a child-welfare setting and were perceived by agency leadership as contributing to project success.

Second, reflecting the complexity of quality improvement processes, we have learned that there is no magic bullet (Powell, Proctor, & Glass, 2013). Our study of U.S. Department of Veterans Affairs clinics working to implement evidence-based hepatitis C treatment found that implementers used an average of 25 (plus or minus 14) different implementation strategies (Rogal et al., 2017). Moreover, the number of implementation strategies used was positively associated with the number of new treatment starts. These findings suggest that implementing new interventions requires considerable effort and resources.
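The kind of association reported there can be illustrated with a small Pearson-correlation sketch; the per-clinic numbers below are invented for illustration and are not the Rogal et al. data:

```python
import math

# Synthetic data: per-clinic number of implementation strategies used
# and number of new treatment starts (invented for illustration).
strategies = [11, 18, 25, 31, 39]
new_starts = [4, 9, 12, 18, 22]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

print(round(pearson(strategies, new_starts), 3))
```

A coefficient near 1 indicates that clinics deploying more strategies also started more patients on treatment, which is the shape of the association the study reported.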

To advance our understanding of the effectiveness of implementation strategies, our teams have conducted a systematic review (Powell et al., 2013), tested specific strategies, and captured practice-based evidence from on-the-ground implementers. Testing the effectiveness of implementation strategies has been identified as a top research priority by the IOM (2009). In work with Charles Glisson in St. Louis, our 15-agency randomized clinical trial found that an organizationally focused intervention—the availability, responsiveness, and continuity (ARC) model—improved agency culture and climate, stimulated more clinicians to enroll in evidence-based-practice training, and boosted the clinical effect sizes of various evidence-based practices (Glisson, Williams, Hemmelgarn, Proctor, & Green, 2016a, 2016b). And in a hospital critical care unit, the implementation strategies of developing a team, selecting and using champions, holding provider education sessions, and providing audit and feedback helped increase team adherence to phlebotomy guidelines (Steffen et al., in press).

We are also learning about the relative value of different strategies. Experts in implementation science and implementation practice identified as most important the strategies of “use evaluative and iterative approaches” and “train and educate stakeholders.” Reported as less helpful were strategies such as “access new funding streams” and “remind clinicians of practices to use” (Waltz et al., 2015). Successful implementers in Veterans Affairs clinics relied more heavily on strategies such as “change physical structures and equipment” and “facilitate relay of clinical data to providers” than did less successful implementers (Rogal et al., 2017).

Many strategies have yet to be investigated empirically, as has the role of dissemination and implementation organizations—organizations that promote, provide information about, provide training in, and scale up specific treatments. Most evidence-based practices used in behavioral health, including most listed on the Substance Abuse and Mental Health Services Administration’s National Registry of Evidence-based Programs and Practices, are disseminated and distributed by such organizations. Unlike drugs and devices, psychosocial interventions have no Food and Drug Administration-like delivery system. Kreuter and Casey (2012) urge better understanding and use of the intervention “delivery system,” or mechanisms to bring treatment discoveries to the attention of practitioners and into use in practice settings.

Implementation strategies have been shown to boost clinical effectiveness (Glisson et al., 2010), reduce staff turnover (Aarons, Sommerfeld, Hecht, Silovsky, & Chaffin, 2009), and help reduce disparities in care (Balicer et al., 2015).

Future directions: Research on implementation strategies

My work in implementation science has helped build intellectual capital for the rapidly growing field of dissemination and implementation science, leading teams that have distinguished and clearly defined implementation strategies, developed taxonomies, and stimulated more systematic work to advance conceptual, linguistic, and methodological clarity in the field. Yet many issues remain poorly understood. What strategies are used in usual implementation practice, by whom, and for which empirically supported interventions? What strategies are effective in which organizational and policy contexts? Which strategies are effective in attaining which specific implementation outcomes? For example, are the strategies that are effective for initial adoption also effective for scale-up, spread, and sustained use of interventions? Social workers have the skill set for roles as implementation facilitators, and refining packages of implementation strategies that are effective in social service and behavioral health settings could boost the visibility, scale, and impact of our work.

The Third Generation and Counting

Social work faces grand, often daunting challenges. We need to develop a more robust base of evidence about the effectiveness of interventions and make that evidence more relevant, accessible, and applicable to social work practitioners, whether they work in communities, agencies, policy arenas, or a host of novel settings. We need to advance measurement-based care so our value as a field is recognized. We need to know how to bring proven interventions to scale for population-level impact. We need to discover ways to build the capacity of social service agencies and the communities in which they reside. And we need to learn how to sustain advances in care once we achieve them (Proctor et al., 2015). Our challenges are indeed grand, far outstripping our resources.

So how dare we speak of a quality quest? Does it not seem audacious to seek the highest standards in caring for the most vulnerable, especially in an era when we face a new political climate that threatens vulnerable groups and promises to strip resources from health and social services? Members of our profession are underpaid, and most of our agencies lack the data infrastructure required for assessment and evaluation. Quality may be an audacious goal, but as social workers we can pursue no less. By virtue of our code of ethics, our commitment to equity, and our skills in intervening on multiple levels of systems and communities, social workers are ideally suited for advancing quality.

Who will conduct the needed research? Who will pioneer its translation into improved practice? Social work practice can be only as strong as its research base; the responsibility for developing that base, and hence improving practice, is lodged within social work research.

If my greatest challenge is pursuing this quest, my greatest joy is in mentoring the next generation for this work. My research mentoring has always been guided by the view that the ultimate purpose of research in the helping professions is the production and systematization of knowledge for use by practitioners (Rosen & Proctor, 1978). For 27 years, the National Institute of Mental Health has supported training in mental health services research based in the Center for Mental Health Services Research (Hasche, Perron, & Proctor, 2009; Proctor & McMillen, 2008). With my colleague John Landsverk, I am now in my sixth year leading the Implementation Research Institute, a training program for implementation science supported by the National Institute of Mental Health (Proctor et al., 2013). We have trained more than 50 social work, psychology, anthropology, and physician researchers in implementation science for mental health. With three more cohorts to go, we are working to assess what works in research training for implementation science. Using bibliometric analysis, we have learned that intensive training and mentoring increase research productivity in the form of published papers and grants that address how to implement evidence-based care in mental health and addictions. And through social network analysis, we have learned that every “dose” of mentoring increases scholarly collaboration when measured two years later (Luke, Baumann, Carothers, Landsverk, & Proctor, 2016).

As his student, I was privileged to learn lessons in mentoring from Aaron Rosen. He treated his students as colleagues, he invited them in to work on the most challenging of questions, and he pursued his work with joy. When he treated me as a colleague, I felt empowered. When he invited me to work with him on the field’s most vexing challenges, I felt inspired. And as he worked with joy, I learned that work pursued with joy doesn’t feel like work at all. And now the third, fourth, and fifth generations of social work researchers are pursuing tough challenges and the quality quest for social work practice. May seasoned and junior researchers work collegially and with joy, tackling the profession’s toughest research challenges, including the quest for high-quality social work services.

Acknowledgments

Preparation of this paper was supported by IRI (5R25MH0809160), Washington University ICTS (2UL1 TR000448-08), Center for Mental Health Services Research, Washington University in St. Louis, and the Center for Dissemination and Implementation, Institute for Public Health, Washington University in St. Louis.

This invited article is based on the 2017 Aaron Rosen Lecture presented by Enola Proctor at the Society for Social Work and Research 21st Annual Conference—“Ensure Healthy Development for All Youth”—held January 11–15, 2017, in New Orleans, LA. The annual Aaron Rosen Lecture features distinguished scholars who have accumulated a body of significant and innovative scholarship relevant to practice, the research base for practice, or effective utilization of research in practice.

  • Aarons GA, Sommerfeld DH, Hecht DB, Silovsky JF, Chaffin MJ. The impact of evidence-based practice implementation and fidelity monitoring on staff turnover: Evidence for a protective effect. Journal of Consulting and Clinical Psychology. 2009;77(2):270–280. https://doi.org/10.1037/a0013223
  • American Academy of Social Work and Social Welfare (AASWSW). (n.d.). Grand challenges for social work. Retrieved from http://aaswsw.org/grand-challenges-initiative/
  • Balicer RD, Hoshen M, Cohen-Stavi C, Shohat-Spitzer S, Kay C, Bitterman H, Shadmi E. Sustained reduction in health disparities achieved through targeted quality improvement: One-year follow-up on a three-year intervention. Health Services Research. 2015;50:1891–1909. https://doi.org/10.1111/1475-6773.12300
  • Bunger AC, Powell BJ, Robertson HA, MacDowell H, Birken SA, Shea C. Tracking implementation strategies: A description of a practical approach and early findings. Health Research Policy and Systems. 2017;15(15):1–12. https://doi.org/10.1186/s12961-017-0175-y
  • Evans SW, Koch JR, Brady C, Meszaros P, Sadler J. Community and school mental health professionals’ knowledge and use of evidence based substance use prevention programs. Administration and Policy in Mental Health and Mental Health Services Research. 2013;40(4):319–330. https://doi.org/10.1007/s10488-012-0422-z
  • Fraser MW. Intervention research in social work: Recent advances and continuing challenges. Research on Social Work Practice. 2004;14(3):210–222. https://doi.org/10.1177/1049731503262150
  • Glisson C, Schoenwald SK, Hemmelgarn A, Green P, Dukes D, Armstrong KS, Chapman JE. Randomized trial of MST and ARC in a two-level evidence-based treatment implementation strategy. Journal of Consulting and Clinical Psychology. 2010;78(4):537–550. https://doi.org/10.1037/a0019160
  • Glisson C, Williams NJ, Hemmelgarn A, Proctor EK, Green P. Increasing clinicians’ EBT exploration and preparation behavior in youth mental health services by changing organizational culture with ARC. Behaviour Research and Therapy. 2016a;76:40–46. https://doi.org/10.1016/j.brat.2015.11.008
  • Glisson C, Williams NJ, Hemmelgarn A, Proctor EK, Green P. Aligning organizational priorities with ARC to improve youth mental health service outcomes. Journal of Consulting and Clinical Psychology. 2016b;84(8):713–725. https://doi.org/10.1037/ccp0000107
  • Gold R, Bunce AE, Cohen DJ, Hollombe C, Nelson CA, Proctor EK, DeVoe JE. Reporting on the strategies needed to implement proven interventions: An example from a “real-world” cross-setting implementation study. Mayo Clinic Proceedings. 2016;91(8):1074–1083. https://doi.org/10.1016/j.mayocp.2016.03.014
  • Greenfield S. Clinical practice guidelines: Expanded use and misuse. Journal of the American Medical Association. 2017;317(6):594–595. https://doi.org/10.1001/jama.2016.19969
  • Hasche L, Perron B, Proctor E. Making time for dissertation grants: Strategies for social work students and educators. Research on Social Work Practice. 2009;19(3):340–350. https://doi.org/10.1177/1049731508321559
  • Institute of Medicine (IOM), Committee on Comparative Effectiveness Research Prioritization. Initial national priorities for comparative effectiveness research. Washington, DC: The National Academies Press; 2009.
  • Institute of Medicine (IOM). Clinical practice guidelines we can trust. Washington, DC: The National Academies Press; 2011.
  • Institute of Medicine (IOM). Psychosocial interventions for mental and substance use disorders: A framework for establishing evidence-based standards. Washington, DC: The National Academies Press; 2015. https://doi.org/10.17226/19013
  • Jaccard J. The prevention of problem behaviors in adolescents and young adults: Perspectives on theory and practice. Journal of the Society for Social Work and Research. 2016;7(4):585–613. https://doi.org/10.1086/689354
  • Kessler RC, Chiu WT, Demler O, Merikangas KR, Walters EE. Prevalence, severity, and comorbidity of 12-month DSM-IV disorders in the National Comorbidity Survey Replication. Archives of General Psychiatry. 2005;62(6):617–627. https://doi.org/10.1001/archpsyc.62.6.617
  • Kohl PL, Schurer J, Bellamy JL. The state of parent training: Program offerings and empirical support. Families in Society: The Journal of Contemporary Social Services. 2009;90(3):248–254. https://doi.org/10.1606/1044-3894.3894
  • Kreuter MW, Casey CM. Enhancing dissemination through marketing and distribution systems: A vision for public health. In: Brownson R, Colditz G, Proctor E, editors. Dissemination and implementation research in health: Translating science to practice. New York, NY: Oxford University Press; 2012.
  • Lewis CC, Stanick CF, Martinez RG, Weiner BJ, Kim M, Barwick M, Comtois KA. The Society for Implementation Research Collaboration instrument review project: A methodology to promote rigorous evaluation. Implementation Science. 2015;10(2):1–18. https://doi.org/10.1186/s13012-014-0193-x
  • Lohr KN. Medicare: A strategy for quality assurance. Vol. I. Washington, DC: National Academies Press; 1990.
  • Luke D, Baumann A, Carothers B, Landsverk J, Proctor EK. Forging a link between mentoring and collaboration: A new training model for implementation science. Implementation Science. 2016;11(137):1–12. https://doi.org/10.1186/s13012-016-0499-y
  • McKibbon KA, Lokker C, Wilczynski NL, Ciliska D, Dobbins M, Davis DA, Straus SS. A cross-sectional study of the number and frequency of terms used to refer to knowledge translation in a body of health literature in 2006: A tower of Babel? Implementation Science. 2010;5(16). https://doi.org/10.1186/1748-5908-5-16
  • McMillen JC, Proctor EK, Megivern D, Striley CW, Cabassa LJ, Munson MR, Dickey B. Quality of care in the social services: Research agenda and methods. Social Work Research. 2005;29(3):181–191. https://doi.org/10.1093/swr/29.3.181
  • McMillen JC, Raffol M. Characterizing the quality workforce in private U.S. child and family behavioral health agencies. Administration and Policy in Mental Health and Mental Health Services Research. 2016;43(5):750–759. https://doi.org/10.1007/s10488-015-0667-4
  • Megivern DA, McMillen JC, Proctor EK, Striley CW, Cabassa LJ, Munson MR. Quality of care: Expanding the social work dialogue. Social Work. 2007;52(2):115–124. https://doi.org/10.1093/sw/52.2.115
  • Merikangas KR, He J, Burstein M, Swendsen J, Avenevoli S, Case B, Olfson M. Service utilization for lifetime mental disorders in U.S. adolescents: Results of the National Comorbidity Survey Adolescent Supplement (NCS-A). Journal of the American Academy of Child and Adolescent Psychiatry. 2011;50(1):32–45. https://doi.org/10.1016/j.jaac.2010.10.006
  • National Institute of Mental Health. Psychosocial research at NIMH: A primer. 2016. Retrieved from https://www.nimh.nih.gov/research-priorities/psychosocial-research-at-nimh-a-primer.shtml
  • National Institutes of Health. Dissemination and implementation research in health (R01). 2016, September 14. Retrieved from https://archives.nih.gov/asites/grants/09-14-2016/grants/guide/pa-files/PAR-16-238.html
  • National Task Force on Evidence-Based Practice in Social Work. Unpublished recommendations to the Social Work Leadership Roundtable. 2016.
  • Person SD, Allison JJ, Kiefe CI, Weaver MT, Williams OD, Centor RM, Weissman NW. Nurse staffing and mortality for Medicare patients with acute myocardial infarction. Medical Care. 2004;42(1):4–12. https://doi.org/10.1097/01.mlr.0000102369.67404.b0
  • Powell BJ, McMillen C, Proctor EK, Carpenter CR, Griffey RT, Bunger AC, York JL. A compilation of strategies for implementing clinical innovations in health and mental health. Medical Care Research and Review. 2012;69(2):123–157. https://doi.org/10.1177/1077558711430690
  • Powell BJ, Proctor EK, Glass JE. A systematic review of strategies for implementing empirically supported mental health interventions. Research on Social Work Practice. 2013;24(2):192–212. https://doi.org/10.1177/1049731513505778
  • Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, Kirchner JE. A refined compilation of implementation strategies: Results from the Expert Recommendations for Implementing Change (ERIC) project. Implementation Science. 2015;10(21):1–14. https://doi.org/10.1186/s13012-015-0209-1
  • Proctor EK, Knudsen KJ, Fedoravicius N, Hovmand P, Rosen A, Perron B. Implementation of evidence-based practice in community behavioral health: Agency director perspectives. Administration and Policy in Mental Health and Mental Health Services Research. 2007;34(5):479–488. https://doi.org/10.1007/s10488-007-0129-8
  • Proctor EK, Landsverk J, Baumann AA, Mittman BS, Aarons GA, Brownson RC, Chambers D. The Implementation Research Institute: Training mental health implementation researchers in the United States. Implementation Science. 2013;8(105):1–12. https://doi.org/10.1186/1748-5908-8-105
  • Proctor EK, Luke D, Calhoun A, McMillen C, Brownson R, McCrary S, Padek M. Sustainability of evidence-based healthcare: Research agenda, methodological advances, and infrastructure support. Implementation Science. 2015;10(88):1–13. https://doi.org/10.1186/s13012-015-0274-5
  • Proctor EK, McMillen JC. Quality of care. In: Mizrahi T, Davis L, editors. Encyclopedia of Social Work. 20th ed. Washington, DC, and New York, NY: NASW Press and Oxford University Press; 2008. https://doi.org/10.1093/acrefore/9780199975839.013.33
  • Proctor EK, Powell BJ, McMillen CJ. Implementation strategies: Recommendations for specifying and reporting. Implementation Science. 2013;8(139):1–11. https://doi.org/10.1186/1748-5908-8-139
  • Proctor EK, Rosen A. From knowledge production to implementation: Research challenges and imperatives. Research on Social Work Practice. 2008;18(4):285–291. https://doi.org/10.1177/1049731507302263
  • Proctor EK, Rosen A, Rhee C. Outcomes in social work practice. Social Work Research & Evaluation. 2002;3(2):109–125.
  • Proctor EK, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, Hensley M. Outcomes for implementation research: Conceptual distinctions, measurement challenges, and research agenda. Administration and Policy in Mental Health and Mental Health Services Research. 2011;38(2):65–76. https://doi.org/10.1007/s10488-010-0319-7
  • Rabin BA, Purcell P, Naveed S, Moser RP, Henton MD, Proctor EK, Glasgow RE. Advancing the application, quality and harmonization of implementation science measures. Implementation Science. 2012;7(119):1–11. https://doi.org/10.1186/1748-5908-7-119
  • Rogal SS, Yakovchenko V, Waltz TJ, Powell BJ, Kirchner JE, Proctor EK, Chinman MJ. The association between implementation strategy use and the uptake of hepatitis C treatment in a national sample. Implementation Science. 2017;12(60). https://doi.org/10.1186/s13012-017-0588-6
  • Rosen A, Proctor EK. Specifying the treatment process: The basis for effectiveness research. Journal of Social Service Research. 1978;2(1):25–43. https://doi.org/10.1300/J079v02n01_04
  • Rosen A, Proctor EK. Distinctions between treatment outcomes and their implications for treatment evaluation. Journal of Consulting and Clinical Psychology. 1981;49(3):418–425. https://doi.org/10.1037/0022-006X.49.3.418
  • Rosen A, Proctor EK, Staudt M. Targets of change and interventions in social work: An empirically-based prototype for developing practice guidelines. Research on Social Work Practice. 2003;13(2):208–233. https://doi.org/10.1177/1049731502250496
  • Steffen K, Doctor A, Hoerr J, Gill J, Markham C, Riley S, Spinella P. Controlling phlebotomy volume diminishes PICU transfusion: Implementation processes and impact. Pediatrics. (in press).
  • Videka L. Accounting for variability in client, population, and setting characteristics: Moderators of intervention effectiveness. In: Rosen A, Proctor EK, editors. Developing practice guidelines for social work intervention: Issues, methods, and research agenda. New York, NY: Columbia University Press; 2003. pp. 169–192.
  • Waltz TJ, Powell BJ, Chinman MJ, Smith JL, Matthieu MM, Proctor EK, Kirchner JE. Expert Recommendations for Implementing Change (ERIC): Protocol for a mixed methods study. Implementation Science. 2014;9(39):1–12. https://doi.org/10.1186/1748-5908-9-39
  • Waltz TJ, Powell BJ, Matthieu MM, Damschroder LJ, Chinman MJ, Smith JL, Kirchner JE. Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: Results from the Expert Recommendations for Implementing Change (ERIC) study. Implementation Science. 2015;10(109):1–8. https://doi.org/10.1186/s13012-015-0295-0
  • Wang PS, Lane M, Olfson M, Pincus HA, Wells KB, Kessler RC. Twelve-month use of mental health services in the United States: Results from the National Comorbidity Survey Replication. Archives of General Psychiatry. 2005;62(6):629–640. https://doi.org/10.1001/archpsyc.62.6.629
  • Wang PS, Demler O, Kessler RC. Adequacy of treatment for serious mental illness in the United States. American Journal of Public Health. 2002;92(1):92–98. https://doi.org/10.2105/AJPH.92.1.92
  • Zayas L. Service delivery factors in the development of practice guidelines. In: Rosen A, Proctor EK, editors. Developing practice guidelines for social work intervention: Issues, methods, and research agenda. New York, NY: Columbia University Press; 2003. pp. 169–192. https://doi.org/10.7312/rose12310-010
Practice research methods in social work: Processes, applications and implications for social service organisations


Bowen McBeath, Michael J Austin, Sarah Carnochan, Emmeline Chuang, Practice research methods in social work: Processes, applications and implications for social service organisations, The British Journal of Social Work , Volume 52, Issue 6, September 2022, Pages 3328–3346, https://doi.org/10.1093/bjsw/bcab246

Although social work research is commonly rooted within social service settings, it can be difficult for social work researchers and practitioners to develop and sustain participatory studies that specifically promote knowledge sharing and service improvement involving organisational practice. One participatory approach is practice research (PR), which involves social work researchers and practitioners collaborating to define, understand and try to improve the delivery of health and social care services and organisational structures and processes. The two goals of this commentary are to introduce essential methods and approaches to PR and to identify points of connection involving PR and social service organisational studies. Our specific focus on PR in statutory, voluntary and private social service organisations reflects efforts to connect practice, theory and qualitative and quantitative research methods to develop and share organisationally-situated knowledge.


The link between social work research and practice

When thinking about social work, some may assume the field focuses solely on clinical interventions with individuals or groups.

This can create the mistaken impression that research is not part of the social work profession. In fact, practice and research have long been intertwined, and they need to remain so.

This guide covers why social workers should care about research, how social work practice and social work research influence and guide each other, how to build research skills as a student and as a professional working in the field, and the benefits of being a social worker with strong research skills.

A selection of social work research jobs is also discussed.

  • Social workers and research
  • Evidence-based practice
  • Practice and research
  • Research and practice
  • Build research skills
  • Social worker as researcher
  • Benefits of research skills
  • Research jobs

Why should social workers care about research?

Sometimes it may seem as though social work practice and social work research are two separate tracks running parallel to each other – they both seek to improve the lives of clients, families and communities, but they don’t interact. This is not the way it is supposed to work.

Research and practice should be intertwined, with each informing the other and improving processes on both ends, leading to better outcomes for the populations we serve.

Section 5 of the NASW Social Work Code of Ethics is focused on social workers’ ethical responsibilities to the social work profession. There are two areas in which research is mentioned in upholding our ethical obligations: for the integrity of the profession (section 5.01) and for evaluation and research (section 5.02). 

Some of the specific guidance provided around research and social work includes:

  • 5.01(b): …Social workers should protect, enhance, and improve the integrity of the profession through appropriate study and research, active discussion, and responsible criticism of the profession.
  • 5.01(d): Social workers should contribute to the knowledge base of social work and share with colleagues their knowledge related to practice, research, and ethics…
  • 5.02(a) Social workers should monitor and evaluate policies, the implementation of programs, and practice interventions.
  • 5.02(b) Social workers should promote and facilitate evaluation and research to contribute to the development of knowledge.
  • 5.02(c) Social workers should critically examine and keep current with emerging knowledge relevant to social work and fully use evaluation and research evidence in their professional practice.
  • 5.02(q) Social workers should educate themselves, their students, and their colleagues about responsible research practices.

Evidence-based practice and evidence-based treatment

To strengthen the profession and determine whether the interventions we provide are, in fact, effective, we must conduct research. When research and practice are intertwined, practitioners develop evidence-based practice (EBP) and evidence-based treatment (EBT).

Evidence-based practice is, according to the National Association of Social Workers (NASW), a process involving creating an answerable question based on a client or organizational need, locating the best available evidence to answer the question, evaluating the quality of the evidence as well as its applicability, applying the evidence, and evaluating the effectiveness and efficiency of the solution.

Evidence-based treatment is any practice that has been established as effective through scientific research according to a set of explicit criteria (Drake et al., 2001). These are interventions that, when applied consistently, routinely produce improved client outcomes. 

For example, Cognitive Behavioral Therapy (CBT) was once just one of a variety of interventions offered to those with anxiety disorders. Researchers wondered whether CBT was better than the other options at producing positive, consistent results for clients.

Research was conducted comparing multiple types of interventions, and the evidence (research results) demonstrated that CBT was the most effective intervention.

Anecdotal evidence from practice, combined with research evidence, established CBT as a standard treatment for those diagnosed with anxiety. Now more social workers are being trained in CBT methods so they can offer it as a treatment option to their clients.

How does social work practice affect research?

Social work practice provides the context and content for research. For example, staff at one agency were concerned about the lack of nutritious food in their service area. Clients reported that it was too hard to get to a grocery store with a variety of foods: they didn't have transportation, or public transit took too long.

So the agency applied for and received a grant to start a farmer’s market in their community, an urban area that was considered a food desert. This program accepted their state’s version of food stamps as a payment option for the items sold at the farmer’s market.

The agency used their passenger van to provide free transportation to and from the farmer’s market for those living more than four blocks from the market location.

The local university also had a booth each week at the market with nursing and medical students checking blood pressure and providing referrals to community agencies that could assist with medical needs. The agency was excited to improve the health of its clients by offering this program.

But how does the granting foundation know whether this was a good use of its money? This is where research and evaluation come in. Research could gather data to answer a number of questions. Here is but a small sample:

  • How many community members visited each week and purchased fruits and vegetables? 
  • How many took advantage of the transportation provided, and how many walked to the market? 
  • How many took advantage of the blood pressure checks? Were improvements seen in those numbers for those having repeat blood pressure readings throughout the market season? 
  • How much did the self-reported fruit and vegetable intake increase for customers? 
  • What barriers did community members report in visiting and buying food from the market (e.g., prices too high, inconvenient hours)?
  • Do community members want the program to continue next year?
  • Was the program cost-effective, or did it waste money by paying for a driver and for gasoline to offer free transportation that wasn’t utilized? What are areas where money could be saved without compromising the quality of the program?
  • What else needs to be included in this program to help improve the health of community members?
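As a sketch of how the cost-effectiveness question above might be approached, the snippet below computes a simple cost-per-ride figure for the free transportation. Every number here is invented for illustration; a real evaluation would pull these figures from the agency's budget and van log.

```python
# Hypothetical cost-per-ride calculation for the farmer's market van.
# All figures are invented for illustration only.

driver_cost_per_week = 120.00   # assumed driver wage per market day
fuel_cost_per_week = 35.00      # assumed gasoline cost per market day
weeks = 20                      # assumed length of the market season
total_rides = 480               # assumed rides recorded in the van log

total_cost = (driver_cost_per_week + fuel_cost_per_week) * weeks
cost_per_ride = total_cost / total_rides

print(f"Total transportation cost: ${total_cost:,.2f}")
print(f"Cost per ride: ${cost_per_ride:.2f}")
```

A figure like this could then be weighed against ridership and health outcomes to judge whether the transportation component earned its keep.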

How does research affect social work practice?

Research can guide practice to implement proven strategies. It can also ask the 'what if' or 'how about' questions that open doors for new, innovative interventions to be developed (whose effectiveness can then be researched in turn).

Engel and Schutt (2017) describe four categories of research used in social work:

  • Descriptive research is research in which social phenomena are defined and described. A descriptive research question would be ‘How many homeless women with substance use disorder live in the metro area?’
  • Exploratory research seeks to find out how people get along in the setting under question, what meanings they give to their actions, and what issues concern them. An example research question would be ‘What are the barriers to homeless women with substance use disorder receiving treatment services?’
  • Explanatory research seeks to identify causes and effects of social phenomena. It can be used to rule out other explanations for findings and show how two events are related to each other.  An explanatory research question would be ‘Why do women with substance use disorder become homeless?’
  • Evaluation research describes or identifies the impact of social programs and policies. This type of research question could be ‘How effective was XYZ treatment-first program that combined housing and required drug/alcohol abstinence in keeping women with substance use disorder in stable housing 2 years after the program ended?’

Each of the above types of research can answer important questions about the population, setting or intervention being studied. This can help practitioners determine which option is most effective, most cost-efficient or most likely to be adhered to by clients. In turn, these data allow social workers to make informed choices about what to keep in their practice and what needs changing.

How to build research skills while in school

There are a number of ways to build research skills while you're a student. BSW and MSW programs require a research course, but there are other ways to develop these skills beyond a single class:

  • Volunteer to help a professor working in an area of interest. Professors are often excited to share their knowledge and receive extra assistance from students with similar interests.
  • Participate in student research projects where you’re the subject. These are most often found in psychology departments. You can learn a lot about the informed consent process and how data is collected by volunteering as a research participant.  Many of these studies also pay a small amount, so it’s an easy way to earn a bit of extra money while you’re on campus. 
  • Create an independent study research project as an elective and work with a professor who is an expert in an area you’re interested in.  You’d design a research study, collect the data, analyze it, and write a report or possibly even an article you can submit to an academic journal.
  • Some practicum programs will have you complete a small evaluation project or assist with a larger research project as part of your field education hours. 
  • In MSW programs, some professors hire students to conduct interviews or enter data on their funded research projects. This could be a good part time job while in school.
  • Research assistant positions are more common in MSW programs, and these pay for some or all your tuition in exchange for working a set number of hours per week on a funded research project.

How to build research skills while working as a social worker

Social service agencies are often understaffed, with more projects to complete than people to complete them.

Taking the initiative helps: volunteer to survey clients about what they want and need, conduct an evaluation of a program, or find data that was collected but never analyzed, review it, and write up a report. Doing so can help you stand out from your peers and be appreciated by management and other staff, and it may even lead to a raise, a promotion, or new job opportunities because of the skills you've developed.

Benefits of being a social worker with strong research skills

Social workers with strong research skills have the opportunity to work on a variety of projects and at higher levels of responsibility.

Many are promoted into administration-level positions after demonstrating that they can conduct, interpret and report research findings and apply those findings to improving the agency and its programs.

There’s also a level of confidence knowing you’re implementing proven strategies with your clients. 

Social work research jobs

There are a number of ways in which you can blend interests in social work and research. A quick search on Glassdoor.com and Indeed.com retrieved the following positions related to social work research:

  • Research Coordinator on a clinical trial offering psychosocial supportive interventions and non-addictive pain treatments to minimize opioid use for pain
  • Senior Research Associate leading and overseeing research on a suite of projects offered in housing, mental health and corrections
  • Research Fellow in a school of social work
  • Project Policy Analyst for a large health organization
  • Health Educator/Research Specialist to implement and evaluate cancer prevention and screening programs for a health department
  • Research Interventionist providing Cognitive Behavioral Therapy for insomnia patients participating in a clinical trial
  • Research Associate for Child Care and Early Education
  • Social Services Data Researcher for an organization serving adults with disabilities
  • Director of Community Health Equity Research Programs evaluating health disparities

No matter your population or area of interest, you'll likely be able to find a position that integrates research and social work.

Social work practice and research are and should remain intertwined. This is the only way we can know what questions to ask about the programs and services we are providing, and ensure our interventions are effective. 

There are many opportunities to develop research skills while in school and while working in the field, and these skills can lead to some interesting positions that can make a real difference to clients, families and communities. 

Drake, R. E., Goldman, H., Leff, H. S., Lehman, A. F., Dixon, L., Mueser, K. T., et al. (2001). Implementing evidence-based practices in routine mental health service settings. Psychiatric Services, 52(2), 179-182.

Engel, R. J., & Schutt, R. K. (2017). The practice of research in social work. Sage.

National Association of Social Workers. (n.d.). Evidence-based practice. https://www.socialworkers.org/News/Research-Data/Social-Work-Policy-Research/Evidence-Based-Practice

Social Work Research Methods That Drive the Practice


Social workers advocate for the well-being of individuals, families and communities. But how do social workers know what interventions are needed to help an individual? How do they assess whether a treatment plan is working? What do social workers use to write evidence-based policy?

Social work involves research-informed practice and practice-informed research. At every level, social workers need to know objective facts about the populations they serve, the efficacy of their interventions and the likelihood that their policies will improve lives. A variety of social work research methods make that possible.

Data-Driven Work

Data is a collection of facts used for reference and analysis. In a field as broad as social work, data comes in many forms.

Quantitative vs. Qualitative

As with any research, social work research involves both quantitative and qualitative studies.

Quantitative Research

Quantitative data — facts that can be measured and expressed numerically — are crucial for social work. Answers to questions like the following can help social workers know about the populations they serve — or hope to serve in the future.

  • How many students currently receive reduced-price school lunches in the local school district?
  • How many hours per week does a specific individual consume digital media?
  • How frequently did community members access a specific medical service last year?

Quantitative research has advantages for social scientists. Such research can be more generalizable to large populations, as it uses specific sampling methods and lends itself to large datasets. It can provide important descriptive statistics about a specific population. Furthermore, by operationalizing variables, it can help social workers easily compare similar datasets with one another.
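Descriptive statistics of the kind mentioned above can be computed with nothing more than Python's standard library. A minimal sketch, using a hypothetical dataset of weekly digital media hours reported by ten survey respondents:

```python
# Descriptive statistics for a small, invented quantitative dataset:
# weekly hours of digital media use reported by ten respondents.
import statistics

hours_per_week = [14, 21, 7, 28, 10, 35, 18, 21, 12, 24]

print("n     :", len(hours_per_week))
print("mean  :", statistics.mean(hours_per_week))    # central tendency
print("median:", statistics.median(hours_per_week))  # robust to outliers
print("stdev :", round(statistics.stdev(hours_per_week), 2))  # spread
```

Reporting several statistics together (not just the mean) gives a fuller description of how a variable is distributed across a sample.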

Qualitative Research

Qualitative data — facts that cannot be measured or expressed in terms of mere numbers or counts — offer rich insights into individuals, groups and societies. Such data can be collected via interviews and observations, for example in response to questions like these:

  • What attitudes do students have toward the reduced-price school lunch program?
  • What strategies do individuals use to moderate their weekly digital media consumption?
  • What factors made community members more or less likely to access a specific medical service last year?

Qualitative research can thereby provide a textured view of social contexts and systems that quantitative methods alone may not capture. It may also suggest new lines of inquiry for social work research.

Mixed Methods Research

Combining quantitative and qualitative methods into a single study is known as mixed methods research. This form of research has gained popularity in the study of social sciences, according to a 2019 report in the academic journal Theory and Society. Since quantitative and qualitative methods answer different questions, merging them into a single study can balance the limitations of each and potentially produce more in-depth findings.

However, mixed methods research is not without its drawbacks. Combining research methods increases the complexity of a study and generally requires a higher level of expertise to collect, analyze and interpret the data. It also requires a greater level of effort, time and often money.

The Importance of Research Design

Data-driven practice plays an essential role in social work. Unlike philanthropists and altruistic volunteers, social workers are obligated to operate from a scientific knowledge base.

To know whether their programs are effective, social workers must conduct research to determine results, aggregate those results into comprehensible data, analyze and interpret their findings, and use evidence to justify next steps.

Employing the proper design ensures that any evidence obtained during research enables social workers to reliably answer their research questions.

Research Methods in Social Work

The various social work research methods have specific benefits and limitations determined by context. Common research methods include surveys, program evaluations, needs assessments, randomized controlled trials, descriptive studies and single-system designs.

Surveys

Surveys involve a hypothesis and a series of questions designed to test it. Social work researchers send out a survey, receive responses, aggregate the results, analyze the data and form conclusions based on trends.

Surveys are one of the most common research methods social workers use — and for good reason. They tend to be relatively simple and are usually affordable. However, surveys generally require large participant groups, and self-reports from survey respondents are not always reliable.
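The aggregate-and-analyze step described above can be sketched in a few lines. The responses below are invented, and tallying Likert-scale answers is just one illustrative way to summarize survey data:

```python
# Tally hypothetical Likert-scale responses to a single survey question
# and report the overall share of agreement. Responses are invented.
from collections import Counter

responses = [
    "agree", "strongly agree", "neutral", "agree", "disagree",
    "agree", "strongly agree", "neutral", "agree", "strongly disagree",
]

tally = Counter(responses)
agree_share = (tally["agree"] + tally["strongly agree"]) / len(responses)

print(tally)
print(f"Agreement: {agree_share:.0%}")
```

From a summary like this, a researcher would look for trends across questions and subgroups before drawing conclusions.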

Program Evaluations

Social workers ally with all sorts of programs: after-school programs, government initiatives, nonprofit projects and private programs, for example.

Crucially, social workers must evaluate a program’s effectiveness in order to determine whether the program is meeting its goals and what improvements can be made to better serve the program’s target population.

Evidence-based programming helps everyone save money and time, and comparing programs with one another can help social workers make decisions about how to structure new initiatives. Evaluating programs becomes complicated, however, when programs have multiple goal metrics, some of which may be vague or difficult to assess (e.g., “we aim to promote the well-being of our community”).

Needs Assessments

Social workers use needs assessments to identify services and necessities that a population lacks access to.

Common social work populations that researchers may perform needs assessments on include:

  • People in a specific income group
  • Everyone in a specific geographic region
  • A specific ethnic group
  • People in a specific age group

In the field, a social worker may use a combination of methods (e.g., surveys and descriptive studies) to learn more about a specific population or program. Social workers look for gaps between the actual context and a population’s or individual’s “wants” or desires.

For example, a social worker could conduct a needs assessment with an individual with cancer trying to navigate the complex medical-industrial system. The social worker may ask the client questions about the number of hours they spend scheduling doctor’s appointments, commuting and managing their many medications. After learning more about the specific client needs, the social worker can identify opportunities for improvements in an updated care plan.

In policy and program development, social workers conduct needs assessments to determine where and how to effect change on a much larger scale. Integral to social work at all levels, needs assessments reveal crucial information about a population’s needs to researchers, policymakers and other stakeholders. Needs assessments may fall short, however, in revealing the root causes of those needs (e.g., structural racism).

Randomized Controlled Trials

Randomized controlled trials are studies in which participants are randomly assigned either to a group that receives a variable (e.g., a specific stimulus or treatment) or to a control group that does not. Social workers then measure and compare the results of the treatment group with those of the control group to glean insights about the effectiveness of a particular intervention or treatment.

Randomized controlled trials are reproducible and highly measurable, which makes them useful when results are easily quantifiable. The method is less helpful when results resist quantification, for instance when rich data such as narratives and on-the-ground observations are needed.
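The comparison logic behind a randomized controlled trial can be illustrated with a toy simulation. Everything here is hypothetical: participants are randomly assigned to two groups, outcome scores are simulated rather than collected, and the analysis is reduced to a simple difference in group means:

```python
# Toy simulation of random assignment and group comparison in an RCT.
# All participants and outcome scores are invented for illustration.
import random
import statistics

random.seed(42)  # fixed seed so the assignment is reproducible

participants = [f"p{i}" for i in range(20)]
random.shuffle(participants)                       # random assignment
treatment, control = participants[:10], participants[10:]

# Simulated post-intervention outcome scores (higher = better).
scores = {p: random.gauss(60, 8) for p in control}
scores.update({p: random.gauss(68, 8) for p in treatment})

diff = (statistics.mean(scores[p] for p in treatment)
        - statistics.mean(scores[p] for p in control))
print(f"Mean difference (treatment - control): {diff:.1f}")
```

A real trial would add a significance test and effect-size estimate, but the core idea is the same: randomization lets the group difference be attributed to the intervention.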

Descriptive Studies

Descriptive studies immerse the researcher in another context or culture to study specific participant practices or ways of living. Descriptive studies, including descriptive ethnographic studies, may overlap with and include other research methods:

  • Informant interviews
  • Census data
  • Observation

By using descriptive studies, researchers may glean a richer, deeper understanding of a nuanced culture or group on-site. The main limitations of this research method are that it tends to be time-consuming and expensive.

Single-System Designs

Unlike most medical studies, which involve testing a drug or treatment on two groups — an experimental group that receives the drug/treatment and a control group that does not — single-system designs allow researchers to study just one group (e.g., an individual or family).

Single-system designs typically entail studying a single group over a long period of time and may involve assessing the group’s response to multiple variables.

For example, consider a study on how media consumption affects a person’s mood. One way to test a hypothesis that consuming media correlates with low mood would be to observe two groups: a control group (no media) and an experimental group (two hours of media per day). When employing a single-system design, however, researchers would observe a single participant as they watch two hours of media per day for one week and then four hours per day of media the next week.

These designs allow researchers to test multiple variables over a longer period of time. However, similar to descriptive studies, single-system designs can be fairly time-consuming and costly.
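The two-phase media example above can be sketched as a simple AB-design comparison. The daily mood ratings are invented for illustration:

```python
# Single-system (AB-phase) sketch: one participant's daily mood rating
# (1-10) across two one-week phases. All ratings are invented.
import statistics

phase_a = [6, 7, 6, 5, 7, 6, 6]  # week 1: two hours of media per day
phase_b = [5, 4, 5, 4, 3, 5, 4]  # week 2: four hours of media per day

mean_a = statistics.mean(phase_a)
mean_b = statistics.mean(phase_b)

print(f"Phase A mean mood: {mean_a:.2f}")
print(f"Phase B mean mood: {mean_b:.2f}")
print(f"Change across phases: {mean_b - mean_a:+.2f}")
```

In practice, researchers would also inspect the day-by-day trend within each phase, since a phase-mean comparison alone can hide drift or delayed effects.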

Learn More About Social Work Research Methods

Social workers have the opportunity to improve the social environment by advocating for the vulnerable — including children, older adults and people with disabilities — and facilitating and developing resources and programs.

Learn more about how you can earn your Master of Social Work online at Virginia Commonwealth University. The highest-ranking school of social work in Virginia, VCU offers a wide range of courses online, which means students can earn their degrees with the flexibility of learning at home. Take your career in social work further with VCU.


Sources

Gov.uk, Mixed Methods Study

MVS Open Press, Foundations of Social Work Research

Open Social Work Education, Scientific Inquiry in Social Work

Open Social Work, Graduate Research Methods in Social Work: A Project-Based Approach

Routledge, Research for Social Workers: An Introduction to Methods

SAGE Publications, Research Methods for Social Work: A Problem-Based Approach

Theory and Society, Mixed Methods Research: What It Is and What It Could Be



Field Educator: A Scholarly Journal from the Simmons University School of Social Work

  • ISSN 2165-3038
Integrating Practice Research into Social Work Field Education

Volume 11.1 | Spring 2021 | Field Scholar

Social work field education is considered a key element of social work education. While many field education placements traditionally have focused on teaching practice-based skills and integrating theory into practice, it is also critical to incorporate research into social work practice and field education. This article discusses how practice research can be integrated into social work field education by drawing upon a training module designed for this purpose by the Transforming the Field Education Landscape (TFEL) partnership. Implications and recommendations for practice research and field educators are provided.

Keywords: field education; practice research; social work; practicum/internship

Field education, also referred to as the signature pedagogy of social work (Council on Social Work Education, 2015), commonly accompanies coursework to enable students to connect classroom theory with practice in professional “on-the-ground” experiences (Bogo & Sewell, 2019). While field placements, practica, and internships are utilized in various disciplines, they play a critical role in social work by preparing students to provide effective services in various settings (Bogo, 2010). Within the social work field education domain, traditional student placements have tended to adopt one-on-one tutoring, role modelling, and mentoring of students by field instructors (Ayala et al., 2018; Bogo, 2010). While this remains one of the most prominent models of social work field education, many authors have commented on how the traditional model is becoming increasingly difficult to sustain (Ayala & Drolet, 2014; Ayala et al., 2018; Bellinger, 2010; Drolet, 2020).

Considering these observations, Ayala and colleagues (2018) interviewed Canadian field coordinators about addressing existing challenges within social work field education. They compiled several promising and wise practices that potentially could be used to move the state of social work field education from crisis to sustainability. These practices included designing and implementing alternative placement models and enhancing training of field instructors and faculty liaisons. In considering these promising practices suggested by Ayala and colleagues, the Transforming the Field Education Landscape (TFEL) partnership initiated the development of a field instructor training module for the Canadian Association for Social Work Education (CASWE). The training module focuses on creating strategies to assist field instructors in integrating practice research into social work field education. In combination with attempting to give field instructors the tools and knowledge needed to incorporate practice research within their workplace context, the module also seeks to bridge the gap between research and practice, and to demonstrate how these can be understood as parallel processes. Drawing from a literature review and consultations with TFEL members, the module has been developed further into an online course for students, social work educators and researchers, and other social work professionals. The online course is designed to share knowledge and understanding of practice research in order to strengthen the role of practice research in social work field education.

Transforming the Field Education Landscape (TFEL) Partnership

To date, there has been a disconnect between research and social work practice (Drolet & Harriman, 2020). Traditional formal research approaches are often viewed as being inapplicable to social work practice and inaccessible to social work students and practitioners (Driessens et al., 2011; Fook et al., 2011; Shannon, 2013). There is a need to rethink traditional approaches to teaching research in order to provide practical, hands-on learning opportunities that engage students (Benson & Blackman, 2003; Freymond et al., 2014; Trevisan, 2002). The Transforming the Field Education Landscape (TFEL) partnership aims to better prepare the next generation of social workers in Canada by creating training and mentoring opportunities for students, developing and mobilizing innovative and promising field education practices, and improving the integration of research and practice in field education (Drolet, 2020). This article is based on the findings of a literature review conducted on practice research and social work field education. The findings of the review informed the development of a practice research training module that was designed to better integrate practice research in social work field education. By creating multiple online training modules on practice research for diverse audiences including field instructors, students, social work educators, and researchers, the TFEL partnership aims to build capacity for bridging the gap between research and practice in field education. We hope that this article will evoke a sense of curiosity about practice research and provide information on how it can be integrated into social work field education.

Bringing Practice Research and Social Work Field Education Together

In recent years, many social work scholars have described how the profession requires incorporating a greater understanding of research into social work practice (Teater, 2017). Despite good intentions and the pursuit of the “social good,” social work can often lack accurate measurements of effective practice (Cabassa, 2016). Considering these shortcomings, social work must develop more intervention research that emerges from practice in order to maintain competency and proficiency (Fortune et al., 2010). Researchers and practitioners alike recognize that, if the profession is to move forward, social work research can no longer remain in the profession’s background; it must be incorporated into practice in an inseparable and interdependent manner (Webber & Carr, 2015, pp. 3–21). Often, research and practice are viewed as distinct and separate elements of the profession. Yet it is becoming increasingly evident that research and practice are both more effective when they function as collaborative processes.

In response to the need to incorporate research more effectively into social work practice, some educational institutions, workplaces, and practitioners have utilized a collaborative research process termed “practice research” to bridge the gap between research and practice. As new social work practitioners are educated and prepared for professional practice, it is of utmost importance that they are prepared to carry out their duties to the best of their abilities, generate new knowledge, and enhance practice strategies. By utilizing practice research in social work field education, social work students and practitioners will be better prepared to harness curiosity and research collaboratively to improve practice. We believe the profession of social work, through integrating practice research, will advance research-informed practice and practice-informed research.

For many social work practitioners and students, the thought of research can elicit feelings of anxiety, dread, and confusion (Wahler, 2019). However, when research is viewed as a regular part of daily practice, feelings of comfort and excitement about new opportunities can emerge. When closely examined, research and practice strategies can be considered parallel processes. Within both traditional practice and research settings, questions are raised, answers are pursued, and knowledge is discovered and developed. These parallel processes of research and practice have been expanded upon in detail through the development of the “Research As Daily Practice” philosophy created by Sally St. George, Dan Wulff, and Karl Tomm (St. George et al., 2015).

“Research As Daily Practice”

For many years, social work clinicians and educators Dan Wulff and Sally St. George have been advocating the notion of “Research As Daily Practice” (St. George et al., 2015). Their work has identified how research and clinical practice overlap and share similar steps, procedures, and strategies. The core of their philosophy is the belief that “practitioners are researchers because they use inquiring processes to make quality decisions in their daily practice (gathering data, organizing the data to better explain phenomena, constructing a plan of action, implementing the plan, observing the effects, gathering more data)” (St. George et al., 2015, p. 5). When social work practitioners perceive themselves as active investigators and knowledge developers in practice, research becomes the tool to improve practice and push the profession forward. We believe that adopting a research-informed approach to practice, such as viewing research as a part of daily practice, is crucial for effective practice and practice research utilization within social work field education.

Defining Practice Research

The term “practice research” is used across disciplines to describe the negotiated process through which practice (service providers and service users) and research (researchers and educators) work collaboratively to understand and address challenges, develop new knowledge on a subject, and close existing gaps between research and practice (Fisher et al., 2016). At the core of practice research is a focus on learning, sharing, and seeking to understand individual, group, and organizational environments (Joubert & Webber, 2020). While the methods through which practice research can be carried out vary, effective practice research involves pursuing curiosity about practice through collaborative means (Austin & Isokuortti, 2016). These elements of curiosity and collaboration are essential to both practice research and field education. Curiosity can be a catalyst for seeking and developing promising strategies to meet practice needs. When multiple parties with different perspectives harness their curiosity, a critical examination of practice can occur, leading to the “development of new ideas in the light of experience” (Austin & Isokuortti, 2016, p. 11).

While curiosity is a cornerstone of effective practice research, it is through collaboration that practice research becomes most valuable. Practice research is often conducted through collaborative partnerships among a range of stakeholders to enhance understanding of practice issues and enrich practice strategies (Fouché, 2016). At a fundamental level, practice research is the process through which ideas, problems, or concepts found in practice are examined through a research undertaking. When utilizing practice research, collaborators can pursue knowledge through informal or formal research designs, with the central premise that the research generates knowledge to improve practice.

Practice Research Operationalized

In seeking to implement practice research in a social work setting, it is important to note that there is not one overarching methodology or formula that will work in every context. Instead, effective practice research methods tend to emerge organically through researchers’ and practitioners’ strategic collaborations based on their practice context and research questions. Effective methods are typically based on a research question that emerges from a practice issue. These methods must be conducted in a mutually beneficial process for all the collaborators involved in the practice research setting.

For those seeking more direction in designing a practice research study, Miller (2019) outlines a process in which practitioners and researchers (“enablers”) work together to compile an effective agency report that follows a rigorous methodological approach. In this approach, the enabler and the practitioners work collaboratively to identify an issue to investigate, discuss the issue, and narrow down a specific research question that meets the needs of the practice agency. The enabler supports the practitioner in the data collection process and usually takes the lead in data analysis, with direction from the practitioners, to ensure that the data accurately address the research question. The results typically culminate in some change to practice in the agency (Miller, 2019, pp. 682–683).

A strength of Miller’s (2019) example is that the author presents the researcher as an enabler and enhancer of practice. While suggesting roles within the research process such as “enabler” and “enhancer” can help highlight where students may be able to utilize practice research within field education, these labels must be used judiciously so as not to separate the elements of practice and research. Such a separation may create an insider/outsider divide, raising questions about who the “real” researcher is. It is important to remember that practice research holds that inquiry can be conducted at all levels of practice through a methodological research process. Miller’s example shows the importance of viewing the research-practice partnership as a method to increase practice competencies and to discover critical areas for further research. While this approach may be beneficial for some to follow, others may prefer a less formal process that invites cocreating a practice research framework based upon the practice setting and the investigators available.

Practice Research on a Continuum

To assist practitioners and students in implementing practice research in field education, we have outlined the multiple continual stages and central themes that we believe the practice research process can contain. This process is illustrated below in Figure 1.

[Figure 1]

While we believe that these themes and stages can help organize and implement practice research within the field education context, we also find it valuable to see practice research as existing on a continuum. Viewing practice research on a continuum allows for the notion that research is continually growing, evolving, and changing over time. This means that practice researchers can be in multiple stages at any given time, and that the research process can be continually renegotiated and improved to better meet the research and practice objectives. While we have identified multiple themes that students and field instructors may find represent their current process, we propose that curiosity, reflection, and collaboration are three central themes that must be present for effective practice research to be conducted. We are confident that students and field instructors can collaboratively create their own practice research projects using this framework of practice research on a continuum. The framework may help simplify the practice research process, allowing both students and field instructors to identify important research questions and the proper strategies needed for investigation. We also recognize the importance of integrating practice research activities into field learning agreement contracts, integration seminars, assignments, and readings.

Recommendations and Implications for Integrating Practice Research into Social Work Field Education

The curiosity that drives inquiry and investigation forms the foundation of integrating research in field education (St. George et al., 2015). For practice research to be integrated successfully into social work field education, the research must be relevant and based on local practice topics that are meaningful, important, and appropriate for investigation. When integrating practice research into field education, programs must ensure that the research meets the needs of the agency, practitioners, and service users. This calls for attention to caseloads, organizational patterns, and practice outcomes. These patterns can become more evident through discussions at staff meetings and case conferences, highlighting agency needs for further investigation. At this point in the process, field educators and students can commit to collaborating in the investigation of emerging curiosities and challenges. Any practice research agenda must include agency workers and respect practitioners’ time, and can either be planned ahead of time or arise as a spontaneous response to an emerging need. These steps can help incorporate practice research within social work field education.

Curiosity is but one quality motivating practice research. Walsh et al. (2019) found that attitudes towards research, comfort level, the time involved, and interpersonal/relational factors also influence how enthusiastically students approach research. Thus, practice research is more than a cognitive exercise. Practical considerations need to be factored into any agency-based research: the length of the practicum, time availability with the supervisor, and available practical resources. For this reason, it is helpful to think of practice research on a continuum so that all students can benefit from integrating research and theory into their practice.

The impetus for practice research initiatives can start in the curriculum and teaching of the social work research course. Rather than being a distant, esoteric topic, social work research courses can be designed “closer to home,” where they have greater personal significance to students. Research can also be taught to create “habits of the mind” that have relevance in everyday life. Students can develop the discipline to critically discern valid and reliable information rather than accepting whatever opinion catches their attention. Research does not have to be overly formal or out of reach. We need to bring research to the student instead of bringing the student to research. Similarly, research conducted within field education placements provides an opportunity to bring research closer to the field for agencies and field instructors.

A Continuing Need for Collaboration

In keeping with this project, field education can be transformed through joint training initiatives (Drolet & Harriman, 2020), such as partnerships among social work programs, faculty liaisons, field instructors, social work educators, students, service users, and agencies. To achieve this transformation, we need to enhance students’ knowledge and skills and build research capacity through partnered research endeavors (Drolet & Harriman, 2020, p. 4). One way this has been pursued within our current context is through the creation of a practice research training module for the field instructor training hosted by the Canadian Association for Social Work Education (CASWE). This module introduces field instructors to practice research, demonstrates how practice research may be used in field education, and invites field instructors to consider utilizing practice research alongside their students within their practice context. In combination with this field instructor training module, we also created an open-access practice research course for students, educators, field instructors, and other social work professionals interested in learning more about practice research and how it can be used within field education. In building capacity for practice research, there is value in encouraging social work educational programs and field agencies to develop more sustainable field education models and promote multidirectional exchanges of knowledge on innovations and promising practices. As with practice research itself, collaboration is a key element in incorporating practice research into social work field education.

Points of Consideration for Social Work Field Education Programs

As field education programs and placements consider integrating practice research into their contexts, there are several considerations that require examination. These considerations include:

  • incorporating into research courses at least one project involving practice research that echoes a field situation;
  • offering field instructors in-service training and workshops on using practice research in their agencies;
  • clearly demonstrating how research and practice activities are parallel processes;
  • developing strategies to help students and field instructors to become more comfortable and competent in developing and undertaking practice research in field education placements;
  • designing learning contract agreements that incorporate at least one research/practice initiative;
  • facilitating the integration of research and theory into practice; and
  • evaluating how practice research is adopted in the field, what challenges emerge, and how to maximize the use of practice research in field education.

Gaps and Areas of Future Research

While we have been able to identify multiple ways to integrate practice research into social work field education, we also acknowledge that the incorporation of practice research into field education is in the early stages. As we move forward with this initiative, we need to examine the various configurations of field education across Canada and internationally to identify promising practices for incorporating practice research into field education. To accomplish this, we must become aware of the challenges impacting students, field instructors, and agencies, and consider them in our approaches. As research moves forward, there is a need to expand on the suggestions in this article and demonstrate how practice research can be utilized in different types of social work field education placements and contexts.

Acknowledgments

The authors would like to acknowledge the Social Sciences and Humanities Research Council Partnership Grant for providing financial support to the TFEL partnership.

References

Austin, M. J., & Isokuortti, N. (2016). A framework for teaching practice-based research with a focus on service users. Journal of Teaching in Social Work, 36(1), 11–32. https://doi.org/10.1080/08841233.2016.1129931

Ayala, J., & Drolet, J. (2014). Special issue on social work field education. Currents: Scholarship in the Human Services, 13(1), 1–4. https://journalhosting.ucalgary.ca/index.php/currents/article/view/15948

Ayala, J., Drolet, J., Fulton, A., Hewson, J., Letkemann, L., Baynton, M., Elliot, G., Judge-Stasiak, A., Blaug, C., Tetrault, G., & Schweizer, E. (2018). Restructuring social work field education in 21st century Canada. Canadian Social Work Review, 35(2), 45–66.

Bellinger, A. (2010). Studying the landscape: Practice learning from social work reconsidered. Social Work Education: The International Journal, 29(6), 599–615. https://doi.org/10.1080/02615470903508743

Benson, A., & Blackman, D. (2003). Can research methods ever be interesting? Active Learning in Higher Education, 4(1), 39–55. https://doi.org/10.1177%2F1469787403004001004

Bogo, M. (2010). Achieving competence in social work through field education. University of Toronto Press.

Bogo, M., & Sewell, K. M. (2019). Introduction to the special issue on field education of students. Clinical Social Work Journal, 47, 1–4. https://doi.org/10.1007/s10615-018-0696-z

Cabassa, L. J. (2016). Implementation science: Why it matters for the future of social work. Journal of Social Work Education, 52(sup1), S38–S50. https://doi.org/10.1080/10437797.2016.1174648

Council on Social Work Education. (2015). Report of the CSWE summit on field education 2014. CSWE.

Driessens, K., Saurama, E., & Fargion, S. (2011). Research with social workers to improve their social interventions. European Journal of Social Work, 14(1), 71–88. https://doi.org/10.1080/13691457.2010.516629

Drolet, J. (2020). A new partnership: Transforming the field education landscape: Intersections of research and practice in Canadian social work field education. Field Educator, 10(1), 1–18. https://fieldeducator.simmons.edu/article/transforming-the-field-education-landscape/

Drolet, J., & Harriman, K. (2020). A conversation on a new Canadian social work field education and research collaboration initiative. Field Educator, 10(1), 1–7. https://fieldeducator.simmons.edu/article/a-conversation-on-a-new-canadian-social-work-field-education-and-research-collaboration-initiative

Fisher, M., Austin, M. J., Julkunen, I., Sim, T., Uggerhøj, L., & Isokuortti, N. (2016). Practice research. Oxford Bibliographies Online Datasets. https://doi.org/10.1093/obo/9780195389678-0232

Fook, J., Johannessen, A., & Psoinos, M. (2011). Partnership in practice research: A Norwegian experience. Social Work & Society, 9(1), 29–44.

Fortune, A. E., McCallion, P., & Briar-Lawson, K. (2010). Social work practice research for the twenty-first century. Columbia University Press.

Fouché, C. (2016). Practice research partnerships in social work: Making a difference. Aotearoa New Zealand Social Work, 28(4), 118–119. https://doi.org/10.11157/anzswj-vol28iss4id291

Freymond, N., Morgenshtern, M., Duffie, M., Hong, L., Bugeja-Freitas, S., & Eulenberg, J. (2014). Mapping MSW research training. Journal of Teaching in Social Work, 34(3), 248–268. https://doi.org/10.1080/08841233.2014.912998

Joubert, L. B., & Webber, M. (2020). The Routledge handbook of social work practice research. Routledge.

Miller, K. (2019). Practice research enabler: Enabling research in a social work practice context. Qualitative Social Work, 18(4), 677–692. https://doi.org/10.1177/1473325017751038

Shannon, P. (2013). Value-based social work research: Strategies for connecting research to the mission of social work. Critical Social Work, 14(1). https://doi.org/10.22329/csw.v14i1.5875

St. George, S., Wulff, D., & Tomm, K. (2015). Research as daily practice. Journal of Systemic Therapies, 34(2), 3–14. https://doi.org/10.1007/978-3-319-15877-8_890-1

Teater, B. (2017). Social work research and its relevance to practice: “The gap between research and practice continues to be wide.” Journal of Social Service Research, 43(5), 547–565. https://doi.org/10.1080/01488376.2017.1340393

Trevisan, M. S. (2002). Enhancing practical evaluation training through long-term evaluation projects. American Journal of Evaluation, 23(1), 81–92. https://doi.org/10.1016/S1098-2140(01)00163-1

Wahler, E. (2019). Improving student receptivity to research by linking methods to practice skills. Journal of Teaching in Social Work, 39(3), 248–259. https://doi.org/10.1080/08841233.2019.1611693

Walsh, C., Gulbrandsen, C., & Lorenzetti, L. (2019). Research practicum: An experiential model for social work research. SAGE Open, 9(2), 1–11. https://doi.org/10.1177/2158244019841922

Webber, M., & Carr, S. (2015). Applying research evidence in social work practice: Seeing beyond paradigms. In M. Webber (Ed.), Applying research evidence in social work practice. Palgrave Macmillan.

Dillon K. Traber, MSW, University of Calgary
Tara Collins, PhD Candidate, University of Calgary
Julie L. Drolet, PhD, University of Calgary
Diana J. Adamo, MSW Student, University of Calgary
Monica Franco, MSW Student, University of Calgary
Kristina M. Laban, MSW Student, University of Calgary
Sheri M. McConnell, PhD, Memorial University of Newfoundland
Ellen Mi, BSW Student, University of Calgary
Sally St. George, PhD, University of Calgary
Dan Wulff, PhD, University of Calgary

© May 2021

Volume 11.1 | Spring 2021

The Field Educator is published by the Simmons University School of Social Work.
