
Outcomes for Implementation Science: an Enhanced Systematic Review of Instruments Using Evidence-based Rating Criteria

Overview
Journal Implement Sci
Publisher Biomed Central
Specialty Health Services
Date 2015 Nov 6
PMID 26537706
Citations 179
Abstract

Background: High-quality measurement is critical to advancing knowledge in any field. New fields, such as implementation science, are often beset with measurement gaps and poor-quality instruments, a weakness that can be more easily addressed in light of systematic review findings. Although several reviews of quantitative instruments used in implementation science have been published, no studies have focused on instruments that measure implementation outcomes. Proctor and colleagues established a core set of implementation outcomes, including acceptability, adoption, appropriateness, cost, feasibility, fidelity, penetration, and sustainability (Adm Policy Ment Health Ment Health Serv Res 36:24-34, 2009). The Society for Implementation Research Collaboration (SIRC) Instrument Review Project employed an enhanced systematic review methodology (Implement Sci 2: 2015) to identify quantitative instruments of implementation outcomes relevant to mental or behavioral health settings.

Methods: Full details of the enhanced systematic review methodology are available (Implement Sci 2: 2015). To increase the feasibility of the review, and consistent with the scope of SIRC, only instruments that were applicable to mental or behavioral health were included. The review, synthesis, and evaluation included the following: (1) a search protocol for the literature review of constructs; (2) the literature review of instruments using Web of Science and PsycINFO; and (3) data extraction and instrument quality ratings to inform knowledge synthesis. Our evidence-based assessment rating criteria quantified fundamental psychometric properties as well as a crude measure of usability. Two independent raters applied the evidence-based assessment rating criteria to each instrument to generate a quality profile.
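To make the rating step concrete, the minimal Python sketch below shows one way two independent raters' criterion scores could be combined into an instrument quality profile. It is an illustration only, not the SIRC scoring procedure: the criterion names beyond responsiveness, predictive validity, and usability, the 0-4 scale, and the averaging rule are all assumptions.

# Illustrative sketch only -- not the SIRC scoring code. Criterion names other
# than responsiveness, predictive validity, and usability are placeholders,
# and the 0-4 scale and averaging rule are assumptions.
from statistics import mean

CRITERIA = [
    "internal_consistency",   # placeholder psychometric property
    "structural_validity",    # placeholder psychometric property
    "norms",                  # placeholder psychometric property
    "responsiveness",         # named in the abstract
    "predictive_validity",    # named in the abstract
    "usability",              # named in the abstract
]

def quality_profile(rater_a, rater_b, minimal_evidence=1):
    """Average two raters' scores per criterion and flag whether the
    instrument shows at least minimal evidence on every criterion."""
    profile = {c: mean((rater_a.get(c, 0), rater_b.get(c, 0))) for c in CRITERIA}
    profile["meets_all_criteria"] = all(profile[c] >= minimal_evidence for c in CRITERIA)
    return profile

# Hypothetical instrument rated independently by two reviewers.
rater_a = {"internal_consistency": 3, "structural_validity": 2, "norms": 1,
           "responsiveness": 0, "predictive_validity": 0, "usability": 4}
rater_b = {"internal_consistency": 4, "structural_validity": 2, "norms": 1,
           "responsiveness": 0, "predictive_validity": 1, "usability": 3}
print(quality_profile(rater_a, rater_b))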

Results: We identified 104 instruments across the eight constructs: nearly half (n = 50) assessed acceptability, 19 assessed adoption, and each of the remaining implementation outcomes had fewer than 10 instruments. Only one instrument demonstrated at least minimal evidence for psychometric strength on all six of the evidence-based assessment criteria. The majority of instruments had no information regarding responsiveness or predictive validity.

Conclusions: Implementation outcomes instrumentation is underdeveloped with respect to both the sheer number of available instruments and the psychometric quality of existing instruments. Until psychometric strength is established, the field will struggle to identify which implementation strategies work best, for which organizations, and under what conditions.

Citing Articles

Assessing the readiness and feasibility to implement a model of care for spine disorders and related disability in Cross Lake, an Indigenous community in northern Manitoba, Canada: a research protocol.

Bussieres A, Passmore S, Kopansky-Giles D, Tavares P, Ward J, Ladwig J. Chiropr Man Therap. 2025; 33(1):12.

PMID: 40082975 PMC: 11908001. DOI: 10.1186/s12998-025-00576-1.


Development of an instrument (Cost-IS) to estimate costs of implementation strategies for digital health solutions: a modified e-Delphi study.

Donovan T, Abell B, McPhail S, Carter H. Implement Sci. 2025; 20(1):13.

PMID: 40055802 PMC: 11889902. DOI: 10.1186/s13012-025-01423-w.


Measuring Implementation Outcomes Change Over Time Using an Adapted Checklist for Assessing Readiness to Implement (CARI).

Bourdeau B, Guze M, Rebchook G, Shade S, Psihopaidas D, Chavis N. AIDS Behav. 2025.

PMID: 39899228 DOI: 10.1007/s10461-025-04614-0.


Addressing the "Last Mile" Problem in Educational Research: Educational Researchers' Interest, Knowledge, and Use of Implementation Science Constructs.

Gaias L, Cook C, Brewer S, Bruns E, Lyon A. Educ Res Eval. 2025; 28(7-8):205-233.

PMID: 39845578 PMC: 11753797. DOI: 10.1080/13803611.2023.2285440.


Implementation research logic model in the design and execution of eHealth innovations for maternal and newborn healthcare in Ethiopia.

Nigatu D, Azage M, Misgan E, Enquobahrie D, Kebebaw T, Abate E. Health Res Policy Syst. 2025; 23(1):4.

PMID: 39762955 PMC: 11702162. DOI: 10.1186/s12961-024-01259-8.

