Implementation strategies: recommendations for specifying and reporting

Abstract

Implementation strategies have unparalleled importance in implementation science, as they constitute the ‘how to’ component of changing healthcare practice. Yet implementation researchers and other stakeholders are not able to fully utilize the findings of studies focusing on implementation strategies because those strategies are often inconsistently labelled and poorly described, are rarely justified theoretically, lack operational definitions or manuals to guide their use, and are part of ‘packaged’ approaches whose specific elements are poorly understood. We address the challenges of specifying and reporting implementation strategies encountered by researchers who design, conduct, and report research on implementation strategies. Specifically, we propose guidelines for naming, defining, and operationalizing implementation strategies in terms of seven dimensions: the actor, the action, the action target, temporality, dose, the implementation outcome(s) addressed, and the justification. Ultimately, implementation strategies cannot be used in practice or tested in research without a full description of their components and how they should be used. As with all intervention research, their descriptions must be precise enough to enable measurement and ‘reproducibility.’ We propose these recommendations to improve the reporting of implementation strategies in research studies and to stimulate further identification of elements that should be included in reporting guidelines for implementation strategies.

The need for better specification and reporting of implementation strategies

Implementation strategies have unparalleled importance in implementation science, as they constitute the ‘how to’ component of changing healthcare practice. Comprising the specific means or methods for adopting and sustaining interventions[1], implementation strategies are recognized as necessary for realizing the public health benefits of evidence-based care[2]. Accordingly, developing strategies to overcome barriers and increase the pace and effectiveness of implementation is a high research priority[3–7].

While the evidence for particular implementation strategies is increasing[8], limitations in their specification pose serious problems that thwart their testing and hence the development of an evidence-base for their efficiency, cost, and effectiveness. Implementation strategies are often inconsistently labelled and poorly described[9], are rarely justified theoretically[10, 11], lack operational definitions or manuals to guide their use, and are part of ‘packaged’ approaches whose specific elements are poorly understood[12]. The literature on implementation has been characterized as a ‘Tower of Babel’[13], which makes it difficult to search for empirical studies of implementation strategies, and to compare the effects of different implementation strategies through meta-analyses[9]. Worse yet, the lack of clarity and depth in the description of implementation strategies within the published literature precludes replication in both research and practice. As with all intervention research, implementation strategies need to be fully and precisely described, in detail sufficient to enable measurement and ‘reproducibility’[14] of their components.

The purpose of this article is to provide guidance to researchers who are designing, conducting, and reporting studies by proposing specific standards for characterizing implementation strategies in sufficient detail. We begin by providing a brief introduction to implementation strategies, including how the broad term has been defined as well as some examples of implementation strategies. Thereafter we suggest an extension of existing reporting guidelines that provides direction to researchers with regard to naming, clearly describing, and operationalizing implementation strategies.

Definitions and examples of implementation strategies

Implementation strategies can be defined as methods or techniques used to enhance the adoption, implementation, and sustainability of a clinical program or practice[15]. A growing literature on implementation strategies provides a window into their type, range, and nature. They include ‘top down/bottom up,’ ‘push/pull,’ and ‘carrot/stick’ tactics, and typically involve ‘package’ approaches[16]. They include methods for provider training and decision support; intervention-specific tool kits, checklists, and algorithms; formal practice protocols and guidelines; learning collaboratives, business strategies and organizational interventions from management science (e.g., plan-do-study-act cycles[17] and ‘lean thinking’[18]); and economic, fiscal, and regulatory strategies.

The complexity of implementation strategies can vary widely. For instance, some implementation efforts may involve a single component strategy, such as disseminating treatment guidelines in the hope of changing clinicians’ behavior (e.g., Azocar et al.[19]). These strategies have been referred to as discrete strategies in the literature[20, 21], though they have also been called ‘implementation interventions’[22], ‘actions’[23], and ‘specified activities’[23]. A number of publications provide lists and taxonomies that attempt to reflect the range of these strategies[20, 24–26]. For example, Powell et al.[20] compiled a ‘menu’ of 68 implementation strategies, grouped by six key processes: planning (e.g., conducting a local needs assessment, developing a formal implementation plan), educating (e.g., conducting educational meetings, distributing educational materials), financing (e.g., altering incentive/allowance structures, accessing new funding), restructuring (e.g., revising professional roles), managing quality (e.g., providing clinical supervision, audit and feedback, and reminders), and attending to policy context (e.g., creating or changing credentialing and/or licensure requirements)[20]. Michie et al.[26] focused on a more granular level in their published taxonomy of 93 behavior change techniques (e.g., punishment, prompts/cues, material reward, habit formation), many of which could be used to further specify implementation strategies as well.

Most often, a number of strategies are combined to form a multifaceted strategy, such as training, consultation, and audit and feedback. There are also a number of manualized and branded multifaceted implementation strategies, such as the ‘ARC’ organizational implementation strategy[27, 28], the Institute for Healthcare Improvement’s learning collaborative[29] and Framework for Spread[30], the Getting to Outcomes framework[31], and the Replicating Effective Programs (REP) framework[32, 33]. The REP framework, for instance, includes a number of discrete or component implementation strategies across four phases: pre-conditions (e.g., identifying need, identifying barriers), pre-implementation (e.g., developing a community working group), implementation (e.g., training, technical assistance, feedback and refinement), and maintenance and evolution (e.g., re-customizing delivery as needs arise)[33]. Some authors have simply used the term ‘implementation strategy’ to refer to these multifaceted implementation strategies composed of multiple ‘implementation interventions’[15], whereas others have used ‘implementation program’ to encompass all of the component implementation strategies utilized in an implementation effort[34].

We have chosen to use the term ‘implementation strategy’ to be inclusive of both single-component and multifaceted implementation strategies, and we purposefully avoid the word ‘intervention,’ largely to reduce the chance that clinical interventions and implementation interventions will be confused[23]. That said, we acknowledge that some interventions can be used either as implementation strategies or as interventions in their own right. For instance, the ‘ARC’ intervention[28, 35] was designed as an organizational improvement strategy (i.e., not necessarily as a method of implementing other clinical interventions). A randomized trial of ARC as a ‘standalone’ intervention has shown it to be effective in improving organizational culture, climate, and work attitudes as well as clinical outcomes for youth[27, 36]. However, it has also been used as a strategy to implement a psychosocial intervention (Multisystemic Therapy)[28]. In cases where a strategy may be conceptualized as an improvement intervention in its own right (i.e., independent of the clinical intervention being implemented), it may be useful to employ a 2 x 2 factorial design, in which the implementation strategy and the clinical intervention are compared independently and in combination with a no-treatment control. The complexity of even making the distinction between an implementation strategy and an independent intervention highlights the importance of carefully specifying the strategy in the manner we describe below, so as to ensure that consumers of the resulting research understand how, when, why, and where the strategy is likely to be effective.

As evidenced by many of the examples above, interest has been high and progress has been made in the identification, development, and testing of implementation strategies. However, definitions and descriptions of implementation strategies in the literature often lack the clarity required to interpret study results and build upon the knowledge gained through the replication and extension of the research. This signals the need for more guidance that would assist researchers designing, conducting, and reporting implementation studies.

Prerequisites to studying implementation strategies empirically

The study of implementation strategies should be approached in a fashion similar to the study of evidence-based interventions (EBIs), for strategies are in fact a type of intervention. Accordingly, their specification carries the same demands as treatment specification: if they are to be scientifically tested, communicated clearly in the literature, and accurately employed in actual healthcare practice, they must be specified both conceptually and operationally[37]. There are a number of prerequisites to the measurement of implementation strategies, many of which are detailed below. They are also listed in Table 1, along with examples, resources, or tools from the literature (when available) for advancing the state of measurement.

Table 1 Prerequisites to measuring implementation strategies

The complexity of implementation strategies poses one of the greatest challenges to their clear description, operational definition, and measurement. Implementation strategies are inherently complex social interventions, as they address multifaceted and complicated processes within interpersonal, organizational, and community contexts[12, 56–58]. Implementation strategies must be capable of dealing with the contingencies of various service systems, sectors of care, and practice settings, as well as the human capital challenge of staff training and support. They must tackle a myriad of barriers to evidence-based care[59, 60] and the various properties of interventions that make them more or less amenable to implementation[52]. All of these factors contribute significantly to the challenge of measuring, testing, and effectively employing implementation strategies in actual healthcare practice. We attempt to provide such guidance by discussing fundamental principles for naming, defining, and specifying implementation strategies, all of which are prerequisites to studying them empirically.

  1. Name it

    To be measured, an implementation strategy must first be named or labelled. While this may seem simplistic or self-evident, Gerring[61] draws our attention to three problems that ‘…plague the social science lexicon: homonymy (multiple meanings for the same term), synonymy (different terms with the same, or overlapping, meanings), and instability (unpredictable changes in the foregoing).’ Certainly, these problems are evident within the dissemination and implementation science literature[13, 62–64], and this makes it difficult to search the empirical literature, conduct meta-analyses, and ultimately, to build a body of evidence that supports the use of specific strategies in particular contexts[9, 64]. For example, Brouwers et al.[2] found their review of studies of implementation strategies for cancer screening programs hampered by the inconsistent labelling of strategies and other specification issues related to the description and justification of selected strategies.

    Given the confusion caused by poorly labelled implementation strategies and the call for the harmonization of terminology, constructs, and measures in implementation science[62], implementation stakeholders should be thoughtful as they name implementation strategies, preferably drawing upon the same terms as other researchers in the field when possible. A number of sources that have compiled implementation strategies may be helpful in identifying potentially appropriate names[20, 24, 25]. When different terms are used (or created), they should be carefully distinguished from strategies that are already more established in the literature. It should be noted that naming may be more complicated with multifaceted and blended strategies[20] that contain a wide variety of discrete implementation strategies. In these cases, every effort should be made to specify the discrete or component parts of the implementation strategy. For example, Forsner et al.[65] described a number of components to a multifaceted implementation strategy to support the implementation of clinical guidelines, including the formation of local implementation teams, the development of implementation plans, documentation of quality indicators, academic outreach detailing, etc.
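
    As a purely illustrative aid (not part of the original recommendations), the short Python sketch below shows one way a research team might keep its strategy labels consistent with established terminology: locally used labels are looked up in a small mapping to canonical terms drawn from published compilations, and anything unmatched is flagged for explicit definition. The specific synonym pairs are assumptions made for this example, not an authoritative crosswalk.

      # Illustrative sketch: normalize locally used strategy labels to terms already
      # established in the literature (e.g., in published compilations or taxonomies).
      # The synonym pairs below are assumptions for the example, not a validated mapping.
      CANONICAL_TERMS = {
          "academic detailing": "educational outreach visits",
          "practice coaching": "external facilitation",
          "audit and feedback": "audit and feedback",  # already an established term
      }

      def canonical_label(reported_label: str) -> str:
          """Return the established term for a reported label, or flag it as new."""
          key = reported_label.strip().lower()
          if key in CANONICAL_TERMS:
              return CANONICAL_TERMS[key]
          return f"NEW TERM: '{reported_label}' (define it and distinguish it from existing strategies)"

      print(canonical_label("Academic detailing"))    # educational outreach visits
      print(canonical_label("peer-to-peer huddles"))  # NEW TERM: ...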

  2. Define it

    A second step is to define the implementation strategy conceptually. This is distinct from the operationalization of the strategy, which will be addressed below. For example, audit and feedback can be defined conceptually as ‘any summary of clinical performance of health care over a specified period of time’ that can be provided in a written, electronic, or verbal format[66]. A conceptual definition gives a general sense of what the strategy may involve, and allows the reader to more fully discern whether or not the current usage is consistent with other uses of the term represented in the literature. Defining more complex multifaceted and/or blended implementation strategies also requires that each of the discrete strategies or components be distinguished and conceptually defined. Many of the existing taxonomies[20, 24, 25] provide conceptual definitions that can prove helpful in generating a better understanding of implementation strategies. Indeed, both naming and defining implementation strategies conceptually makes it possible to distinguish one strategy from another. Yet this is not sufficient for full specification. For instance, while the strategy audit and feedback may have a commonly recognized name and definition, it can be delivered in a multitude of ways in actual practice. Eccles et al.[67] describe five modifiable elements of audit and feedback that alone produce 288 potential forms of audit and feedback, and Hysong[68] has produced a meta-analysis documenting how different features of audit and feedback affect its effectiveness. Much of what follows regarding specification is intended to advance better operationalization and contextualization of strategy use, thereby propelling the field toward a greater understanding of not just what strategies are effective, but how and why they are effective in different contexts[57].
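
    The figure of 288 follows from simple combinatorics: if each of the five modifiable elements can take a fixed number of forms, the number of distinct configurations is the product of those counts. The per-element counts are not enumerated here, so the concrete split in the illustration below is hypothetical; only the total of 288 comes from the cited work.

      \[
        N_{\text{forms}} = \prod_{i=1}^{5} k_i ,
        \qquad \text{e.g., } 2 \times 3 \times 4 \times 4 \times 3 = 288 .
      \]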

  3. Operationalize it

    Strategies must be described clearly and in a manner that ensures they are discussed at a common level of granularity, are rateable across multiple dimensions, and are readily comparable. In short, they must be defined operationally. This will make implementation strategies more comparable and evaluable, and ultimately make it easier for researchers and other implementation stakeholders to make decisions about which implementation strategies will be most appropriate for their purposes. It will also go a long way toward ensuring that strategies are enacted in the manner intended (i.e., with fidelity). As with clinical interventions, assessing the fidelity of implementation strategy delivery enables a clear test of effectiveness by showing whether or not the strategy was delivered as intended. Without such assessments, it is difficult to determine whether the effectiveness (or lack thereof) of a given strategy can be attributed to the strategy itself or to other contextual factors. An example from another field (human resource management) highlights the utility of carefully operationalizing complex processes. Functional Job Analysis suggests that the description of any given task should specify: a) who, b) performs what action, c) drawing on what knowledge, d) relying on what skills, e) using what materials or tools, f) in order to achieve what outcome[69].

    In a similar fashion, we propose seven dimensions that, if detailed adequately, would constitute full operationalization of an implementation strategy: a) the actor(s), i.e., who delivers the strategy; b) the action(s); c) the target(s) of the action, i.e., toward what or whom, and at what level; d) temporality, i.e., when or at what phase; e) dose, i.e., at what frequency and intensity; f) the implementation outcome(s) affected; and g) the justification, i.e., the theoretical, empirical, or pragmatic rationale for the strategy. In the following sections, we address each of these dimensions. We provide an illustration of how these dimensions can be specified in Table 2, using two implementation strategies as examples (‘clinical supervision’ and ‘clinician implementation teams’).
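
    To make the seven dimensions concrete, the sketch below (a minimal illustration in Python, not part of the original guidance) shows how they could be captured as a structured, machine-readable record that accompanies a study report or protocol. The field values for the clinical supervision example are invented for illustration and are not reproduced from Table 2.

      from dataclasses import dataclass, field
      from typing import List

      @dataclass
      class StrategySpecification:
          """One implementation strategy, specified along the seven proposed dimensions
          (name and conceptual definition are included for completeness)."""
          name: str
          conceptual_definition: str
          actor: str                      # who delivers the strategy
          action: str                     # the specific actions, steps, or processes
          action_target: str              # toward what or whom, and at what level
          temporality: str                # when, or at what implementation phase
          dose: str                       # frequency and intensity
          implementation_outcomes: List[str] = field(default_factory=list)
          justification: str = ""         # theoretical, empirical, or pragmatic rationale

      # Illustrative values only; they are assumptions for this sketch.
      clinical_supervision = StrategySpecification(
          name="Clinical supervision",
          conceptual_definition="Ongoing expert oversight of clinicians' delivery of a new practice",
          actor="Agency supervisor trained in the clinical innovation",
          action="Review active cases and give structured feedback on adherence to the protocol",
          action_target="Front-line clinicians' knowledge, skill, and fidelity",
          temporality="Active implementation phase, after initial training",
          dose="One hour per clinician per week",
          implementation_outcomes=["fidelity", "sustainability"],
          justification="Empirical evidence that training alone rarely changes practice",
      )

    A record of this kind can be completed once per strategy and published alongside the kind of protocol or manual recommended later in this article.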

  a) The actor

    We define ‘actor’ as a stakeholder who actually delivers the implementation strategy. A wide range of stakeholders can fill this function, as implementation strategies may be employed or enacted by payers, administrators, intervention developers, outside consultants, personnel within an organization charged with being ‘implementers,’ providers/clinicians/support staff, clients/patients/consumers, or community stakeholders. Some strategies could, arguably, be employed only by certain actors. For example, changing reimbursement levels is inherently a ‘payer’ or ‘regulator’ action. Yet other strategies, such as training, could be employed by treatment disseminators external to the organization or supervisors within the organization. Whether certain types of stakeholders are more effective than others in delivering particular strategies is an empirical question; however, there is some theoretical and empirical precedent for relying upon individuals who have more credibility with those whose behavior is expected to change (e.g., the literature on opinion leaders[48, 72, 73]). Those who report, disseminate, and describe implementation strategies should report details on who enacted the strategy. This will help pave the way for important research on the effect of the ‘actor’ on such outcomes as the strategy’s acceptability to providers, sustainability and implementation costs, and the ultimate effectiveness of the implementation effort.

  b) The action

    Implementation strategies require dynamic verb statements that indicate actions, steps or processes, and sequences of behavior. Ideally, these actions are behaviourally defined a priori to allow comparison with what was actually done during the implementation process. Good examples include strategies such as plan-do-study-act (PDSA) cycles[74] and audit and feedback[66], wherein the very name indicates the actions involved and the definitions expand upon the actions to be taken.

  c) Action target

    The complexity of implementation strategies is also a function of where they are directed or the conceptual ‘targets’ they attempt to impact. For example, strategies such as ‘realigning payment incentives’ target the policy context, while ‘training’ targets front line providers by increasing knowledge and skill, and ‘fidelity checklists’ target the clarity of the intervention as well as the providers’ understanding and ability to break down the intervention into more ‘doable’ steps.

    A number of theories, conceptual models, and frameworks point to important ‘targets,’ and most emphasize that implementation strategies may be needed to address multiple ‘targets.’ Rogers’ diffusion of innovations theory, for example, identifies several different targets of implementation efforts related to the innovation itself (e.g., making the EBI more acceptable or seem more ‘doable’), the adopter (e.g., working to make individuals more accepting of innovation), the system adopting the innovation, and the diffusion system[73]. Other models have followed suit in emphasizing the multi-level nature of implementation. For instance, Shortell[75] advances a model with four hierarchical levels involved in any implementation of evidence-based care: the top level, or policy context; two middle levels, the organization and the group or team; and the bottom level, individual behavior in implementation. The Consolidated Framework for Implementation Research (CFIR)[42], which extends Greenhalgh et al.’s[76] seminal model, includes intervention characteristics (e.g., evidence, adaptability, cost), outer setting (e.g., policies and incentives), inner setting (e.g., structural characteristics of the organization, organizational culture, implementation climate), characteristics of individuals (e.g., self-efficacy), and the process of implementation (e.g., planning, engaging, executing, and reflecting). A recently published checklist for identifying the determinants of practice includes guideline factors; individual health professional factors; patient factors; professional interactions; incentives and resources; capacity for organizational change; and social, political, and legal factors[43]. When the target is an individual, the recently revised Theoretical Domains Framework[44] includes a number of potential targets, such as an individual’s knowledge; skills; roles; optimism; beliefs about consequences; intentions; goals; memory, attention, and decision processes; social influences; emotions; and behavioural regulation. In fact, the multi-level nature of implementation is reflected in the vast majority of pertinent conceptual models. A review of 61 conceptual models pertinent to dissemination and implementation research found that 98% of the included models addressed more than one of the five ‘socioecological levels’ specified: the system, community, organization, individual, and policy levels[41].

    Yet too rarely are the specific targets of implementation strategies clearly stated. Specifying the target is necessary because it helps focus the use of the strategy and suggests where and how outcomes should be measured. This is particularly important when reporting complex multifaceted implementation strategies, and the notion here is to be as specific as possible and to rely upon existing conceptual models and frameworks to identify relevant targets.

  d) Temporality

    The order or sequence of strategy use may be critical in some cases. For instance, Lyon et al.[77] suggest that strategies to boost providers’ motivation to learn new treatments may need to precede other common implementation strategies such as training and supervision. Several ‘branded’ multifaceted implementation strategies, such as the ARC organizational implementation strategy[27, 28, 35], the Replicating Effective Programs (REP) framework[32, 33], and the Getting to Outcomes framework[31], also lend support to the potential importance of temporality by suggesting specific sequences for the application of component implementation strategies across implementation stages.

    The phased nature of implementation is also highlighted in several theories, conceptual models, and frameworks. Fixsen et al.[23] suggest six stages of implementation: exploration and adoption, program installation, initial implementation, full operation, innovation, and sustainability. More recently, Damschroder et al.[42] distinguished four processes: planning, engaging, executing, and reflecting/evaluating. In a conceptual model of implementation in public service sectors, Aarons et al.[78] also note four phases of implementation: exploration, adoption decision, active implementation, and sustainment. Accordingly, implementation strategies may vary in appropriateness and effectiveness across such phases. For example, the strategies needed in the planning stage of implementing interventions may differ from the strategies required to sustain them once they have been successfully implemented. In their paper on the Dynamic Adaptation Process, Aarons et al.[79] illustrate strategy variation across three phases: adoption decision/preparation, active implementation, and sustainment.

    Articles that report the use of strategies should include information about the stage or phase when the strategy was used. This should include start and stop dates of strategy use, along with any information about dosage decreasing or increasing over time. Researchers who test strategies need to address the challenges of repeated data collection and analysis. As we come to learn more about the relationships between strategy appropriateness and implementation phases, implications for strategy specification and measurement will become clearer.
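
    As a minimal sketch of this kind of reporting (the phases follow Aarons et al.[79]; the dates, doses, and choice of external facilitation as the example strategy are hypothetical), the snippet below logs when a single strategy was started and stopped in each phase and how its dose changed over time.

      from datetime import date

      # Hypothetical record of one strategy (external facilitation) across phases;
      # all dates and doses are invented for illustration.
      strategy_use_log = [
          {"phase": "adoption decision/preparation", "start": date(2013, 1, 7),
           "stop": date(2013, 3, 29), "dose": "2 hours of facilitation per week"},
          {"phase": "active implementation", "start": date(2013, 4, 1),
           "stop": date(2013, 12, 20), "dose": "1 hour of facilitation per week"},
          {"phase": "sustainment", "start": date(2014, 1, 6),
           "stop": None, "dose": "1 hour of facilitation per month"},  # ongoing
      ]

      for entry in strategy_use_log:
          stop = entry["stop"].isoformat() if entry["stop"] else "ongoing"
          print(f'{entry["phase"]}: {entry["start"].isoformat()} to {stop} ({entry["dose"]})')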

  e) Dose

    Just as the intervention or treatment literature addresses the concept of dose, implementation strategies also can vary tremendously in dosage or intensity. Studies of the effectiveness and comparative effectiveness of implementation strategies should measure dose. This is particularly important, because the field needs to know the minimal dose required to get the strongest effect. Thus, details about the dose or intensity of implementation strategies such as the amount of time spent with an external facilitator[39], the time and intensity of training[80], or the frequency of audit and feedback[81] should be designated a priori and reported.

  f) The implementation outcome affected

    Proctor et al.[47] proposed a taxonomy of eight implementation outcomes (acceptability, adoption, appropriateness, feasibility, fidelity, implementation cost, penetration, and sustainability). Certain strategies may target one or more of these implementation outcomes (or other outcomes not identified in the Proctor et al.[47] taxonomy). For instance, using consensus meetings to decide which treatment to implement may be designed to increase the acceptability of the treatment from the perspective of multiple stakeholders. Training or educational strategies typically target fidelity, while financial and policy strategies likely enhance feasibility and acceptability. More information about implementation outcomes can be found in reviews by Proctor et al.[47, 49, 50], and we direct readers to the dissemination and implementation section of the Grid Enabled Measurement Initiative and the Seattle Implementation Research Collaborative’s measures project for repositories of measures of implementation outcomes[82, 83]. Researchers or practice leaders who develop, design, and test implementation strategies should explicitly state the implementation outcomes targeted by the strategy.
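
    The sketch below (illustrative only) lists the eight outcomes from the Proctor et al.[47] taxonomy and checks that the outcomes named in a strategy description come from that taxonomy; the example strategy-to-outcome mappings simply restate the examples given in this paragraph and are not an exhaustive or validated crosswalk.

      # The eight implementation outcomes from Proctor et al., plus a simple check
      # that a reported strategy names outcomes from that taxonomy. The example
      # mappings restate the text above and are not exhaustive.
      PROCTOR_OUTCOMES = {
          "acceptability", "adoption", "appropriateness", "feasibility",
          "fidelity", "implementation cost", "penetration", "sustainability",
      }

      strategy_outcome_targets = {
          "consensus meetings on treatment selection": ["acceptability"],
          "training and educational strategies": ["fidelity"],
          "financial and policy strategies": ["feasibility", "acceptability"],
      }

      for strategy, targets in strategy_outcome_targets.items():
          unknown = [t for t in targets if t not in PROCTOR_OUTCOMES]
          if unknown:
              print(f"{strategy}: outcomes outside the taxonomy -> {unknown}")
          else:
              print(f"{strategy}: targets {', '.join(targets)}")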

  g) The justification

    Researchers should make efforts to provide the justification or rationale for the strategies they use to implement a given intervention[57, 84]. The selection of implementation strategies may be justified by prospective assessments that identify potential needs, barriers, or facilitators, sometimes termed ‘determinants of practice’[43, 55, 85, 86]. While these determinants of practice could be identified through formal assessment processes, they could also be identified using theory or conceptual models (e.g., [87]), the research literature (e.g., [59, 60, 88–90]), or more informal approaches such as brainstorming (e.g., [55]). Once these determinants of practice are identified, researchers should attempt to provide clear justification for why the particular strategies were selected (i.e., why would they help in overcoming barriers and/or leveraging facilitators?). Ideally, strategies should be selected because relevant theory[52, 67], empirical evidence[8], and/or some pragmatic rationale (e.g., using a low-cost, low-intensity strategy when the theory and evidence for more intensive strategies are not compelling) suggest they may be appropriate to address the specific challenges posed by the implementation context. While the role and importance of theory has been debated[51, 54, 67, 91], providing theoretical justification for the selected implementation strategy can highlight the potential mechanisms by which change is expected to occur, ultimately providing greater insight into how and why the strategies might work. A chosen implementation strategy that cannot be justified theoretically, empirically, and/or pragmatically should be carefully reconsidered.

Table 2 Specification of two implementation strategies

Existing reporting guidelines and suggested extensions

We suggest that journals that routinely publish implementation studies could advance knowledge about strategies by formally adopting reporting guidelines and providing them to authors and reviewers. Applying such guidelines not only to implementation trials but also to articles that focus on the intervention being tested would push for greater detail about implementation processes in treatment effectiveness trials and thus accelerate our understanding of strategies. This point is underscored by the call for ‘hybrid trials’ that advance knowledge about both the treatment and its implementation[15].

Several existing guidelines are relevant. For instance, Implementation Science and several other journals have embraced the WIDER Recommendations[9, 92], which call for authors to provide detailed descriptions of interventions (and implementation strategies) in published papers, to clarify assumed change processes and design principles, to provide access to manuals and protocols that give information about the clinical interventions or implementation strategies, and to give detailed descriptions of active control conditions. The Standards for Quality Improvement Reporting Excellence (SQUIRE) suggest that authors provide, among other things, a description of the intervention (in this case, the implementation strategy) and its component parts in sufficient detail that others could reproduce it, an indication of the main factors that contributed to the choice of the intervention, and the initial plans for how the intervention was to be implemented, including the specific steps to be taken and by whom (i.e., the intended roles, qualifications, and training of staff)[93]. The EQUATOR Network[94] is a repository of reporting guidelines (e.g., CONSORT and STROBE) that can provide guidance for the specific research designs and methodologies utilized in implementation research. However, there remains a need for the development of a suite of reporting guidelines tailored to different types of implementation research[3].

We build upon and extend existing guidelines by recommending two standards as outlined above. First, all studies of implementation should name and define the implementation strategies used. Linguistic harmony in implementation science will be advanced if authors label or describe implementation strategies using terms that already appear in a published review article, a strategy compilation or taxonomy, or another primary research article. If and when unique language is introduced to characterize a strategy, the authors should provide a rationale for the new terminology and should clarify how the new strategy label is similar to or conceptually different from labels already in the literature.

Second, all strategies used should be specified or operationalized. In our view, definition and specification should include each of the seven dimensions outlined above. Ideally, descriptions of implementation strategies should be ‘packaged’ in detailed protocols or manuals describing how a given innovation is to be enacted. These manuals can be considered akin to the kinds of manuals that accompany evidence-based psychotherapies, and could then be published in online supplements and appendices to journal articles.

Adopting these guidelines would address many of the current problems that make it difficult to interpret and use findings from implementation research, such as inconsistent labelling, poor descriptions, and unclear justification for specific implementation strategies[9–11, 13]. Specifically, it would facilitate meta-analysis and replication (in both research and practice), and would increase the comparability of implementation strategies by allowing them to be described in similar ways. It would also help to accelerate our understanding of how, why, when, and where they work, and our translation of those findings to real-world improvements in healthcare. We welcome dialogue regarding additional considerations for reporting research on implementation, and acknowledge room for national or international consensus processes that could formalize and extend the guidelines we present here. In the meantime, we hope that these suggestions provide much needed guidance to those endeavouring to advance our understanding of implementation strategies.

Authors’ information

EKP directs the Center for Mental Health Services Research at Washington University in St. Louis (NIMH P30 MH085979), the Dissemination and Implementation Research Core (DIRC) of the Washington University Institute of Clinical and Translational Sciences (NCRR UL1RR024992), the Center for Dissemination and Implementation at the Washington University Institute for Public Health, and the Implementation Research Institute (NIMH R25 MH080916).

Abbreviations

CFIR:

Consolidated framework for implementation research

EBIs:

Evidence-based interventions

NIH:

National institutes of health

PDSA:

Plan-do-study-act

SQUIRE:

Standards for quality improvement reporting excellence.

References

  1. Lomas J: Diffusion, dissemination, and implementation: who should do what?. Ann N Y Acad Sci. 1993, 703: 226-237. 10.1111/j.1749-6632.1993.tb26351.x.

  2. Brouwers MC, De Vito C, Bahirathan L, Carol A, Carroll JC, Cotterchio M, Dobbins M, Lent B, Levitt C, Lewis N, McGregor SE, Paszat L, Rand C, Wathen N: What implementation efforts increase cancer screening rates? a systematic review. Implement Sci. 2011, 6: 1-17. 10.1186/1748-5908-6-1.

  3. Eccles MP, Armstrong D, Baker R, Cleary K, Davies H, Davies S, Gasziou P, Ilott I, Kinmonth ALL, Leng G, Logan S, Marteau T, Michie S, Rogers H, Rycroft-Malone J, Sibbald B: An implementation research agenda. Implement Sci. 2009, 4: 1-7. 10.1186/1748-5908-4-1.

  4. Institute of Medicine: Initial national priorities for comparative effectiveness research. 2009, Washington, DC: The National Academies Press

  5. Dissemination and implementation research in health (R01). http://grants.nih.gov/grants/guide/pa-files/PAR-13-055.html

  6. Researching implementation and change while improving quality (R18). http://grants.nih.gov/grants/guide/pa-files/PAR-08-136.html

  7. AHRQ health services research demonstration and dissemination grants (R18). http://grants.nih.gov/grants/guide/pa-files/PA-13-046.html

  8. Grimshaw JM, Eccles MP, Lavis JN, Hill SJ, Squires JE: Knowledge translation of research findings. Implement Sci. 2012, 7: 1-17. 10.1186/1748-5908-7-1.

  9. Michie S, Fixsen DL, Grimshaw JM, Eccles MP: Specifying and reporting complex behaviour change interventions: the need for a scientific method. Implement Sci. 2009, 4: 1-6. 10.1186/1748-5908-4-1.

  10. Davies P, Walker AE, Grimshaw JM: A systematic review of the use of theory in the design of guideline dissemination and implementation strategies and interpretation of the results of rigorous evaluations. Implement Sci. 2010, 5: 1-6. 10.1186/1748-5908-5-1.

  11. Colquhoun HL, Brehaut JC, Sales A, Ivers N, Grimshaw J, Michie S, Carroll K, Cahlifoux M, Eva KW: A systematic review of the use of theory in randomized controlled trials of audit and feedback. Implement Sci. 2013, 8: 1-8. 10.1186/1748-5908-8-1.

  12. Alexander JA, Hearld LR: Methods and metrics challenges of delivery-systems research. Implement Sci. 2012, 7: 1-11. 10.1186/1748-5908-7-1.

  13. McKibbon KA, Lokker C, Wilczynski NL, Ciliska D, Dobbins M, Davis DA, Haynes RB, Straus S: A cross-sectional study of the number and frequency of terms used to refer to knowledge translation in a body of health literature in 2006: a tower of Babel?. Implement Sci. 2010, 5: 1-11. 10.1186/1748-5908-5-1.

  14. Craig P, Dieppe P, Macintyre S, Michie S, Nazareth I, Petticrew M: Developing and evaluating complex interventions: new guidance. 2008, London: Medical Research Council, 1-39.

  15. Curran GM, Bauer M, Mittman B, Pyne JM, Stetler C: Effectiveness-implementation hybrid designs: combining elements of clinical effectiveness and implementation research to enhance public health impact. Med Care. 2012, 50: 217-226. 10.1097/MLR.0b013e3182408812.

  16. Proctor EK, Landsverk J, Aarons GA, Chambers DA, Glisson C, Mittman BS: Implementation research in mental health services: an emerging science with conceptual, methodological, and training challenges. Adm Policy Ment Health. 2009, 36: 24-34. 10.1007/s10488-008-0197-4.

  17. Deming WE: Out of the crisis. 1986, Cambridge, MA: MIT Press

  18. Mazzocato P, Savage C, Brommels M, Aronsson H, Thor J: Lean thinking in healthcare: a realist review of the literature. Qual Saf Health Care. 2010, 19: 376-382. 10.1136/qshc.2009.037986.

  19. Azocar F, Cuffel B, Goldman W, McCarter L: The impact of evidence-based guideline dissemination for the assessment and treatment of major depression in a managed behavioral health care organization. J Behav Health Serv Res. 2003, 30: 109-118. 10.1007/BF02287816.

  20. Powell BJ, McMillen JC, Proctor EK, Carpenter CR, Griffey RT, Bunger AC, Glass JE, York JL: A compilation of strategies for implementing clinical innovations in health and mental health. Med Care Res Rev. 2012, 69: 123-157. 10.1177/1077558711430690.

  21. Magnabosco JL: Innovations in mental health services implementation: a report on state-level data from the U.S. evidence-based practices project. Implement Sci. 2006, 1: 1-11.

  22. Eccles MP, Johnston M, Hrisos S, Francis J, Grimshaw JM, Steen N, Kaner EF: Translating clinicians’ beliefs into implementation interventions (TRACII): a protocol for an intervention modeling experiment to change clinicians’ intentions to implement evidence-based practice. Implement Sci. 2007, 2: 1-6. 10.1186/1748-5908-2-1.

  23. Fixsen DL, Naoom SF, Blase KA, Friedman RM, Wallace F: Implementation research: a synthesis of the literature (No. FMHI publication #231). 2005, Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, National Implementation Research Network

  24. Mazza D, Bairstow P, Buchan H, Chakraborty SP, Van Hecke O, Grech C, Kunnamo I: Refining a taxonomy for guideline implementation: results of an exercise in abstract classification. Implement Sci. 2013, 8: 1-10. 10.1186/1748-5908-8-1.

  25. Cochrane Effective Practice and Organisation of Care Group: Data Collection Checklist. 2002, 1-30. http://epoc.cochrane.org/sites/epoc.cochrane.org/files/uploads/datacollectionchecklist.pdf

  26. Michie S, Richardson M, Johnston M, Abraham C, Francis J, Hardeman W, Eccles MP, Cane J, Wood CE: The behavior change technique taxonomy (v1) of 93 hierarchically clustered techniques: building an international consensus for the reporting of behavior change interventions. Ann Behav Med. 2013, 46: 81-95. 10.1007/s12160-013-9486-6.

  27. Glisson C, Hemmelgarn A, Green P, Dukes D, Atkinson S, Williams NJ: Randomized trial of the availability, responsiveness, and continuity (ARC) organizational intervention with community-based mental health programs and clinicians serving youth. J Am Acad Child Adolesc Psychiatry. 2012, 51: 780-787. 10.1016/j.jaac.2012.05.010.

  28. Glisson C, Schoenwald S, Hemmelgarn A, Green P, Dukes D, Armstrong KS, Chapman JE: Randomized trial of MST and ARC in a two-level evidence-based treatment implementation strategy. J Consult Clin Psychol. 2010, 78: 537-550.

  29. Institute for Healthcare Improvement: The breakthrough series: IHI’s collaborative model for achieving breakthrough improvement. 2003, Cambridge, Massachusetts: Institute for Healthcare Improvement

  30. Massoud MR, Nielsen GA, Nolan K, Nolan T, Schall MW, Sevin CA: A framework for spread: from local improvements to system-wide change. 2006, Institute for Healthcare Improvement: Cambridge, Massachusetts

  31. Chinman M, Imm P, Wandersman A: Getting to outcomes 2004: promoting accountability through methods and tools for planning, implementation, and evaluation. 2004, Rand Health: Santa Monica, CA

  32. Kegeles SM, Rebchook GM, Hays RB, Terry MA, O’Donnell L, Leonard NR, Kelly JA, Neumann MS: From science to application: the development of an intervention package. AIDS Educ Prev. 2000, 12 (5 Suppl): 62-74.

  33. Kilbourne AM, Neumann MS, Pincus HA, Bauer MS, Stall R: Implementing evidence-based interventions in health care: application of the replicating effective programs framework. Implement Sci. 2007, 2: 1-10. 10.1186/1748-5908-2-1.

  34. Stetler CB, Mittman BS, Francis J: Overview of the VA quality enhancement research initiative (QUERI) and QUERI theme articles: QUERI series. Implement Sci. 2008, 3: 1-9. 10.1186/1748-5908-3-1.

  35. Glisson C, Dukes D, Green P: The effects of the ARC organizational intervention on caseworker turnover, climate, and culture in children’s service systems. Child Abuse Negl. 2006, 30: 855-880. 10.1016/j.chiabu.2005.12.010.

  36. Glisson C, Hemmelgarn A, Green P, Williams NJ: Randomized trial of the availability, responsiveness and continuity (ARC) organizational intervention for improving youth outcomes in community mental health programs. J Am Acad Child Adolesc Psychiatry. 2013, 52: 493-500. 10.1016/j.jaac.2013.02.005.

  37. Rosen A, Proctor EK: Specifying the treatment process: the basis for effective research. J Soc Serv Res. 1978, 2: 25-43. 10.1300/J079v02n01_04.

  38. Abraham C, Michie S: A taxonomy of behavior change techniques used in interventions. Health Psychol. 2008, 27: 379-387.

  39. Kauth MR, Sullivan G, Blevins D, Cully JA, Landes RD, Said Q, Teasdale TA: Employing external facilitation to implement cognitive behavioral therapy in VA clinics: a pilot study. Implement Sci. 2010, 5: 1-11. 10.1186/1748-5908-5-1.

  40. Rapp CA, Etzel-Wise D, Marty D, Coffman M, Carlson L, Asher D, Callaghan J, Whitley R: Evidence-based practice implementation strategies: results from a qualitative study. Community Ment Health J. 2008, 44: 213-224. 10.1007/s10597-007-9109-4.

  41. Tabak RG, Khoong EC, Chambers DA, Brownson RC: Bridging research and practice: models for dissemination and implementation research. Am J Prev Med. 2012, 43: 337-350. 10.1016/j.amepre.2012.05.024.

  42. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC: Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009, 4: 1-15. 10.1186/1748-5908-4-1.

  43. Flottorp SA, Oxman AD, Krause J, Musila NR, Wensing M, Godycki-Cwirko M, Baker R, Eccles MP: A checklist for identifying determinants of practice: a systematic review and synthesis of frameworks and taxonomies of factors that prevent or enable improvements in healthcare professional practice. Implement Sci. 2013, 8: 1-11. 10.1186/1748-5908-8-1.

  44. Cane J, O’Connor D, Michie S: Validation of the theoretical domains framework for use in behaviour change and implementation research. Implement Sci. 2012, 7: 1-17. 10.1186/1748-5908-7-1.

  45. Michie S, van Stralen MM, West R: The behaviour change wheel: a new method for characterising and designing behaviour change interventions. Implement Sci. 2011, 6: 1-11. 10.1186/1748-5908-6-1.

  46. Landsverk J, Brown CH, Chamberlain P, Palinkas LA, Ogihara M, Czaja S, Goldhaver-Fiebert JD, Rolls Reutz JA, Horwitz SM: Design and analysis in dissemination and implementation research. Dissemination and implementation research in health. Edited by: Brownson RC, Colditz GA, Proctor EK. 2012, New York: Oxford University Press, 225-260.

  47. Proctor EK, Silmere H, Raghavan R, Hovmand P, Aarons GA, Bunger A, Griffey R, Hensley M: Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health Ment Health Serv Res. 2011, 38: 65-76. 10.1007/s10488-010-0319-7.

  48. Atkins MS, Frazier SL, Leathers SJ, Graczyk PA, Talbott E, Jakobsons L, Adil JA, Marinez-Lora A, Demirtas H, Gibbons RB, Bell CC: Teacher key opinion leaders and mental health consultation in low-income urban schools. J Consult Clin Psychol. 2008, 76: 905-908.

  49. Proctor EK, Brownson RC: Measurement issues in dissemination and implementation research. Dissemination and implementation research in health: translating science to practice. Edited by: Brownson RC, Colditz GA, Proctor EK. 2012, New York: Oxford University Press, 261-280.

  50. Proctor EK, Powell BJ, Feely M: Measurement issues in dissemination and implementation science. Child and adolescent therapy: dissemination and implementation of empirically supported treatments. Edited by: Beidas RS, Kendall PC. New York: Oxford University Press, In Press

  51. Eccles M, Grimshaw JM, Walker A, Johnston M, Pitts N: Changing the behavior of healthcare professionals: the use of theory in promoting the uptake of research findings. J Clin Epidemiol. 2005, 58: 107-112. 10.1016/j.jclinepi.2004.09.002.

  52. Grol R, Bosch MC, Hulscher MEJ, Eccles MP, Wensing M: Planning and studying improvement in patient care: the use of theoretical perspectives. Milbank Q. 2007, 85: 93-138. 10.1111/j.1468-0009.2007.00478.x.

  53. Cochrane Effective Practice and Organisation of Care Group. http://epoc.cochrane.org

  54. Oxman AD, Fretheim A, Flottorp S: The OFF theory of research utilization. J Clin Epidemiol. 2005, 58: 113-116. 10.1016/j.jclinepi.2004.10.002.

  55. Wensing M, Oxman A, Baker R, Godycki-Cwirko M, Flottorp S, Szecsenyi J, Grimshaw J, Eccles M: Tailored implementation for chronic diseases (TICD): a project protocol. Implement Sci. 2011, 6: 1-8. 10.1186/1748-5908-6-1.

  56. Craig P, Dieppe P, Macintyre S, Mitchie S, Nazareth I, Petticrew M: Developing and evaluating complex interventions: the new medical research council guidance. BMJ. 2008, 337: 979-983. 10.1136/bmj.a979.

  57. Mittman BS: Implementation science in health care. Dissemination and implementation research in health: translating science to practice. Edited by: Brownson RC, Colditz GA, Proctor EK. 2012, New York: Oxford University Press, 400-418.

  58. May C: Towards a general theory of implementation. Implement Sci. 2013, 8: 1-14. 10.1186/1748-5908-8-1.

  59. Cabana MD, Rand CS, Powe NR, Wu AW, Wilson MH, Abboud PACAC, Rubin HR: Why don’t physicians follow clinical practice guidelines?. JAMA. 1999, 282: 1458-10.1001/jama.282.15.1458.

  60. Cook JM, Biyanova T, Coyne JC: Barriers to adoption of new treatments: an internet study of practicing community psychotherapists. Adm Policy Ment Health Ment Health Serv Res. 2009, 36: 83-90. 10.1007/s10488-008-0198-3.

  61. Gerring J: Social science methodology: a criterial framework. 2001, Cambridge: Cambridge University Press

  62. Rabin BA, Purcell P, Naveed S, Moser RP, Henton MD, Proctor EK, Brownson RC, Glasgow RE: Advancing the application, quality and harmonization of implementation science measures. Implement Sci. 2012, 7: 1-11. 10.1186/1748-5908-7-1.

  63. Rabin BA, Brownson RC, Joshu-Haire D, Kreuter MW, Weaver NL: A glossary of dissemination and implementation research in health. J Public Health Manag. 2008, 14: 117-123. 10.1097/01.PHH.0000311888.06252.bb.

  64. Rabin BA, Brownson RC: Developing terminology for dissemination and implementation research. Dissemination and implementation research in health: translating science to practice. Edited by: Brownson RC, Colditz GA, Proctor EK. 2012, New York: Oxford University Press, 23-51.

  65. Forsner T, Wistedt AA, Brommels M, Janszky I, de Leon AP, Forsell Y: Supported local implementation of clinical guidelines in psychiatry: a two-year follow-up. Implement Sci. 2010, 5: 1-11. 10.1186/1748-5908-5-1.

  66. Jamtvedt G, Young JM, Kristoffersen DT, O’Brien MA, Oxman AD: Audit and feedback: effects on professional practice and health care outcomes. Cochrane Database Syst Rev. 2006, 2: CD000259-

  67. The Improved Clinical Effectiveness through Behavioural Research Group (ICEBeRG): Designing theoretically-informed implementation interventions. Implement Sci. 2006, 1: 1-8. 10.1186/1748-5908-1-1.

  68. Hysong SJ: Audit and feedback features impact effectiveness on care quality. Med Care. 2009, 47: 1-8. 10.1097/MLR.0b013e3181808bb5.

  69. Fine SA, Cronshaw SF: Functional Job analysis: a foundation for human resources management. 1995, Mahwah, New Jersey: Lawrence Erlbaum Associates, Inc.

  70. Beidas RS, Edmunds JM, Marcus SC, Kendall PC: Training and consultation to promote implementation of an empirically supported treatment: a randomized trial. Psychiatr Serv. 2012, 63: 660-665.

  71. Johnson DW, Johnson RT, Smith KA: Cooperative learning returns to college: what evidence is there that it works?. Change. 1998, 30: 27-35.

  72. Flodgren G, Parmelli E, Doumit G, Gattellari M, O’Brien MA, Grimshaw J, Eccles MP: Local opinion leaders: effects on professional practice and health care outcomes. Cochrane Database of Systematic Reviews. 2011, Art. No (Issue 8): CD000125-doi: 10.1002/14651858.CD000125.pub4

  73. Rogers EM: Diffusion of innovations. 2003, New York: Free Press, 5

  74. Berwick DM: A primer on leading the improvement of systems. BMJ. 1996, 312: 619-622. 10.1136/bmj.312.7031.619.

  75. Shortell SM: Increasing value: a research agenda for addressing the managerial and organizational challenges facing health care delivery in the United States. Med Care Res. 2004, 61: 12S-30S. 10.1177/1077558704266768.

  76. Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O: Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q. 2004, 82: 581-629. 10.1111/j.0887-378X.2004.00325.x.

  77. Lyon AR, Wiltsey Stirman S, Kerns SEU, Burns EJ: Developing the mental health workforce: review and application of training approaches from multiple disciplines. Adm Policy Ment Health. 2011, 38: 238-253. 10.1007/s10488-010-0331-y.

  78. Aarons GA, Hurlburt M, Horwitz SM: Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health. 2011, 38: 4-23. 10.1007/s10488-010-0327-7.

  79. Aarons GA, Green AE, Palinkas LA, Self-Brown S, Whitaker DJ, Lutzker JR, Silovsky JF, Hecht DB, Chaffin MJ: Dynamic adaptation process to implement an evidence-based child maltreatment intervention. Implement Sci. 2012, 7: 1-9. 10.1186/1748-5908-7-1.

  80. Herschell AD, Kolko DJ, Baumann BL, Davis AC: The role of therapist training in the implementation of psychosocial treatments: a review and critique with recommendations. Clin Psychol Rev. 2010, 30: 448-466. 10.1016/j.cpr.2010.02.005.

  81. Ivers N, Jamtvedt G, Flottorp S, Young JM, Odgaard-Jensen J, French SD, O’Brien MA, Johansen M, Grimshaw J, Oxman AD: Audit and feedback: effects on professional practice and healthcare outcomes. Cochrane Database of Systematic Reviews. 2012, Art. No (Issue 6): CD000259-doi: 10.1002/14651858.CD000259.pub3

  82. GEM-Dissemination and implementation initiative (GEM-D&I). http://www.gem-beta.org/GEM-DI

  83. SIRC measures project. http://www.seattleimplementation.org/sirc-projects/sirc-instrument-project/

  84. Wensing M, Bosch M, Grol R: Selecting, tailoring, and implementing knowledge translation interventions. Knowledge translation in health care: moving from evidence to practice. Edited by: Straus S, Tetroe J, Graham ID. 2009, Oxford, UK: Wiley-Blackwell, 94-113.

  85. Bosch M, van der Weijden T, Wensing M, Grol R: Tailoring quality improvement interventions to identified barriers: a multiple case analysis. J Eval Clin Pract. 2007, 13: 161-168. 10.1111/j.1365-2753.2006.00660.x.

  86. Baker R, Camosso-Stefinovic J, Gillies C, Shaw EJ, Cheater F, Flottorp S, Robertson N: Tailored interventions to overcome identified barriers to change: effects on professional practice and health care outcomes. Cochrane Database of Systematic Reviews. 2010, Art. No (Issue 3): CD005470-doi: 10.1002/14651858.CD005470.pub2

  87. McSherry LA, Dombrowski SU, Francis JJ, Murphy J, Martin CM, O’Leary JJ, Sharp L, ATHENS Group: ‘It’s A can of worms’: understanding primary care practitioners’ behaviors in relation to HPV using the theoretical domains framework. Implement Sci. 2012, 7: 1-16. 10.1186/1748-5908-7-1.

  88. Addis ME, Wade WA, Hatgis C: Barriers to dissemination of evidence-based practices: addressing practitioners’ concerns about manual-based psychotherapies. Clin Psychol Sci Pract. 1999, 6: 430-441. 10.1093/clipsy.6.4.430.

  89. Bartholomew NG, Joe GW, Rowan-Szai GA, Simpson DD: Counselor assessments of training and adoption barriers. J Subst Abuse Treat. 2007, 33: 193-199. 10.1016/j.jsat.2007.01.005.

  90. Shapiro CJ, Prinz RJ, Sanders MR: Facilitators and barriers to implementation of an evidence-based parenting intervention to prevent child maltreatment: the triple p-positive parenting program. Child Maltreat. 2012, 17: 86-95. 10.1177/1077559511424774.

  91. Bhattacharyya O, Reeves S, Garfinkel S, Zwarenstein M: Designing theoretically-informed implementation interventions: fine in theory, but evidence of effectiveness in practice is needed. Implement Sci. 2006, 1: 1-3. 10.1186/1748-5908-1-1.

  92. WIDER recommendations to improve reporting of the content of behaviour change interventions. http://interventiondesign.co.uk/wp-content/uploads/2009/02/wider-recommendations.pdf

  93. Davidoff F, Batalden P, Stevens D, Ogrinc G, Mooney S: Publication guidelines for quality improvement in health care: evolution of the SQUIRE project. Qual Saf Health Care. 2008, 17 (Supplement 1): i3-i9.

  94. EQUATOR Network. http://www.equator-network.org/

Acknowledgements

The authors acknowledge their colleagues in the practice community who ask important and provocative questions about how to improve care, the kind of questions that stimulated this work. Preparation of this paper was supported in part by the National Center for Research Resources through the Dissemination and Implementation Research Core of Washington University in St. Louis’ Institute of Clinical and Translational Sciences (NCRR UL1 RR024992); the National Institute of Mental Health through the Center for Mental Health Services Research (NIMH P30 MH068579); the Washington University Center for Diabetes Translation Research (NIDDK/NIH P30 DK092950); the Washington University TREC (Energy Balance and Cancer Across the Lifecourse, NCI/NIH U54 CA155496); the Implementation Research Institute (NIMH R25 MH080916); a Ruth L. Kirschstein National Research Service Award (NIMH F31 MH098478); and a Doris Duke Charitable Foundation Fellowship for the Advancement of Child Well-Being. An earlier version of this paper was presented at the Implementation Research Institute on June 20, 2012, at Washington University in St. Louis.

Author information

Corresponding author

Correspondence to Enola K Proctor.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

EKP conceived the idea for this paper and wrote the initial draft. BJP, JCM, and EKP contributed to the conceptualization and writing of subsequent drafts. All authors read and approved the final manuscript.

Rights and permissions

This article is published under license to BioMed Central Ltd. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Cite this article

Proctor, E.K., Powell, B.J. & McMillen, J.C. Implementation strategies: recommendations for specifying and reporting. Implementation Sci 8, 139 (2013). https://doi.org/10.1186/1748-5908-8-139
