Perceived difficulty of the intervention, reflected by duration, scope, radicalness, disruptiveness, centrality, and intricacy and number of steps required to implement.

Complexity reflects the perceived difficulty of the intervention, including intervention type (e.g., behavior change versus plug-in technology), duration, scope, radicalness, disruptiveness, centrality, and the intricacy and number of steps required to implement it [1][2]. Radical interventions require significant reorientation and non-routine processes to produce fundamental changes in the organization's activities, and reflect a clear departure from existing practices [1]. One way to determine complexity is by assessing 'length' (the number of sequential sub-processes or steps required to use or implement an intervention) and 'breadth' (the number of choices presented at decision points) [3]. Complexity also increases with the number of organizational units targeted (teams, clinics, departments), the types of people targeted (providers, patients, managers) [3], and the degree to which the intervention alters central work processes [2].

Appropriately diagnosing and assessing complexity is thought to benefit implementation by avoiding unintended consequences [3]. Stakeholders' perceptions of an intervention's complexity are negatively associated with effective implementation (i.e., simpler interventions are more likely to be implemented effectively) [1][4], because complexity affects user satisfaction and the time required to become competent in using the intervention [5]. The ability to implement an intervention incrementally (sometimes referred to as divisibility [2]; see Executing) can influence perceptions of complexity.

The type of intervention, whether technical (e.g., a new computer module) or administrative (e.g., a behavioral change), can contribute to the perception of complexity. Technical interventions may include a purchased product, a packaged service, or an automated production process (e.g., computerized order entry). Administrative interventions primarily affect organizational social structures or processes. Most interventions are a hybrid of the two. Technical interventions tend to be more visible, while administrative interventions tend to be more complex and difficult to implement [1]. On the other hand, complex behavior-change interventions can also work in favor of implementation: if an organization embraces an intervention as a fundamental change to its processes up front, it is more likely to do what it takes to implement the intervention fully and effectively, compared with sites that regard it as a simple "plug-in" intervention [6]. Edmondson and colleagues describe a "technological frame" of thinking that influences implementation effectiveness. In their study of a new cardiac surgical approach, sites with less successful implementation viewed the intervention as a "plug-in technology," while those with better implementation effectiveness regarded it "as fundamental change for the [operating] team" [6] and had more engaged support from key stakeholders (see Leadership Engagement and Engaging: Key Stakeholders). Regardless of the degree of complexity, simple, clear, and detailed implementation plans, schedules, and task assignments contribute to successful implementation [4].

Inclusion Criteria

Code statements regarding the complexity of the innovation itself, for example:

  • “There were so many pieces and parts to the process but we were able to get it going because we did it in phases.”
  • “It changed the whole way our team worked together – the workflow and roles during these surgeries is so different now.”

Exclusion Criteria

Exclude statements regarding the complexity of implementation and code to other appropriate CFIR codes, e.g., code difficulties related to space to Available Resources and code difficulties related to engaging participants in a new program to Engaging: Innovation Participants.


  1. Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O: Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q 2004, 82:581-629.
  2. Grol RP, Bosch MC, Hulscher ME, Eccles MP, Wensing M: Planning and studying improvement in patient care: The use of theoretical perspectives. Milbank Q 2007, 85:93-138.
  3. Kochevar LK, Yano EM: Understanding health care organization needs and context. Beyond performance gaps. J Gen Intern Med 2006, 21(Suppl 2):S25-29.
  4. Gustafson DH, Sainfort F, Eichler M, Adams L, Bisognano M, Steudel H: Developing and testing a model to predict outcomes of organizational change. Health Serv Res 2003, 38:751-776.
  5. Klein KJ, Conn AB, Sorra JS: Implementing computerized technology: An organizational analysis. J Appl Psychol 2001, 86:811-824.
  6. Edmondson AC, Bohmer RM, Pisano GP: Disrupted routines: Team learning and new technology implementation in hospitals. Adm Sci Q 2001, 46:685-716.