Access to Knowledge & Information

Guidance and/or training is accessible to implement and deliver the innovation.

The accessibility of guidance, training, and education related to the innovation and its implementation is critical to successful implementation and delivery of the innovation (Dy et al., 2015; Greenhalgh, Robert, et al., 2004).

The original CFIR further elaborated on the importance of easy access to digestible information about the innovation and how to incorporate it into work tasks (Greenhalgh, Robert, et al., 2004; Helfrich, Weiner, et al., 2007; Klein et al., 2001; Wallin et al., 2006). The number of different knowledgeable occupational types or specialties involved with the implementation is positively associated with effective implementation (Wallin et al., 2006). When timely on-the-job training is available, especially at the team level, implementation is more likely to succeed (Greenhalgh, Robert, et al., 2004). Education, training, and access to information about the innovation are all key strategies for moving deliverers from unengaged to fully committed users of the innovation (R. P. Grol et al., 2007).

Qualitative coding guidelines that are aligned with the Updated CFIR will be added in the future.

Inclusion Criteria

Include statements related to implementation leads', implementation teams', and innovation users' access to knowledge and information about using the innovation, e.g., training on how to use the innovation or how it works.

  • “They sent us to training about 9 months before implementation so by the time things were finally ready, I forgot what to do.”
  • “She’s been wonderful. Every time I have a question, I just email her and she helps me out.”
  • “I can’t get access to the reports I need to figure out which patients to target.”
  • “I plan to give little sessions about the intervention to nurses, residents, and physicians. It’s important to tailor the information to each group. Otherwise, they won’t buy in.”
  • “I didn’t get all the information I needed. I felt so frustrated because no one around here knew anything about it, so I just gave up and started doing it the old way.”

Exclusion Criteria

Exclude statements related to engagement strategies and outcomes, e.g., how key stakeholders became engaged with the innovation and what their role is in implementation, and code to Engaging: Innovation Users.

Exclude statements about general networking, communication, and relationships in the organization, such as descriptions of meetings, email groups, or other methods of keeping people connected and informed, as well as statements related to team formation, quality, and functioning, and code to Relational Connections or Communications.

Regarding quantitative measurement of this construct: In a systematic review of quantitative measures related to implementation, Weiner et al. (2020) identified six measures. Using the PAPERS criteria for measurement quality, with an aggregate scale ranging from -9 to +36 (Lewis, Mettert, Stanick, et al., 2021), five measures (83%) had sufficient information for assessment; their scores ranged from -1 to +6. The Structured Interview of Evidence Use (Palinkas et al., 2016) had the highest score. These results indicate the need for continued development of high-quality measures.

As we become aware of measures, we will post them here. Please contact us with updates.

Dy, S. M., Ashok, M., Wines, R. C., & Rojas Smith, L. (2015). A framework to guide implementation research for care transitions interventions. Journal for Healthcare Quality, 37(1), 41–54.

Greenhalgh, T., Robert, G., Macfarlane, F., Bate, P., & Kyriakidou, O. (2004). Diffusion of innovations in service organizations: Systematic review and recommendations. Milbank Q, 82(4), 581–629.

Grol, R. P., Bosch, M. C., Hulscher, M. E., Eccles, M. P., & Wensing, M. (2007). Planning and studying improvement in patient care: The use of theoretical perspectives. Milbank Q, 85(1), 93–138.

Helfrich, C. D., Weiner, B. J., McKinney, M. M., & Minasian, L. (2007). Determinants of implementation effectiveness: Adapting a framework for complex innovations. Med Care Res Rev, 64(3), 279–303.

Klein, K. J., Conn, A. B., & Sorra, J. S. (2001). Implementing computerized technology: An organizational analysis. Journal of Applied Psychology, 86(5), 811–824.

Lewis, C. C., Mettert, K. D., Stanick, C. F., Halko, H. M., Nolen, E. A., Powell, B. J., & Weiner, B. J. (2021). The psychometric and pragmatic evidence rating scale (PAPERS) for measure development and evaluation. Implementation Research and Practice, 2, 263348952110373.

Palinkas, L. A., Garcia, A. R., Aarons, G. A., Finno-Velasquez, M., Holloway, I. W., Mackie, T. I., Leslie, L. K., & Chamberlain, P. (2016). Measuring Use of Research Evidence: The Structured Interview for Evidence Use. Research on Social Work Practice, 26(5), 550–564.

Wallin, L., Estabrooks, C. A., Midodzi, W. K., & Cummings, G. G. (2006). Development and validation of a derived measure of research utilization by nurses. Nursing Research, 55(3), 149–160.

Weiner, B. J., Mettert, K. D., Dorsey, C. N., Nolen, E. A., Stanick, C., Powell, B. J., & Lewis, C. C. (2020). Measuring readiness for implementation: A systematic review of measures’ psychometric and pragmatic properties. Implementation Research and Practice, 1, 263348952093389.