Enhancing review criteria for dissemination and implementation science grants

Abstract

Background

Existing grant review criteria do not consider the unique methods and priorities of Dissemination and Implementation Science (DIS). The ImplemeNtation and Improvement Science Proposals Evaluation CriTeria (INSPECT) scoring system includes 10 criteria based on Proctor et al.'s "ten key ingredients" and was developed to support the assessment of DIS research proposals. We describe how we adapted INSPECT and used it in combination with the NIH scoring system to evaluate pilot DIS study proposals through our DIS Center.

Methods

We adapted INSPECT to broaden considerations for diverse DIS settings and concepts (e.g., explicitly including dissemination and implementation methods). Five PhD-level researchers with intermediate to advanced DIS knowledge were trained to review seven grant applications using both the INSPECT and NIH criteria. The INSPECT overall scores range from 0 to 30 (higher scores are better), and the NIH overall scores range from 1 to 9 (lower scores are better). Each grant was independently reviewed by two reviewers and then discussed in a group meeting to compare reviewers' experiences of applying the two sets of criteria and to finalize scoring decisions. A follow-up survey was sent to the reviewers to solicit further reflections on each scoring criterion.
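As a minimal sketch of the dual-scoring scheme described above, the snippet below averages two independent reviewers' scores per proposal under both systems. The proposal names and score values are hypothetical illustrations, not data from the study, and this is not the authors' actual tooling.

```python
# Illustrative sketch: averaging dual-reviewer scores under the two systems
# described in the Methods. All proposal names and scores are hypothetical.

INSPECT_RANGE = (0, 30)  # higher scores are better
NIH_RANGE = (1, 9)       # lower scores are better

# Hypothetical scores from two independent reviewers per proposal.
reviews = {
    "proposal_A": {"inspect": [22, 26], "nih": [3, 2]},
    "proposal_B": {"inspect": [14, 12], "nih": [5, 4]},
}

def average(scores):
    """Mean of a list of numeric scores."""
    return sum(scores) / len(scores)

for proposal, scores in reviews.items():
    inspect_avg = average(scores["inspect"])
    nih_avg = average(scores["nih"])
    print(f"{proposal}: INSPECT={inspect_avg:.1f} (0-30, higher better), "
          f"NIH={nih_avg:.1f} (1-9, lower better)")
```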

Results

Averaged across reviewers, the INSPECT overall scores ranged from 13 to 24, while the NIH overall scores ranged from 2 to 5. Reviewer reflections highlighted the distinct value and utility of each set of scoring criteria. The NIH criteria had a broad scientific purview and were better suited to evaluating effectiveness-focused and pre-implementation proposals that did not test implementation strategies. The INSPECT criteria made it easier to rate how well DIS considerations were integrated into a proposal and to assess its potential for generalizability, real-world feasibility, and impact. Overall, reviewers noted that INSPECT was a helpful tool for guiding DIS research proposal writing.

Conclusions

We confirmed the complementarity of the two scoring criteria in our pilot study grant proposal review and highlighted the utility of INSPECT as a potential DIS resource for training and capacity building. Possible refinements to INSPECT include more explicit reviewer guidance on assessing pre-implementation proposals, the opportunity for reviewers to submit written commentary with each numerical rating, and greater clarity on rating criteria with overlapping descriptions.

