Session 731: Progress Reporting for US Federal Grant Awards: Templates, Guidance, and Data Standards to Support Effective Program Evaluation
Laure Haak, Discovery Logic. Chair and Discussant
Helena Davis, NIEHS/NIH. Using the Logic Model Process to Guide Data Collection on Outcomes and Metrics
Larry Nagahara and Nicole Moore, NCI/NIH. Evaluating Collaboration and Team Science in the National Cancer Institute's Physical Sciences-Oncology Consortium
David Baker, CASRAI. Creating a Shared Core Set of Reporting Elements
Sponsored by the AEA Research, Technology, and Development Evaluation TIG
Panel Scope
Effective evaluation of grant programs will require mixed methods and a variety of data sources and types.
Can we support iterative program evaluation?
Can we collect quality data efficiently?
How do we involve grantees?
Can we scale processes across centers and programs?
Questions
Are evaluation guidelines included in the funding announcement?
How are data collected (e-document, web form)?
When are data collected (before, during, after)?
Are there standard definitions used across the Institute or Agency?
Who reports, and how are the data verified?
Included in FOA: There is a trend at NIH to include evaluation guidance in funding announcements; this makes data collection a part of the grant.
Data collection format: In most cases, data are collected using e-documents. Some systems have been created to collect data using Web forms.
Collection timing: Annually at renewal and close-out. For training grants, some post-hoc tracking.
Data standards: A Fed-wide RPPR will be rolled out in 2012/13. A core set of data elements has been established (see the sketch below).
Reporting responsibility: Usually the principal investigator.
Data verification: Individual program officers.
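The point above about a core set of data elements could be made concrete with a small, structured reporting record. The following is a minimal sketch in Python of what such a shared record might contain; the field names (award_id, collection_method, verified_by, and so on) are illustrative assumptions and are not taken from the actual RPPR specification.

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ProgressReport:
    """One annual progress report for a grant award (illustrative sketch only)."""
    award_id: str                       # agency award number (hypothetical field)
    principal_investigator: str         # person responsible for reporting
    reporting_period_start: str         # ISO date, e.g. "2012-07-01"
    reporting_period_end: str           # ISO date, e.g. "2013-06-30"
    collection_method: str              # "e-document" or "web form"
    outcomes: list[str] = field(default_factory=list)       # narrative outcome statements
    publications: list[str] = field(default_factory=list)   # citations or identifiers
    verified_by: Optional[str] = None   # program officer who reviewed the report

A shared schema along these lines is one way the same report data could be collected consistently across centers and programs and verified by program officers, rather than re-defined for each funding announcement.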