Evaluation of Large Initiatives of Scientific Research at the National Institutes of Health. Mary Kane, Concept Systems Inc.; William M. Trochim, Cornell University.


Evaluation of Large Initiatives of Scientific Research at the National Institutes of Health. Mary Kane, Concept Systems Inc.; William M. Trochim, Cornell University. American Evaluation Association, November 4, 2006.

The Context
- Changing nature of science: interdisciplinary, collaborative; large initiatives for complex problems
- Expansion of the use of large center grants as a research funding mechanism
- Similar issues reported in the European Union (EU) in connection with the evaluation of Science, Technology, and Innovation (STI) policies
- Government-wide accountability expectations: GPRA, PART, ExpectMore.gov
- Good science requires good management

Evaluation of Large Initiatives
- National Cancer Institute Transdisciplinary Tobacco Use Research Centers (TTURCs) (2001-2003)
- Centers for Disease Control Prevention Research Centers Network
- National Institute of Allergy and Infectious Diseases AIDS Clinical Trials Network, Division of AIDS, National Institutes of Health (2005-present)

Evaluation Approach
- Culture change
- Collaboration and involvement of researchers, funders, consultants
- Understand the initiative life cycle
- Develop an initiative logic model
- Link comprehensive measures and tools to the model
- Keep costs and respondent burden low
- Assure scientific objectivity and credibility
- Address multiple purposes and audiences
- Design for reuse where possible
- Report and utilize results
- Provide an opportunity for reflection and learning

Initiative Life Cycle Model
[Diagram labels: Conceptual Model, Measures, Questions, Stakeholders, Context; Motivation, Capacity, Structure, Expertise, Support]
- The context includes the organizational structures and organizational constraints that delimit evaluation activities.
- At each stage, a wide variety of stakeholders need to be involved, both in helping determine what questions should be addressed in the evaluation and in providing their assessments of initiative performance and outcomes.
- At each stage there are a variety of evaluation questions, with more prospective questions earlier in the life cycle and more retrospective ones later. Processes are needed for prioritizing which questions will be addressed at each stage.
- Evaluation is an empirical activity. Consequently, measures related to the constructs in the conceptual model are needed at every stage.

Structured Conceptualization
- Evaluation methods: needs assessment, evaluability assessment, implementation evaluation, process evaluation, outcome evaluation, impact evaluation, cost-effectiveness and cost-benefit evaluation, secondary analysis, meta-evaluation
- Conceptual model: formative/ex ante and summative/ex post methods
- Policy context: new initiatives, strategic impact, policy implications, strategic goals
[Diagram: plan, develop, implement, disseminate cycle]

The TTURC Case Study
Transdisciplinary Tobacco Use Research Centers
History:
- RFA released 12/98
- Grants reviewed 7/99
- First award 9/99
- Reissuance 9/04
- Approximately $75 million in first phase
[Figure: TTURC Life Cycle Model]

Model Development
[Concept map clusters: Engage the Community; Diversity & Sensitivity; Relationships & Recognition; Active Dissemination; Technical Assistance; Training; Research Methods; Research Agenda; Core Expertise & Resources; Evaluation System Plan]
[Logic model columns: Inputs, Activities, Outputs, Outcomes — placing clusters such as Core Expertise & Resources, Research Agenda, Training, Technical Assistance, Engage the Community, and Active Dissemination on the pathway to Community Health change]

Measures & Analyses
[Diagram linking measures to analyses through the conceptual map and logic model]
- Measures: Financial Report (SF259a); Budget & Justification; Progress Report Summary; Progress Report (PHS2590); Researcher Form; Personnel Report; Publications; Expenditures & Carryover
- Analyses: Bibliometrics; Peer Evaluation; Financial Analysis; Personnel Analysis; Evaluation Analysis; Content Analysis; Survey Analysis

Evaluation Questions
1. How well is the collaborative transdisciplinary work of the centers (including training) accomplished?
2. Does the collaborative transdisciplinary research of the centers lead to the development of new or improved research methods?
3. Does the collaborative transdisciplinary research of the centers lead to the development of new or improved scientific models and theories?
4. Does TTURC research result in scientific publications that are recognized as high-quality?
5. Does TTURC research get communicated effectively?
6. Are models and methods translated into improved interventions?
7. Does TTURC research influence health practice?
8. Does TTURC research influence health policy?
9. Does TTURC research influence health outcomes?

1. How well is the collaborative transdisciplinary work of the centers accomplished?
Subquestions:
- What are TTURC researcher attitudes about collaboration and transdisciplinary research?
- How do researchers assess the performance of their centers on collaboration, transdisciplinary research, training, institutional support, and center management?
- What are examples of the collaboration, transdisciplinary research, and training activities of the centers?
- What are the quality and impact of the collaboration, transdisciplinary research, and training activities of the centers?
- Do TTURC research publications provide evidence of collaboration and transdisciplinary research, and how do they compare with "traditional" research?
- How effective and efficient is the management of the TTURCs?

Evaluation Questions
1. How well is the collaborative transdisciplinary work of the centers accomplished?
Data sources:
- Researcher Form: Attitudes about Transdisciplinary Research Scale (15 items); Center Collaboration Scale (15 items); Attitudes about Collaboration in Research Scale (8 items); Institutional Support Index (12 items); overall ratings of collaboration, transdisciplinary integration, training, and institutional support
- Content analysis of annual progress reports for activities, results, and barriers (coded on collaboration, transdisciplinary integration, training, institutional support)
- Peer evaluation of annual progress reports and publications
- Bibliometric analysis of publications: collaboration within and across institutions and centers; numbers of fields represented by publications, cited, and citing articles, weighted by impact of journals
- Management analysis: personnel, budget, and financial

Researcher Form
- Each center was responsible for generating measures for 3-4 clusters on the map (at least two centers reviewed each cluster)
- 244 specific measurement items proposed across the 13 content clusters
- Items compiled into a measure development database; draft measure produced
- 25 closed-ended questions, each with multiple subquestions
- Overall performance ratings by outcome area
- Open-ended comments

Scales and Indexes
- Attitudes about Transdisciplinary Research Scale (15 items)
- Center Collaboration Scale (15 items)
- Attitudes about Collaboration in Research Scale (8 items)
- Institutional Support Index (12 items)
- Methods Progress Scale (7 items)
- Science and Models Scale (17 items)
- Barriers to Communications Scale (8 items)
- Center-to-Researcher Communications (5 items)
- Center External Communications (2 items)
- Progress on Development of Interventions Index (12 items)
- Policy Impact Index (4 items)
- Translation to Practice Index (9 items)
- Health Outcome Impact Scale (6 items)

Researcher Survey
8. Collaboration within the center [chart of item means with 95% confidence intervals]
a. Support staffing for the collaboration.
b. Physical environment support (e.g., meeting space) for collaboration.
c. Acceptance of new ideas.
d. Communication among collaborators.
e. Ability to capitalize on the strengths of different researchers.
f. Organization or structure of collaborative teams.
g. Resolution of conflicts among collaborators.
h. Ability to accommodate different working styles of collaborators.
i. Integration of research methods from different fields.
j. Integration of theories and models from different fields.
k. Involvement of collaborators from outside the center.
l. Involvement of collaborators from diverse disciplines.
m. Productivity of collaboration meetings.
n. Productivity in developing new products (e.g., papers, proposals, courses).
o. Overall productivity of collaboration.
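The survey chart summarized above reports item means with 95% confidence intervals. A minimal sketch of how such an interval can be computed, using invented 1-5 ratings (the slides do not show the actual analysis code or data):

```python
from statistics import mean, stdev

def mean_with_ci(ratings, z=1.96):
    """Mean rating with a normal-approximation 95% confidence interval."""
    m = mean(ratings)
    half = z * stdev(ratings) / len(ratings) ** 0.5  # z * standard error
    return m, m - half, m + half

# Hypothetical ratings for one item, e.g. "overall productivity of collaboration":
ratings = [4, 5, 3, 4, 4, 5, 3, 4]
m, lo, hi = mean_with_ci(ratings)
print(f"{m:.2f} [{lo:.2f}, {hi:.2f}]")  # → 4.00 [3.48, 4.52]
```

With real survey data one would typically use a t-based interval for small samples; the z = 1.96 normal approximation is used here only to keep the sketch short.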

Content Analysis
- Coded project reports each year by the 13 outcome clusters
- Three rounds of reliability testing and refinement of coding definitions
- Final reliability > .9
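The slide reports a final inter-coder reliability above .9 without naming the statistic. One common choice for categorical coding is Cohen's kappa; a sketch with invented codes and coders (not the actual TTURC reliability procedure):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Agreement between two coders, corrected for chance agreement."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of excerpts coded identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement from each coder's marginal code frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Two coders labeling six report excerpts with outcome-cluster codes:
a = ["collab", "training", "collab", "methods", "collab", "training"]
b = ["collab", "training", "collab", "methods", "methods", "training"]
print(round(cohens_kappa(a, b), 2))  # → 0.75
```

Iterating rounds of coding, comparing kappa, and refining definitions until reliability exceeds a threshold mirrors the three-round process the slide describes.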

Progress Report Content Analysis – Years 1-3
[Chart of coded frequencies across the 13 outcome clusters: Collaboration, Transdisciplinary Integration, Internal Recognition and Support, External Recognition and Support, Methods, Science & Models, Publications, Interventions, Communication, Policy Implications, Translation to Practice, Health Outcomes, Training]
(data from content analysis of Annual Progress Report Form PHS2590)

Peer Evaluation – Years 1-3
[Chart of peer ratings across the 13 outcome clusters: Training, Collaboration, Transdisciplinary Integration, Internal Recognition and Support, External Recognition and Support, Methods, Science & Models, Publications, Interventions, Communication, Policy Implications, Translation to Practice, Health Outcomes]

Bibliometric Analysis
What is a TTURC publication?
- Results from TTURC research
- Cites the TTURC grant number
- Independent peer evaluation would identify the influence
Components of the bibliometric analysis:
- Publications, citations, cited (references)
- Journals of publication, citing, cited
- Field (Current Contents)
- Year

Bibliometric Analysis Indicators
- Journal Impact Factor (JIF): average number of citations to a journal's articles published in the previous two years
- Journal Performance Indicator (JPI): average number of citations to date for all publications in a journal in a particular year
- Field Journal Performance Indicator: the JPI for all journals in a field
- Adjusted Journal Performance Indicator (Expected Citations): the JPI for a specific type of publication
- 5-year Impact: average number of citations to publications over a five-year period
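The JIF defined above is a simple ratio; a sketch with invented numbers (not actual journal data):

```python
def journal_impact_factor(citations_this_year, articles_prev_two_years):
    """Citations received this year to articles the journal published
    in the previous two years, divided by the number of those articles."""
    return citations_this_year / articles_prev_two_years

# Hypothetical journal: 480 citations in 2006 to the 200 articles
# it published in 2004-2005:
print(journal_impact_factor(480, 200))  # → 2.4
```

The other indicators on the slide (JPI and its field and adjusted variants) are analogous averages, computed over citations-to-date rather than a fixed two-year publication window.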

Bibliometrics
- On average, there were 0.64 more citations of TTURC publications than for other publications in the same journal.
- On average, there were 0.6 more citations of TTURC publications than for other publications in the same field.
- Citation of TTURC publications is significantly higher than for journal and field comparison groups.

Bibliometrics
- Only the two complete years were used in this analysis.
- Citations were lower than expected in year 1 and higher in year 2.
- Citation of TTURC research publications is increasing significantly over time relative to expectation.

Financial Analysis
[Chart: Cumulative percent of federal funds spent by grantee]
(data from Financial Status Reports of grantees)

Carryover
[Chart: Percent of subprojects by center and year that reported a carryover]
(data from Budget Justification, Annual Progress Report Form PHS2590)

Reasons for Carryover
[Chart categories: delay of project start; unanticipated obstacles; changes in process – practical; other (specify); not stated]
Causes of Delay or Unanticipated Obstacles
[Chart categories: staffing issue; implementation or logistical issue; research/methods issue; granting agency issue; infrastructure issue; other (specify); not stated]
(data from Budget Justification, Annual Progress Report Form PHS2590)

What Worked
Less promising:
- Researcher Survey – one wave
- Content analysis – costly, time consuming
- Peer evaluation of publications
More promising:
- Researcher Survey scales
- Peer evaluation of progress reports
- Financial analysis
- Bibliometrics

Conclusions
Sustainability challenges:
- Funding challenges
- Researcher motivation
Methodological challenges:
- Peer review
- Bibliometrics
- Integrating results
Organizational challenges:
- Agency resources
- Grantee resources
- External contractors
Utilization challenges:
- Building over multiple time points
- Building over multiple initiatives