August 20, 2008
Clinical and Translational Science Awards (CTSA): CTSA Evaluation Approach
Institute for Clinical & Translational Research (ICTR): https://ictr.wisc.edu
University of Wisconsin, Madison (UW-Madison): www.wisc.edu
Marshfield Clinic Research Foundation (MCRF): http://marshfieldclinic.org/research/pages/index.aspx
This document is confidential and is intended solely for the use and information of the client to whom it is addressed.
1 Institute's Resources and Organization for Evaluation
Evaluation Team organizationally located in the ICTR Administrative Core and ICTR Client Services Center (ICSC):
–D. Paul Moberg, PhD, Assistant Director, Tracking & Evaluation, ICTR/Madison (18% time)
–Jan Hogle, PhD, Evaluation Researcher, ICTR/Madison (100% time)
–Jennifer Bufford, Evaluation Coordinator, ICTR/Marshfield (30% time)
–To be hired: Evaluation Research Specialist, ICTR/Madison (100% time)
This staffing is a reduction from the proposed 3.35 FTE, reflecting NIH budget constraints.
2 Overview of UW-ICTR's Evaluation Goals
Collaborate with national and local stakeholders to:
–conduct self-evaluation of ICTR
–track trainees and activities
Incorporate an approach that is:
–utilization-focused (intended uses by intended users)
–logic-model based: http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html
–participatory (ICTR stakeholders; Evaluation Working Group)
–methodologically flexible (quantitative/qualitative; no experimental design)
Apply the evaluation process and findings to:
–priority-setting
–program accountability
–continuous quality improvement efforts
3 Objectives of ICTR Evaluation
In order to achieve these goals, ICTR Evaluation:
–Develops and implements ICTR's cross-component evaluation plan and provides support for managing and analyzing central ICTR databases.
–Provides evaluation consultation services to ICTR's 25+ components, as well as to collaborating institutions, as time and funding allow.
–Interfaces with national CTSA evaluation activities and participates in CTSA Consortium-sponsored collaboration.
4 Approach to CTSA Evaluation matches CDC's Framework for Program Evaluation in Public Health (MMWR 1999;48, No. RR-11): http://www.cdc.gov/eval/framework.htm
5 ICTR Evaluation Office Year 1 Activities: the proposal said… and we accomplished (1):
–Partially staffed Evaluation Office (current budget issues)
–Obtained stakeholder input and developed consensus on roles and responsibilities (Evaluation Working Group)
–Developed common understandings of each component's goals & objectives via meetings on Component Tracking Tables
–Developed definitions for evaluation-related terms and other concepts for ICTR-wide use
6 ICTR Evaluation Office Year 1 Activities: the proposal said… and we accomplished (2):
–Developed central ICTR databases & tracking systems collaboratively with IT resources & ICTR components (Member DB; Request for Consult DBs; DBs for Investigators, Publications, Grants; APR data tracking system; Resource Tracking Systems in individual components); a schematic sketch follows below
–Interpreted APR requirements; put data collection mechanisms and trouble-shooting systems in place collaboratively with ICTR Administration
–Began to refine/prioritize/develop cross-component evaluation plans: "what would a successful institute look like?"
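To make the tracking-system work above concrete, here is a minimal sketch of how two of the central databases (the Member DB and a Request for Consult DB) might be structured. The schema, table names, and columns are illustrative assumptions, not the actual ICTR design.

```python
import sqlite3

# Minimal illustrative schema for two of the central tracking databases
# named above (Member DB and Request for Consult DB). Table and column
# names are hypothetical, not the actual ICTR design.
conn = sqlite3.connect("ictr_tracking.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS member (
    member_id   INTEGER PRIMARY KEY,
    name        TEXT NOT NULL,
    department  TEXT,
    role        TEXT,   -- e.g., investigator, mentor, scholar
    joined_date TEXT    -- ISO date string
);
CREATE TABLE IF NOT EXISTS consult_request (
    request_id  INTEGER PRIMARY KEY,
    member_id   INTEGER REFERENCES member(member_id),
    component   TEXT,   -- which ICTR component handled the consult
    requested   TEXT,   -- ISO date the request was submitted
    closed      TEXT    -- ISO date resolved; NULL while still open
);
""")
conn.commit()
```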
7 ICTR Evaluation Office Year 1 Activities: additional accomplishments (3):
–Assisted with creation and evaluation plan design for the ICTR Client Services Center (ICSC)
–Collaboratively developed guidelines for the Case Studies Collection (qualitative descriptive summaries) to tell the story of translational research at UW-Madison and Marshfield (MCRF)
–Co-led development of the Resource Tracking System (RTS) with the Biostatistics & Bioinformatics Core (BBI) and other ICTR components
8 ICTR Evaluation Office Year 1 Activities: additional accomplishments (4):
–Participated in National CTSA Consortium calls, the Evaluation Steering Committee meeting, the Wiki, and Working Groups
–Collaborated with ICTR Administration on refinements to the Member Database
–Began planning for the Annual Member Survey and Key Informant Interviews (analysis in progress)
–Collaborating with Marshfield on tracking & evaluation coordination
9 Summary of Evaluation Metrics (1)
Long term:
–Improvement in key health indicators [SHOW: Survey of the Health of Wisconsin]
Medium term:
–"Silo removal," so that a multidisciplinary & translational approach becomes the norm for health sciences research
–Cadre of researchers that reflects more closely the gender, racial & ethnic diversity of the US population
Short term (see the sketch below for one such computation):
–Reduction in time from IRB submission to approval
–Reduction in number of IRB deferrals and modifications
–Reduction in number of protocols withdrawn by the IRB for quality issues
–Increase in satisfaction of users and of IRB staff and committee members
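As an illustration of how the first short-term metric could be tracked, the sketch below computes the median number of days from IRB submission to approval over a small set of hypothetical protocol records; the record layout and dates are assumptions, not real IRB data.

```python
from datetime import date
from statistics import median

# Hypothetical (submission, approval) date pairs per protocol; real
# values would come from the IRB tracking system.
protocols = [
    (date(2008, 1, 7),  date(2008, 2, 25)),
    (date(2008, 1, 21), date(2008, 3, 3)),
    (date(2008, 2, 4),  date(2008, 3, 10)),
]

# Median days from IRB submission to approval; comparing this value
# across reporting periods shows whether turnaround is shrinking.
turnaround = [(approved - submitted).days for submitted, approved in protocols]
print(f"Median IRB turnaround: {median(turnaround)} days")
```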
10 Summary of Evaluation Metrics (2)
Short term (cont'd):
–# and types of Members in the Web Portal Member Database (800+ members)
–# and descriptors of investigators/mentors/scholars reported via the APR whose research has benefited significantly from CTSA resources (n=300+)
–# publications based on research that benefits from CTSA/ICTR resources, annually
–# and $ of grants representing research that benefits significantly from ICTR resources, annually
–# and $ of pilot grants awarded annually (2 rounds awarded in April & June 2008)
–% of grants obtained, based on research that benefits, which are Type 2 translational (see the sketch below)
–Feedback from ICTR members on services provided, via the Annual Member Survey
–Qualitative assessments: Key Informant Interviews, Case Studies Collection, ICTR Client Services Center
–Database analysis: Members, Request for Consults, Resource Tracking Systems
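One of the quantitative metrics above, the share of benefiting grants that are Type 2 translational, could be computed along these lines. The grant records and the translational_type field are hypothetical stand-ins for however the ICTR grants database actually classifies Type 1 vs. Type 2 translation.

```python
# Hypothetical grant records; 'translational_type' stands in for however
# the ICTR databases actually classify Type 1 vs. Type 2 translation.
grants = [
    {"id": "G1", "amount": 250_000, "translational_type": 2},
    {"id": "G2", "amount": 400_000, "translational_type": 1},
    {"id": "G3", "amount": 150_000, "translational_type": 2},
]

# Count and dollar totals, plus the share that is Type 2 translational.
type2 = [g for g in grants if g["translational_type"] == 2]
pct_type2 = 100 * len(type2) / len(grants)
total = sum(g["amount"] for g in grants)
print(f"{len(grants)} grants totaling ${total:,}; {pct_type2:.0f}% Type 2 translational")
```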
11 ICTR Evaluation: Year 2 Proposed Work Plan (1)
–Evaluation Working Group: developing cross-component metrics
–Operationalize measures & develop strategies for evaluating ICTR goals and specific aims
–Implement Annual Member Survey, preceded by key informant interviews
–Begin to assemble Case Studies Collection
–Collect and report on user feedback from ICTR "front door" and Web Portal
–Continue to refine Resource Tracking System(s)
12 ICTR Evaluation: Year 2 Proposed Work Plan (2)
–Assist with analysis of ICTR databases (Member, Consult, Grants, Publications)
–Continue to assist components with internal evaluation tasks
–Participate in evaluation of ICTR Client Services Center (ICSC)
–Participate in CTSA Consortium Working Groups & Steering Committee
–Continue to support Annual Progress Reporting with the Wiki-based data collection system
13 Institution Evaluation Challenges and/or Questions
–Operationalizing & prioritizing measures & indicators
–Evaluation Office staffing and funding for evaluative studies: prioritization needed
–Size and complexity of ICTR
–Lack of consensus on database development (purpose, process, organization, and use): database development forces structural development, and the 25+ components have multiple and varied needs
–Defining and tracking how "research" has "benefited significantly" from CTSA "resources" for the APR
–Adapting evaluation plans to fit emerging realities