August 20, 2008 Clinical and Translational Science Awards (CTSA) CTSA Evaluation Approach Institute for Clinical & Translational Research (ICTR) https://ictr.wisc.edu.


August 20, 2008. CTSA Evaluation Approach. Institute for Clinical & Translational Research (ICTR), University of Wisconsin-Madison (UW-Madison) and Marshfield Clinic Research Foundation (MCRF).

Slide 1: Institute's Resources and Organization for Evaluation
- Evaluation Team is organizationally located in the ICTR Administrative Core and the ICTR Client Services Center (ICSC)
- D. Paul Moberg, PhD, Assistant Director, Tracking & Evaluation, ICTR/Madison (18% time)
- Jan Hogle, PhD, Evaluation Researcher, ICTR/Madison (100% time)
- Jennifer Bufford, Evaluation Coordinator, ICTR/Marshfield (30% time)
- To be hired: Evaluation Research Specialist, ICTR/Madison (100% time)
- This staffing is a reduction from the proposed 3.35 FTE, reflecting NIH budget constraints

Slide 2: Overview of UW-ICTR's Evaluation Goals
- Collaborate with national and local stakeholders to:
  - conduct self-evaluation of ICTR
  - track trainees and activities
- Incorporate an approach that is:
  - utilization-focused (intended uses by intended users), built on a logic model
  - participatory (ICTR stakeholders; Evaluation Working Group)
  - methodologically flexible (quantitative and qualitative methods; no experimental design)
- Apply the evaluation process and findings to:
  - priority setting
  - program accountability
  - continuous quality improvement efforts

Slide 3: Objectives of ICTR Evaluation
To achieve these goals, ICTR Evaluation:
- Develops and implements ICTR's cross-component evaluation plan and provides support for managing and analyzing central ICTR databases
- Provides evaluation consultation services to ICTR's 25+ components, as well as to collaborating institutions, as time and funding allow
- Interfaces with national CTSA evaluation activities and participates in CTSA Consortium-sponsored collaboration

Slide 4: Our approach to CTSA evaluation follows CDC's Framework for Program Evaluation in Public Health (MMWR 1999;48[No. RR-11]).

Slide 5: ICTR Evaluation Office Year 1 Activities: the proposal said… and we accomplished (1)
- Partially staffed the Evaluation Office (current budget issues)
- Obtained stakeholder input and developed consensus on roles and responsibilities (Evaluation Working Group)
- Developed common understandings of each component's goals & objectives via meetings on Component Tracking Tables
- Developed definitions for evaluation-related terms and other concepts for ICTR-wide use

Slide 6: ICTR Evaluation Office Year 1 Activities: the proposal said… and we accomplished (2)
- Developed central ICTR databases & tracking systems collaboratively with IT resources & ICTR components (Member DB; Request for Consult DBs; DBs for Investigators, Publications, Grants; APR data tracking system; Resource Tracking Systems in individual components)
- Interpreted APR requirements; put data collection mechanisms and trouble-shooting systems in place collaboratively with ICTR Administration
- Began to refine, prioritize, and develop cross-component evaluation plans ("what would a successful institute look like?")
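The tracking databases above are described only at a high level, so as a minimal sketch of what a cross-component tracking system might look like, the following uses a hypothetical "Request for Consult" table (the schema, component names, and dates are illustrative assumptions, not the actual ICTR design):

```python
# Hypothetical sketch of a consult-request tracking table and a simple
# cross-component report; not the actual ICTR database schema.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE consult_requests (
        id INTEGER PRIMARY KEY,
        member_id INTEGER NOT NULL,
        component TEXT NOT NULL,      -- e.g. 'Biostatistics', 'ICSC'
        requested_on TEXT NOT NULL,   -- ISO date
        completed_on TEXT             -- NULL while the consult is open
    )
""")
rows = [
    (1, 101, "Biostatistics", "2008-04-01", "2008-04-15"),
    (2, 102, "ICSC", "2008-05-10", None),
    (3, 103, "Biostatistics", "2008-06-02", "2008-06-20"),
]
conn.executemany("INSERT INTO consult_requests VALUES (?, ?, ?, ?, ?)", rows)

# Open vs. completed consults per component, the kind of figure an
# Annual Progress Report (APR) might roll up.
report = conn.execute("""
    SELECT component,
           SUM(completed_on IS NULL)     AS open_consults,
           SUM(completed_on IS NOT NULL) AS completed_consults
    FROM consult_requests
    GROUP BY component
    ORDER BY component
""").fetchall()
print(report)  # [('Biostatistics', 0, 2), ('ICSC', 1, 0)]
```

A shared, queryable structure like this is what makes cross-component reporting feasible; each component feeds one table rather than maintaining its own spreadsheet.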

Slide 7: ICTR Evaluation Office Year 1 Activities: additional accomplishments (3)
- Assisted with the creation and evaluation plan design of the ICTR Client Services Center (ICSC)
- Collaboratively developed guidelines for the Case Studies Collection (qualitative descriptive summaries) to tell the story of translational research at UW-Madison and Marshfield (MCRF)
- Co-led development of the Resource Tracking System (RTS) with the Biostatistics & Bioinformatics Core (BBI) and other ICTR components

Slide 8: ICTR Evaluation Office Year 1 Activities: additional accomplishments (4)
- Participated in National CTSA Consortium calls, the Evaluation Steering Committee meeting, the Wiki, and Working Groups
- Collaborated with ICTR Administration on refinements to the Member Database
- Began planning for the Annual Member Survey and Key Informant Interviews (analysis in progress)
- Collaborating with Marshfield on tracking & evaluation coordination

Slide 9: Summary of Evaluation Metrics (1)
Long term:
- Improvement in key health indicators [SHOW: Survey of the Health of Wisconsin]
Medium term:
- "Silo removal," so that a multidisciplinary & translational approach becomes the norm for health sciences research
- A cadre of researchers that more closely reflects the gender, racial, and ethnic diversity of the US population
Short term:
- Reduction in time from IRB submission to approval
- Reduction in the number of IRB deferrals and modifications
- Reduction in the number of protocols withdrawn by the IRB for quality issues
- Increase in satisfaction of users and of IRB staff and committee members
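The short-term IRB metrics above reduce to simple arithmetic over protocol records. As an illustrative sketch (the records and field layout are hypothetical, not drawn from an actual IRB database), median submission-to-approval time and a deferral rate could be computed like this:

```python
# Hypothetical protocol records: (submitted, approved, number_of_deferrals).
# Illustrative only; real IRB data would come from the tracking systems.
from datetime import date
from statistics import median

protocols = [
    (date(2008, 1, 7), date(2008, 2, 18), 1),
    (date(2008, 2, 4), date(2008, 2, 25), 0),
    (date(2008, 3, 3), date(2008, 4, 28), 2),
]

# Days from IRB submission to approval, per protocol.
days_to_approval = [(approved - submitted).days
                    for submitted, approved, _ in protocols]
median_days = median(days_to_approval)

# Share of protocols deferred at least once.
deferral_rate = sum(1 for *_, d in protocols if d > 0) / len(protocols)

print(median_days)  # 42
```

Tracked year over year, a falling median and deferral rate would be the evidence behind the "reduction" claims on this slide.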

Slide 10: Summary of Evaluation Metrics (2)
Short term (cont'd):
- # and types of Members in the Web Portal Member Database (800+ members)
- # and descriptors of investigators/mentors/scholars reported via the APR whose research has benefited significantly from CTSA resources (n = 300+)
- # of publications annually based on research that benefits from CTSA/ICTR resources
- # and $ of grants annually representing research that benefits significantly from ICTR resources
- # and $ of pilot grants awarded annually (two rounds awarded in April & June 2008)
- % of grants obtained, based on research that benefits, that are Type 2 translational
- Feedback from ICTR members on services provided, via the Annual Member Survey
- Qualitative assessments: Key Informant Interviews, Case Studies Collection, ICTR Client Services Center
- Database analysis: Members, Request for Consults, Resource Tracking Systems
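The grant metrics above (# and $, and the % that is Type 2 translational) are straightforward aggregations once grants are tagged in a database. A minimal sketch, using hypothetical grant records (the amounts and the `type2` flag are invented for illustration):

```python
# Hypothetical grant records; in practice these would come from the
# central ICTR Grants database described earlier.
grants = [
    {"year": 2008, "amount": 250_000, "type2": True},
    {"year": 2008, "amount": 400_000, "type2": False},
    {"year": 2008, "amount": 150_000, "type2": True},
]

year = 2008
in_year = [g for g in grants if g["year"] == year]

n_grants = len(in_year)                           # "# of grants annually"
total_dollars = sum(g["amount"] for g in in_year)  # "$ of grants annually"
pct_type2 = 100 * sum(g["type2"] for g in in_year) / n_grants

print(n_grants, total_dollars, round(pct_type2, 1))  # 3 800000 66.7
```

The hard part, as the challenges slide notes, is not the arithmetic but deciding which grants count as having "benefited significantly" from ICTR resources.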

Slide 11: ICTR Evaluation: Year 2 Proposed Work Plan (1)
- Evaluation Working Group: developing cross-component metrics
- Operationalize measures & develop strategies for evaluating ICTR goals and specific aims
- Implement the Annual Member Survey, preceded by key informant interviews
- Begin to assemble the Case Studies Collection
- Collect and report on user feedback from the ICTR "front door" and Web Portal
- Continue to refine the Resource Tracking System(s)

Slide 12: ICTR Evaluation: Year 2 Proposed Work Plan (2)
- Assist with analysis of ICTR databases (Member, Consult, Grants, Publications)
- Continue to assist components with internal evaluation tasks
- Participate in evaluation of the ICTR Client Services Center (ICSC)
- Participate in CTSA Consortium Working Groups & Steering Committee
- Continue to support Annual Progress Reporting with the Wiki-based data collection system

Slide 13: Institutional Evaluation Challenges and Questions
- Operationalizing & prioritizing measures & indicators
- Evaluation Office staffing and funding for evaluative studies; need to prioritize
- Size and complexity of ICTR
- Lack of consensus on database development (purpose, process, organization, and use): database development forces structural development, and the 25+ components have multiple and varied needs
- Defining and tracking how "research" has "benefited significantly" from CTSA "resources" for the APR
- Adapting evaluation plans to fit emerging realities