Overview of OPASI’s Division of Evaluation and Systemic Assessments (DESA)
Deborah Guadalupe Duran, Ph.D., Chief, Systemic Assessments Branch

OPASI Overview
OPASI comprises three divisions, which together address gaps and potentials, data and evidence, and strengths and improvements:
- DESA, Division of Evaluation and Systemic Assessments: evaluation, feedback, system assessments (science of science), evaluability, organization performance
- DRDA, Division of Resource Development and Analysis: knowledge management, portfolio analysis, public health need, centralized tools
- DSC, Division of Strategic Coordination: strategic coordination of high-risk, high-impact and trans-NIH initiatives

DESA Structure
DESA has two branches:
- Systemic Assessments Branch (SAB)
- Evaluation Branch (EB)

DESA Mission
- Coordinate NIH-wide evaluations and assessments
- Provide technical assistance for required performance reporting
- Facilitate the development of adaptive measures for science
- Provide findings, integrated with portfolio analyses, that foster scientific planning

DESA – EB Mission
Evaluation Branch
- Evaluate NIH-wide programs, key initiatives, and strategic objectives (formative and summative)
- Track outcomes and recommendations
- Maximize the effective use of the NIH Evaluation Set-Aside Program

DESA – SAB Mission
Systemic Assessments Branch
- Coordinate program performance activities across NIH
- Comply with and respond to federally mandated performance reporting mechanisms (GPRA, PART, PAR, Scorecard)
- Foster the development of measures for adaptive science assessments and organization assessments
- Facilitate science of science activities

DESA Activities
Coordinating, planning, managing, tracking, and reporting across DESA's activity areas (system performance and assessments, independent external evaluations, related processes, resources, strategic planning and evaluability):
- System level (HHS/IC/Office/Program): strategic plans, science of science management, pre-evaluation assessments, visual analytic tools, e-monitoring systems
- Project level: strategic plans, independent evaluations, Evaluation Set-Aside funds
- Reporting: GPRA, PART, PAR, audits, Biennial Report, Scorecard

DESA – EB Role
Evaluation Branch
- Manage the Evaluation Set-Aside Program, which provides dedicated funding for ICs and OD offices
- Provide hands-on technical support and online guidance
- Consult with ICs and OD offices on evaluation planning
- Coordinate two trans-NIH committees to evaluate proposals
- Report evaluation results

DESA – SAB Role
Systemic Assessments Branch
- Monitor NIH performance (not evaluation)
- Report program performance information in lay language
- Ensure budget/performance integration
- Participate in OMB, Secretary, HHS, and federal workgroups
- Serve as liaison to OMB, HHS, and ICs on performance issues
- Develop system assessments
- Conduct internal assessments (evaluability)
- Coordinate science of science activities
- Strategic planning

DESA Functions
SAB and EB have related but distinct functions (e.g., both support the IC Planning & Evaluation Community).
Systemic Assessments Branch:
- Evaluability assessments (internal)
- Science of science (internal)
- Monitoring and tracking performance
- Organization performance reporting
Evaluation Branch:
- Funding for independent evaluations (external)

Evaluations/Assessments
DESA (SAB and EB) offers comprehensive evaluations and assessments, both internal and external.
SAB – internal and organization assessments:
- Evaluability assessments
- Monitoring and tracking performance
- System assessments (science of science)
- Adaptive evaluations for science
EB – external:
- Independent evaluations of projects, programs, and key initiatives

Standard Methodologies
Evaluation is external: best conducted by a source outside of the project/program.
Types:
- Formative and summative evaluations (mid-term and end-of-project reviews)
- Needs assessments
- Process evaluations

Standard Evaluation Processes
- Program/project as the unit of analysis
- Upfront utilization focus / cause and effect
- Linear logic models
- Focus on achieving pre-planned goals
- Findings assessed in relation to plans
- Time specific
- Methodologies: static designs, a preference for quantitative data, and randomized controlled trials as the gold standard

Conditions that Challenge Evaluation
- High innovation
- Developmental
- High uncertainty
- Dynamic (complex)
- Emergent
These are the conditions of SCIENCE and SCIENCE SYSTEMS.

System Assessments
Assessment of science programs:
- A new field
- Developing theories and tools
- Developing metrics and analytic methodologies
- NIH among the leaders (national and international)
[Pictorial representation of program and management organization: project, technologies, communication, administration, program, human capital, management]

System Premises
Systems neutrality: a system is functioning as observed for some reasons, fulfilling some functions. In whose interests is the system functioning? Who benefits? What does the homeostasis offer?

Healthy System
In a well-functioning system:
- No sub-system operates at maximum
- The system is dynamic, and systems change
- Relationships are interdependent and non-linear, rather than simple and linear (cause and effect)
- Transitions get worse before they get better
- It is complex, like science
- It is highly emergent (difficult to plan and predict)

DESA Synergy
SAB and EB work together to coordinate assessments of program/project performance and of overall NIH performance:
- Internal preparation for external evaluations
- Independent external evaluations
- Organizational and system assessments

DESA Implementation
DESA's work is implemented through a number of steps and tasks:
- Providing technical assistance
- Teaching the Strategic Planning course
- Developing and implementing the evaluability survey
- Providing results feedback
- Preparing for GPRA and PART
- Creating visual analytic tools
- Offering resources
- Guiding the application process
- Reviewing proposals
- Monitoring awards
- Promoting innovative methodologies
- Monitoring and reporting: PART, GPRA, audits, PAR/MD&A

DESA Outcomes
SAB and EB support ICs to:
- Be prepared for evaluations, performance reporting, and strategic planning
- Utilize analysis tools for decision makers
- Conduct needs assessments, feasibility studies, process evaluations, outcome evaluations, and evidence-related activities
- Score well in PART, GPRA, audits, and PAR/MD&A

DESA Summation
DESA (SAB and EB):
- Demonstrates appropriate federal stewardship
- Provides a systematic process for conducting evaluations and assessments that produce evidence-based results, which can be used to:
  - improve or adapt programs and/or the management of science
  - comply with performance assessment requirements
  - plan