Developmental Disabilities Program Independent Evaluation (DDPIE) Project UCEDD Meeting – Technical Assistance Institute May 31, 2007 Lynn Elinson, Ph.D. Project Director

Developmental Disabilities Program Independent Evaluation (DDPIE) Project Also known as “ADD Independent Evaluation”

Purpose of PowerPoint To understand the background and progress of the ADD independent evaluation To obtain context for giving feedback on ADD independent evaluation materials

PowerPoint Outline
1. Background of ADD Independent Evaluation
   A. Purpose of the DDPIE Project
   B. Challenges
2. Research design
3. Project implementation
   A. Overview
   B. Project activities
   C. Evaluation tools
   D. Validation
4. Seeking individualized input
5. Progress and timing

1. Background

A. Purpose of the DDPIE Project
Demonstrate impact of DD Network programs on:
– Individuals
– Families
– Service providers
– State systems
Provide feedback to ADD to help improve the effectiveness of its programs and policies
Promote positive achievements of DD Network programs by “storytelling”
Promote accountability to the public

Why the independent evaluation? In 2003, ADD conducted a Program Assessment Rating Tool (PART) self-assessment under OMB guidance. PART is a series of questions designed to provide a consistent approach to rating programs across the Federal Government. PART has four parts: (1) Program Purpose & Design; (2) Strategic Planning; (3) Program Management; and (4) Program Results. Part 4 asks whether an agency has conducted an independent evaluation of sufficient scope and quality to indicate that the program is effective and achieving results. ADD answered “no,” which lowered its overall score.

Challenges Each UCEDD program is unique. The challenge is to develop performance standards that: are relevant to all UCEDD programs; capture the differences among the programs (variability); and will be useful to ADD in demonstrating impact.

2. Research design

Design Considerations PART prefers experimental or quasi-experimental research designs. The structure of the ADD programs does not lend itself to conducting randomized trials or pre- and post-tests.

Research Design: Standards-Based Evaluation
NOT a randomized controlled trial or quasi-experimental design
IS a standards-based evaluation to:
- Set national standards
- Determine levels that characterize the extent to which national standards are being met
- Determine the impact DD Network programs (and collaboration among programs) are having on people with developmental disabilities, family members, State systems, and service providers

Reporting at the national level Data will be collected on individual programs and rolled up to the national level. The independent evaluation will NOT compare programs to one another. The independent evaluation will NOT replace MTARS, which is specific to individual programs.
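To make the roll-up idea concrete, here is a minimal sketch (Python) of aggregating hypothetical program-level ratings into a national summary without comparing programs to one another; the data, field names, and met/not-met scheme are illustrative assumptions, not the actual DDPIE instruments or results.

    # Illustrative only: hypothetical ratings, not actual DDPIE data or instruments.
    from collections import defaultdict

    # One record per program per performance standard (hypothetical).
    ratings = [
        {"program": "UCEDD A", "standard": "Training outcome standard", "met": True},
        {"program": "UCEDD B", "standard": "Training outcome standard", "met": False},
        {"program": "UCEDD A", "standard": "Research output standard", "met": True},
        {"program": "UCEDD B", "standard": "Research output standard", "met": True},
    ]

    def national_rollup(records):
        """For each standard, report the percentage of programs meeting it nationally."""
        met, total = defaultdict(int), defaultdict(int)
        for r in records:
            total[r["standard"]] += 1
            met[r["standard"]] += int(r["met"])
        # National summary only; no program-to-program ranking is produced.
        return {s: round(100 * met[s] / total[s], 1) for s in total}

    print(national_rollup(ratings))
    # -> {'Training outcome standard': 50.0, 'Research output standard': 100.0}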

2 Types of Standards Standards may be evidence-based or consensus-based. Performance standards for DDPIE are consensus-based. Performance standards will be developed for each DD Network program and for collaboration among the three DD Network programs.

Key assumptions for designing performance standards State programs vary in their level of performance across the standards. Consistently high performance across the standards is related to better outcomes. Consistently low performance across the standards is related to poor outcomes.

Research design: seeks input and participation from stakeholders Seeks input from: Project Advisory Panel DD Network Program Working Groups All State programs Validation Panels The public

Role of Advisory Panel To provide balance, impartiality, and expertise To provide advice on: DDPIE process Benchmarks, indicators, performance standards, and performance levels Data collection protocols Pilot study Synthesis of findings and recommendations

Composition of Advisory Panel Self-advocates Family members Representatives from 3 programs – Richard Carroll from Arizona UCEDD Child/disability advocates Evaluation expert Federal representative (for PAIMI evaluation)

Working Groups 4 Working Groups (P&A, UCEDD, DD Council, Collaboration) Process: In-person and telephone meetings Role: -To assist Westat in understanding programs -To provide feedback on benchmarks, indicators, performance standards

UCEDD Working Group members
Carl Calkins – Kansas City, MO
Tawara Goode – Washington, DC
Gloria Krahn* – Portland, OR
David Mank – Bloomington, IN
Fred Orelove* – Richmond, VA
Fred Palmer – Memphis, TN
Lucille Zeph – Orono, ME
*Collaboration Working Group

3. Project implementation

A. Overview

Phases of DDPIE Project DDPIE will be conducted in 2 phases.
- Phase 1 – development and testing of evaluation tools (measurement matrices and data collection protocols)
- Phase 2 – full-scale evaluation
Westat was contracted by ADD to implement Phase 1.
- Project began September 30, 2005
- End of contract – September 29, 2008
Phase 2 will be funded upon completion of Phase 1.

B. Project activities

Steps in Phase 1 Construct evaluation tools (measurement matrices and data collection protocols) that contain performance standards and performance levels Conduct Pilot Study to test evaluation tools (measurement matrices and data collection protocols) Revise evaluation tools

C. Evaluation tools

2 types of evaluation tools Measurement matrices, which include: -Key functions, benchmarks, indicators, performance standards -Performance levels Data collection protocols

Definitions of key terms in measurement matrices Key functions Benchmarks Indicators Performance standards -Outcome performance standards -Program performance standards

Logic model/format for measurement matrices (diagram relating Key Functions, Benchmarks, Indicators, and Performance Standards)

Key Functions Groups of activities carried out by DD Network programs; cover all aspects of program activity. 5 UCEDD key functions. First four key functions identified by the Working Group (core functions in the DD Act). Governance and Management – relevant to the other four key functions. Benchmarks, indicators, and performance standards are being developed for all key functions.

UCEDD Key Functions A. Interdisciplinary pre-service training and continuing education B. Conduct of basic and/or applied research C. Provision of community services D. Dissemination of information E. Governance and management

Benchmarks Broad, general statements Set the bar for meeting expected outcome(s) of each key function About 20 UCEDD benchmarks 3-4 benchmarks for each key function

Indicators Identify what gets measured to determine the extent to which benchmarks and performance standards are being met 4 types of indicators: outcome, output, process, structural Will guide the development of data collection instruments

Performance standards Criterion-referenced (measurable) Consensus-based 2 types: -Outcome performance standards -Program performance standards

Outcome performance standards Linked to expected outcomes of each key function Answer the questions: - Were the expected outcomes met? - To what extent?

Program performance standards What the program should achieve, have, and do to effectively: -meet the principles and goals of the DD Act; and -have an impact on people with developmental disabilities, family members, State systems, service providers

Program performance standards (continued) Linked to the structures, processes, and outputs of the UCEDD program Answer the questions: -What structures should be in place to carry out UCEDD network key functions? What should they be like? -What processes should be used? What should they be like? -What should the UCEDD network produce? What should products be like? To what extent should they be produced (e.g., how often, how many)?
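As an illustration of how the elements defined above nest together, the sketch below (Python) shows what a single measurement-matrix entry might look like; the benchmark, indicator, standard, and performance-level wording are hypothetical placeholders for illustration, not the draft DDPIE content.

    # Illustrative only: wording is hypothetical, not the draft DDPIE matrices.
    matrix_entry = {
        "key_function": "A. Interdisciplinary pre-service training and continuing education",
        "benchmark": "Trainees acquire disability-related competencies",  # broad, general statement
        "indicators": [
            {
                "type": "outcome",  # outcome, output, process, or structural
                "indicator": "Proportion of trainees demonstrating competencies at program exit",
                "outcome_performance_standard": "At least X% of trainees demonstrate the competencies",
                "performance_levels": ["does not meet", "partially meets", "fully meets"],
            },
            {
                "type": "process",
                "indicator": "Use of an interdisciplinary curriculum",
                "program_performance_standard": "An interdisciplinary curriculum is in place and regularly reviewed",
                "performance_levels": ["does not meet", "partially meets", "fully meets"],
            },
        ],
    }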

D. Validation

Overview of validation There is no “gold standard” for an effective UCEDD, so another approach needs to be used to identify performance standards. The ADD independent evaluation uses a consensus approach. This implies participation in the process and validation from a wide variety of stakeholders. There will be several opportunities for validation throughout the development of performance standards. Stakeholders hold a variety of perspectives and, therefore, may not always agree with one another.

Validation approach for DDPIE project Consists of obtaining input, feedback, and consensus Consists of validating measurement matrices (indicators and performance standards) and data collection instruments Is a multi-step process Provides validation opportunities to several types of stakeholders (e.g., consumers, family members, program representatives, advocates, evaluation experts) Provides opportunities for validation at different points in the process

Opportunities for validation Working Group process Advisory Panel meetings State programs (at TA meetings, by telephone, in writing) Validation Panel process OMB process Pre-test and pilot study

Validation Panels There will be 4 Validation Panels (UCEDDs, P&As, DD Councils, Collaboration). Process -Telephone call orientation -“Paper” approach (not face-to-face) – accommodation will be provided -Opportunity for discussion by telephone

Criteria for Validation Panel selection Stakeholder groups (e.g., people with developmental disabilities, family members, advocates, programs, service providers) Researchers

Criteria for Validation Panel selection (continued) Understands consumer needs Understands DD Network programs Diverse composition (gender, race/ethnicity) Mix of junior and senior program staff Urban and rural representation

Focus of Validation Panel process Will achieve consensus Formal process Builds in objective methodology (e.g., criteria for eliminating and accepting indicators and performance standards)

OMB approval process is another form of validation The OMB approval process results from the Paperwork Reduction Act. The Act is administered by the Office of Management and Budget (OMB). The purpose of the Act is to ensure that information collected from the public minimizes burden and maximizes public utility. All Federal agencies must comply.

OMB approval process (continued) When contemplating data collection from the public, Federal agencies must seek approval from OMB. They must submit an OMB package consisting of a description of the study and data collection effort, an estimate of burden, and the data collection instruments. The approval process consists of making the data collection instruments available for public comment in the Federal Register. ADD will be submitting an OMB package; all interested parties will have an opportunity to comment during the public comment period.

Pre-test and Pilot Study – additional form of validation Data collection protocols will be pre-tested in one state. A pilot study will be conducted in up to 4 states. Pilot study states will be chosen randomly. Pilot study will test reliability and validity of measurement matrices and feasibility of data collection.

4. Seeking individualized input

Opportunities for individualized input UCEDD TA meeting (May 31, 2007) -Distribution of draft benchmarks, indicators, and a few examples of performance standards -Small group discussions facilitated by AUCD Telephone meetings scheduled in June and July In writing

Small Group Discussions at UCEDD Technical Assistance Meeting (May 31, 2007) Westat will: -Distribute draft performance standards on UCEDD Network and Collaboration -Review organization of materials -Describe feedback process for individual UCEDD programs -Answer questions on process for feedback UCEDD programs will: -Continue to meet in small groups to discuss the materials (facilitated by AUCD) -Report out in a large group on first impressions

Type of Input Sought Benchmarks and indicators: Are they the concepts that need to be addressed? Benchmarks and performance standards: Do they represent what the programs should be achieving/should have/should do in order to be effective in meeting the principles and goals of the DD Act and have an impact on people with developmental disabilities, families, State systems, and service providers? Indicators: Which seem the most important and feasible to measure? Which could be eliminated? If not these, then what?

5. Progress and Timing

Progress to Date
Meetings with ADD, heads of national associations, TA contractors – November 2006
Site visit to programs in one state – December 2006
Review of background materials (provided by ADD; Working Groups; national websites; other) – October 2005 – February 2007
Meetings with Working Groups – March 2006 – September 2006
Meetings with Advisory Panel – March 2006, October 2006, March 2007
Synthesis of all information by Westat – September 2006 – February 2007
Draft benchmarks, indicators, performance standards – February 2007

Upcoming DDPIE Project Milestones
Feedback from UCEDD Working Group – April – May 2007
UCEDD TA meeting – May 31, 2007
Feedback from all UCEDD programs – June – July 2007
UCEDD Validation Panel – Sept. – Dec. 2007
DD Council Validation Panel – Oct. – Jan. 2008
P&A Validation Panel – Nov. – Feb. 2008
Collaboration Validation Panel – Feb. – April 2008

DDPIE Project Milestones (continued)
Data collection instruments – June 2008
Measurement matrices – July 2008
Final report (with evaluation tools) – Sept. 2008
OMB Comment Period
Pilot Study – New contract