UCEDD Response to DDPIE June 1, 2007


UCEDD Response to DDPIE
June 1, 2007, Double Tree Eleven
Purpose: to summarize the small-group feedback on the DDPIE evaluation plan

UCEDD PM Work Group for DDPIE Responses--Purpose
- The primary audience for this evaluation is OMB; secondary audiences may include Congress, the programs, and the public.
- Clarify what is needed for the PART process, and keep this evaluation focused on that requirement.
- The charge is to evaluate DD Act programs; this feedback focuses on the UCEDD network.

UCEDD PM Work Group for DDPIE Responses--Approach
Recommendations:
- Base the evaluation on the DD Act; to the extent possible, be cognizant of the work on reauthorization.
- Align the evaluation with compliance with DD legislation ("Are programs doing what they are funded to do?").
- Evaluate the systems impact of the network ("Is the legislative intent of the DD Act being accomplished?").

UCEDD PM Work Group for DDPIE Responses--Approach
- Clarify what qualifies as an "independent" evaluation for OMB (e.g., recognize that independent evaluation may require primary data collection only for those questions that existing reliable, valid data cannot answer).
- In keeping with OMB policy, minimize the data collection burden on grantees; use the existing data sources in which we have already invested (e.g., MTARS, NIRS, annual progress reports).

UCEDD PM Work Group for DDPIE Responses--Approach
- The present proposal's scope and cost seem disproportionate to the requirement.
- Use a criterion-based, stratified sampling method covering approximately 20-30% of UCEDDs.

UCEDD PM Work Group for DDPIE Responses--Methods
- Add rigorous qualitative evaluation to tell the story (e.g., multiple case studies, document analysis, focus groups, key informant interviews).
- Ensure transparency of the process at all levels, with communication at each step along the way.
- Validation needs to occur earlier in the process.

UCEDD PM Work Group for DDPIE Responses--Methods
- Assessment of collaboration needs to include collaboration beyond other ADD-funded partners (i.e., the leveraging and systems-change requirements of the DD Act).

UCEDD PM Work Group for DDPIE Responses--Measures
- Reduce the number of measures (benchmarks and performance standards); recommend two per core function and one for collaboration, selecting those applicable to most UCEDDs.
- Eliminate measures of structure and process.
- Develop a data definition for each measure and evaluate its applicability to the majority of the network.

UCEDD PM Work Group for DDPIE Responses--Measures
- In developing the benchmarks, use consistent language and a consistent framework (e.g., "the network of UCEDDs…").
- Focus now on the design and benchmarks; wait until there is agreement on the benchmarks before reviewing indicators.

Next Steps
- Share the recommendations with UCEDD Directors and obtain consensus.
- Discuss the recommendations with Westat and ADD.
- Support ongoing work on Benchmarks, Indicators, and Performance Standards.