Lynn Elinson, Ph.D., Project Director

Making a Difference: Improving the Quality of Life of Individuals with Developmental Disabilities and Their Families
Presentation transcript:

Developmental Disabilities Program Independent Evaluation (DDPIE) Project
Lynn Elinson, Ph.D., Project Director

Evaluation Standards*
- Utility
- Feasibility
- Propriety
- Accuracy
*Joint Committee on Standards for Educational Evaluation

Purpose of the Independent Evaluation
- Demonstrate impact of DD Network programs on:
  - Individuals
  - Families
  - Service providers
  - State systems
- Provide feedback to ADD to help improve the effectiveness of its programs and policies
- Promote accountability to the public

Purpose for UCEDDs
- Identification of accomplishments by an external organization
- Identification of areas that need improvement

DDPIE Project
- Independent evaluation in 2 phases
  - Phase 1 – development and testing of tools
  - Phase 2 – full-scale evaluation
- Westat – contracted by ADD to implement Phase 1

Evaluation
- Standards: What do we hope to achieve?
- Indicators: What do we observe (measurement of indicators)?
- Comparison: Are there differences/discrepancies? What is the nature and extent of the differences? What action needs to be taken?
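The compare-and-act step above can be sketched as a small function (a hypothetical illustration only; the numeric standard, tolerance, and messages are not from the DDPIE project):

```python
def compare(standard: float, observed: float, tolerance: float = 0.0) -> str:
    """Toy comparison step: the standard is what we hope to achieve,
    the observed value is the measured indicator."""
    gap = standard - observed  # extent of the discrepancy, if any
    if gap <= tolerance:
        return "standard met; no action needed"
    return f"discrepancy of {gap:g}; corrective action needed"

print(compare(standard=10, observed=10))  # standard met; no action needed
print(compare(standard=10, observed=7))   # discrepancy of 3; corrective action needed
```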

Open Systems Model
Structure (Input) → Process → Output (Product) → Outcome → Effectiveness/Efficiency

Inputs are the resources needed to set processes in motion and keep them running, such as staff, policies, resource networks, facilities, and funding. Inputs must be in place before proposed processes can function properly.

Processes are the event sequences and arrangements of staff, services, and resources needed to achieve the intended result(s). When inputs are in place and processes are functioning as intended, outputs and outcomes are produced.

Outputs, often referred to as products, are the "units" produced by processes supported by given inputs. An example is the number of staff trained to use particular strategies targeted at meeting the employability needs of people with disabilities.

Outcomes are the intended results of creating certain outputs/products. While the product is the trained staff, the relevant outcome is the increased capacity of the staff to serve people with disabilities who seek their services. The ultimate outcome is improvement in the employment outcomes of people with disabilities.

Effectiveness is defined as the relationship between the outcomes achieved and the processes used to affect those outcomes. One can ask, "Do the unique capacity-building strategies used by a grantee (or the grantees in general) in fact produce the desired outcomes?"

Efficiency is evaluated through a comparison of inputs and outputs. Given that outcomes are satisfactory, or remain constant, relative efficiency can be assessed among different approaches to building system capacity to serve people with disabilities. In this model, efficiency is only relevant if positive outcomes are realized.
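The input → process → output → outcome chain can be sketched as a small data model. This is a minimal, hypothetical illustration of the relationships described above, not the project's actual instrumentation; the class and field names are assumptions, and the ratio definitions are deliberately simplistic:

```python
from dataclasses import dataclass

@dataclass
class ProgramModel:
    """Open systems model: inputs feed processes, which produce outputs and outcomes."""
    inputs: list      # resources: staff, policies, facilities, funding
    processes: list   # event sequences using those resources
    outputs: list     # countable products, e.g. number of staff trained
    outcomes: list    # intended results, e.g. increased staff capacity

def effectiveness(model: ProgramModel) -> float:
    """Toy definition: outcomes achieved relative to processes used."""
    return len(model.outcomes) / max(len(model.processes), 1)

def efficiency(model: ProgramModel) -> float:
    """Toy definition: outputs produced relative to inputs consumed."""
    return len(model.outputs) / max(len(model.inputs), 1)

model = ProgramModel(
    inputs=["staff", "funding"],
    processes=["staff training"],
    outputs=["10 staff trained"],
    outcomes=["increased staff capacity"],
)
print(effectiveness(model))  # 1.0
print(efficiency(model))     # 0.5
```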

Basic Evaluation Approach
- Performance-based approach
- Development of standards and indicators
- Measurement of indicators to determine the level at which standards are being met
- Development of measurement matrices that contain standards, indicators, and performance levels (not developed; limited development; adequate development)
- Determination of overall performance at the national level

Evaluation Tools
- Measurement matrices
  - standards
  - indicators (structures, processes, outputs, outcomes)
  - performance levels
- Data collection instruments

INDEPENDENT EVALUATION
- DD Act: administered by ADD
- Programs: DD Councils, P&As, UCEDDs, Collaboration
- Key functions
  - DD Councils: outreach; informing policymakers
  - UCEDDs: training; community service; research; dissemination
  - P&As: individual advocacy; outreach/public education
  - Collaboration: project development; project implementation
- Framework of indicators: structure, process, output, outcome
- Standards: will be used as benchmarks to evaluate the indicators, as measured
- Performance levels: the level at which a standard is met; three performance levels for each standard
- Measurement matrices: will be developed to organize each program's key functions, framework of indicators, standards and corresponding indicators, and performance levels for each standard
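One row of a measurement matrix pairs a key function and indicator type with a standard and one of the three performance levels. The sketch below is a hypothetical data-structure illustration (the example standards and the mean-based roll-up are assumptions, not the project's actual scoring method):

```python
from dataclasses import dataclass
from enum import Enum

class PerformanceLevel(Enum):
    """The three performance levels used in the measurement matrices."""
    NOT_DEVELOPED = 0
    LIMITED_DEVELOPMENT = 1
    ADEQUATE_DEVELOPMENT = 2

@dataclass
class MatrixEntry:
    """One row of a measurement matrix: a standard with its indicator and rating."""
    key_function: str        # e.g. "Training" for a UCEDD
    indicator_type: str      # structure, process, output, or outcome
    standard: str            # benchmark the indicator is evaluated against
    level: PerformanceLevel  # level at which the standard is met

def overall_performance(entries: list) -> float:
    """Toy national-level roll-up: mean of the numeric performance levels."""
    return sum(e.level.value for e in entries) / len(entries)

entries = [
    MatrixEntry("Training", "process", "Curriculum in place",
                PerformanceLevel.LIMITED_DEVELOPMENT),
    MatrixEntry("Training", "outcome", "Trainees report increased capacity",
                PerformanceLevel.ADEQUATE_DEVELOPMENT),
]
print(overall_performance(entries))  # 1.5
```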

Project Tasks and Timing
October 2005 – December 2006
- Collect and review background information
- Establish Advisory Panel, Working Groups, and Validation Panels
- Develop draft performance standards, indicators, and data collection tools
- Work with Validation Panels to finalize matrices for pilot study
- Train pilot study staff
January 2007 – September 2007
- Conduct pilot study in up to 5 states
- Write report to ADD with recommendations

Advisory Panel
- Self-advocates
- Family members
- Representatives from 3 programs (UCEDD: Richard Carroll from Arizona)
- Child/disability advocates
- Evaluation expert
- Federal representative (for PAIMI evaluation)

Working Groups
- 4 Working Groups (UCEDD, P&A, DD Council, Collaboration)
- Role: to assist Westat in developing draft measurement matrices that will be reviewed and endorsed by Validation Panels
- Process: in-person and telephone meetings; work offline

Criteria for Selection of UCEDD Working Group Members
- Director/Associate Director
- At least one state/jurisdiction has more than one UCEDD program
- Rural/urban
- Geographic distribution
- Reflects different types of UCEDDs (e.g., placement in medical school, school of education, standalone)

Working Group Activities
- Description of program
- Discussion and identification of key functions
- Identification of structures, processes, outputs, and outcomes for each key function
- Discussion of standards for structures, processes, outputs, and outcomes

What can UCEDDs do?
- Stay informed
- Open process – input welcome
- Participation in pilot study – random selection

UCEDD Working Group Members
- Carl Calkins, Ph.D., Kansas City, MO – calkinsc@umkc.edu
- Tawara Goode, M.A., Washington, DC – tdg2@georgetown.edu
- Gloria Krahn, Ph.D., Portland, OR – krahng@ohsu.edu
- David Mank, Ph.D., Bloomington, IN – dmank@indiana.edu
- Fred Orelove, Ph.D., Richmond, VA – forelove@vcu.edu
- Fred Palmer, M.D., Memphis, TN – fpalmer@utmem.edu
- Lucille Zeph, Ed.D., Orono, ME – Lu.Zeph@umit.maine.edu

Westat Contact Information
- Lynn Elinson – LynnElinson@westat.com – 412 421-8610
- Cynthia Thomas – CynthiaThomas@westat.com – 301 251-4364
- Priyanthi Silva – PriyanthiSilva@westat.com – 301 610-5162
- Bill Frey – FreyW1@westat.com – 301 610-5198