Performance-based Contracting and Maine's State Personnel Development Grant (SPDG)
Dawn Kliphan, March 28, 2010

Contracts: Tools for Project Management
- Planning and Organization: blueprint for the project
- Communication: clear roles and expectations
- Fidelity and Accountability: measurable outcomes and outputs
- Performance-based criteria and evaluation: can improve chances for successful outcomes
- Formative and Flexible: short-term and intermediate results drive strategies

Basic Elements
- Goals
- Objectives for each goal
- Strategies for each objective
- Indicators/Deliverables for objectives and strategies
- Resources/Inputs for the project

Rider A Structure
Goal 1:
  Objective 1.1:
    Strategies:
    Indicators 1.1.a (Process):
    Indicators 1.1.b (Accountability):
Budget line items for each objective: Salaries & Benefits, Subcontracts, Travel, Stipends, Supplies, Other, Indirect Costs, Total ($ for each category).
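The Rider A outline above can be sketched as a nested data structure. This is a hypothetical illustration only, not contract language; all class, field, and category names are assumptions chosen to mirror the outline:

```python
from dataclasses import dataclass, field

# Illustrative sketch of the Rider A hierarchy:
# Goal -> Objective -> Strategies + Indicators, with a
# line-item budget attached to each objective.

@dataclass
class Objective:
    number: str                    # e.g. "1.1"
    description: str
    strategies: list = field(default_factory=list)
    process_indicators: list = field(default_factory=list)        # 1.1.a
    accountability_indicators: list = field(default_factory=list)  # 1.1.b
    budget: dict = field(default_factory=dict)  # category -> dollars

@dataclass
class Goal:
    number: str
    description: str
    objectives: list = field(default_factory=list)

BUDGET_CATEGORIES = ["Salaries & Benefits", "Subcontracts", "Travel",
                     "Stipends", "Supplies", "Other", "Indirect Costs"]

def objective_total(obj: Objective) -> float:
    """Total dollars budgeted for one objective across all categories."""
    return sum(obj.budget.get(c, 0) for c in BUDGET_CATEGORIES)
```

Modeling the budget per objective, rather than per contract, mirrors the budget-by-objective approach described later in the Resources slide.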

Goals
- What you want to improve or change
- A broad-based statement
- Related to the agency's mission
- Directly tied to the Design and Management Plan in the SPDG application
- Not limited by time
A goal states what the project will achieve, not how you will achieve it.

Example
Goal 1: In accordance with Goal 1 of the SPDG management plan, services under this Agreement will expand and improve Maine's system of recruitment, preparation, and certification of ASL interpreters.

Objectives
- Specific actions toward achieving the stated goal
- Measurable (increase, decrease, or achieve)
- Constrained by time
- Challenging but achievable
Start by thinking about the outcomes needed to progress toward the goal. What are the benchmarks from which to measure outcomes?

Example
Objective 1.1: By June 15, 2011, the Provider will take the interim steps needed to establish an on-line minor program in educational interpreting with an intensive on-site summer component, and a post-baccalaureate Certificate in Interpreter Training Program (ITP) to be issued by USM by August 31, 2011.

Strategies
- Specific activities: how to achieve the objective
- Identify who will implement the strategies
- Identify relevant inputs
- Flexible: results should be used to drive necessary changes and strategy adjustments in order to improve outcomes
Think about the outputs or evidence that would indicate progress toward the objective.

Examples
Strategy: Using the USM protocol, develop the document to get permission to plan.
Strategy: Using USM's approval process, obtain permission to proceed with developing the program.
Strategy: The USM Linguistics Department, in collaboration with the CEHD, will chart the courses for the program and identify those that are available, those that need to be developed, and those that need to be modified for multi-modal distance delivery.

Key Words

Deliverables
Accountability Indicators (generally outcomes): should specify due dates for the measurements and data showing outcomes toward the objective.
Process Indicators (generally outputs): measurements and data collected should be limited to meaningful data indicating progress toward the anticipated outcome.

Deliverables
- Data collected should be limited to those necessary for measuring, evaluating, and reporting outcomes and performance
- Indicators should be aligned with those in the SPDG application and should be useful for federal reporting
- Depending on the objective, deliverables could be products and/or regular reports

Examples of Deliverables
Indicator 1.1.a (Process): The Provider will deliver reports to the Department's Project Manager on April 30, 2010, August 15, 2010, and January 15, 2011, depicting activities and results from the strategies that show progress toward meeting the objective. Data will include, but not be limited to:
i. a list of courses that need to be developed, with course descriptions;
iii. a list of courses that need to be modified for multi-modal distance delivery, including the modifications needed for each course;
iv. sources of evidence-based research and practices for the curriculum;
v. samples of materials for distribution; and
vi. data as requested by the Department's Project Manager.

Example
Indicator 1.1.b (Accountability): The Provider will deliver electronic notification to the Department's Project Manager when permission is given by USM for the on-line minor program and for the Certificate in Interpreter Training Program. These notifications will include the charted courses for the programs and the beginning dates for program implementation.

Performance Measurement Considerations
- How will you ensure that the work is being done with fidelity?
- What kind of oversight is necessary?
- Do you have the staff to provide the necessary oversight?

Some Instruments for Collecting Data and Measuring Performance
- Numbers (student data, certification data)
- Participant evaluations and self-assessments
- On-site monitoring
- Collaborative meetings with Providers
- Regular (quarterly) reports
- Interviews with project participants
- Observations
- Sampling
- Standardized assessments
- Longitudinal data, such as teacher retention and student growth in relation to those teachers
- Invoice information

Resources
- Budget by objective
- Strategies identify resources from the Provider
- Payment is contingent on completion of work during the billing period
- In-kind contributions
- Other federal resources used for monitoring, fiscal management, and contract administration
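The "payment contingent on completion of work during the billing period" rule above can be sketched as a small calculation. This is a hypothetical illustration under assumed field names (`amount`, `completed`, `date`), not an actual invoicing procedure from the contract:

```python
# Illustrative sketch: an invoice covers only deliverables that were
# completed within the billing period. Dates are ISO-8601 strings
# ("YYYY-MM-DD"), which compare correctly as plain strings.

def billable_amount(deliverables, period_start, period_end):
    """Sum budgeted amounts for deliverables completed in the period."""
    return sum(
        d["amount"]
        for d in deliverables
        if d["completed"] and period_start <= d["date"] <= period_end
    )
```

Incomplete deliverables, or deliverables completed outside the billing window, contribute nothing to the invoice, which is the performance-based point: payment follows verified work.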

Budget

Questions about this presentation?
- How does this process compare with what your state does?
- What indicators do you use in determining the effectiveness of contractual services?
- What are some of the barriers you experience in contract administration?