1
Migrant Education Program New State Directors’ Orientation Tutorial
Module 8: Program Planning – Migrant Education Program Evaluation
U.S. Department of Education, Office of Migrant Education
Tools for Program Improvement
2
Table of Contents (section: slide #)
Getting Started: 3
What is Required: 9
Performance Results to be Included in the Evaluation: 22
Program Evaluation as Part of the Continuous Improvement Cycle: 30
Setting the Context for the Migrant Education Program Evaluation: 33
The Program Evaluation Plan: 41
Conducting the Evaluation: 61
The Written Evaluation Report: 77
Wrapping Up: 86
3
Getting Started
In This Section: Tutorial Objectives; How to Use the Tutorial; Icons to Guide You; Key Readings and Resources
4
Tutorial Objectives Module 8 will enable state directors to:
Understand the legislative and regulatory requirements for the Migrant Education Program (MEP) Evaluation, Understand the role of program evaluation as part of a continuous improvement cycle, Develop a program evaluation plan that aligns with Measurable Program Outcomes (MPOs) and state performance targets/Annual Measurable Objectives (AMOs), Ensure that Priority for Services students (PFS) are featured, Develop implications for program improvement from evaluation results, and Develop an action plan for the next MEP Evaluation.
5
How to Use the Tutorial For optimal benefit from the tutorial, you should: Allow sufficient time to read the slides, reflect on the information, and complete all activities on the slides or on the Quick Resource and Reflection Sheets (QRRS) that can be downloaded as worksheets; Read each slide as well as the information referenced in the slides; Engage with the “What Do You Think?” slides to facilitate interaction with the information (Answers will be provided directly following each of these slides.);
6
How to Use the Tutorial For optimal benefit from the tutorial, you should (cont.): Pause to reflect on your state program at the “Check-in” slides (A QRRS document will typically accompany these.); Complete the “Pop Quiz!” slides to reinforce key concepts; Review your state’s MEP documents and reports as directed; Develop an action plan using the worksheets provided; Add actionable items to your MEP planning calendar (See QRRS 14.2); and Contact your MEP Officer for follow-up questions.
7
Icons to Guide You
The following icons will guide you in making best use of this tutorial: What Do You Think?; Check-in; Pop Quiz!; Quick Reference & Reflection Sheet (QRRS); and Action Planning Calendar Item.
8
Key Readings and Resources
You should have these documents readily available while completing the module, as the module will refer to these documents for more complete information on various topics. MEP Guidance on the Education of Migratory Children under Title I, Part C of the Elementary and Secondary Education Act of 1965, Chapter VIII Migrant Education Program Evaluation Toolkit developed by the Office of Migrant Education (OME) Your state’s MEP Service Delivery Plan (which should include the program evaluation plan) The most recent state MEP Evaluation Report
9
What is Required
In This Section: Basic Requirements; What is Measured; Involvement of Local Operating Agencies (LOAs) in the Evaluation; Connection to Program Improvement
10
Basic Requirements
States must plan, implement, and evaluate programs and projects that ensure that the state and local operating agencies (LOAs) address the special educational needs of migratory children, including preschool migratory children. Section 1304(b)(1) of the ESEA, as amended
The state must develop and update a written comprehensive state plan that, at a minimum, has the following components: Performance targets, Needs assessment, Measurable program outcomes, Service delivery, and Evaluation. 34 CFR §
11
Basic Requirements
Each state education agency (SEA) must determine the effectiveness of the MEP through a written evaluation of the program that measures the implementation and results of the program against the state’s performance targets, particularly for those students who have priority for services (PFS). 34 CFR §
PFS children are those: Who are failing, or most at risk of failing, to meet the state’s challenging state academic content and student achievement standards; and Whose education has been interrupted during the regular school year. Section 1304(d) of the ESEA
12
What is Measured SEAs are required to evaluate the effectiveness of the MEP in terms of: Program implementation and Program results. 34 CFR §
13
What is Measured Implementation and results of the MEP are measured against performance targets/annual measurable objectives (AMOs) that the state has established— For all children in reading and mathematics achievement, high school graduation, the number of school dropouts, and school readiness (if any), and Any other performance targets that the state has identified for migrant children. 34 CFR §
14
What is Measured Thus, implementation and results of the MEP should be measured against the measurable program outcomes (MPOs) the state has established for the MEP as part of its comprehensive Service Delivery Plan (SDP). Those MPOs are what the state MEP will produce to meet the needs of migratory children and help them achieve the state’s performance targets/AMOs. 34 CFR § (a)(3)
15
Involvement of Local Operating Agencies in the Evaluation
In evaluating the program, the SEA will need LOAs to collect data on migratory children who receive services from the program (implementation component) and the outcomes of these services (using the program’s measurable outcomes, as specified in the SDP). Evaluations of the program at the local level will also assist the state in its subgranting process (see Module 4).
16
Involvement of Local Operating Agencies in the Evaluation
The SEA must ensure that the LOA conducts the local evaluation properly, and the SEA should inform its LOAs in advance of any specific data that it will need to evaluate the statewide program, and how the LOAs should collect the data.
17
Connection to Program Improvement
SEAs and LOAs must use the results of the MEP evaluation to improve services for migrant children. 34 CFR § and
18
Pop Quiz! Instructions: Note True or False for each of the following statements related to the requirements for the MEP evaluation.
1. The MEP evaluation collects and analyzes state-level data only.
2. The focus of the evaluation is on program results, not implementation.
3. The MEP evaluates instructional services and educational support services.
4. The MEP evaluation should include the evaluation of services for preschool migratory children.
19
Pop Quiz! - Response Number 1 is FALSE. The MEP evaluation includes data from both the local and state levels. Number 2 is FALSE. The evaluation focuses on the results and implementation of the MEP. Number 3 is TRUE. The MEP evaluates both instructional services and educational support services that enable migrant children to participate effectively in school. Number 4 is TRUE. MEP programs and projects must address the special educational needs of migratory children, including preschool migratory children.
20
Performance Results to be Included in the Evaluation
In This Section: Performance Goals, Indicators, and Targets; Government Performance and Results Act (GPRA) Measures
21
Performance Goals, Indicators, and Targets
Familiarity with the following key terms will assist you in understanding MEP evaluation requirements for reporting performance results: Performance goals, Performance indicators, Performance targets, and Measurable program outcomes (MPOs). See QRRS 8.1 – Reviewing Key Terms for Performance Assessment
22
Performance Goals, Indicators, and Targets
For purposes of program design and evaluation, the MEP will focus on ESEA Performance Goals 1 and 5, along with the indicators for each goal. MEP Guidance, Chapter VIII, B3
23
State Performance Goals, Indicators, and Targets
Performance Goals and Performance Indicators that the State’s MEP Evaluation Must Address
Performance Goal 1: By , all students will reach high standards, at a minimum attaining proficiency or better, in reading/language arts and math.
1.1 Performance indicator: The percentage of students, in the aggregate and for each subgroup, who are at or above the proficient level in reading/language arts on the state’s assessment.
1.2 Performance indicator: The percentage of students, in the aggregate and in each subgroup, who are at or above the proficient level in math on the state’s assessment.
Performance Goal 5: All students will graduate from high school.
5.1 Performance indicator: The percentage of students who graduate from high school each year with a regular diploma.
5.2 Performance indicator: The percentage of students who drop out of school.
24
Performance Goals, Indicators, and Targets
For Performance Goals 1 and 5, the U.S. Department of Education required all SEAs to submit, in their consolidated state applications, performance targets for each performance indicator and baseline data for the targets. States are not required to resubmit their performance targets during the period for which the ESEA, as currently enacted, is authorized unless the state makes a significant change in one or more of them. (See Module 3.) Several states have revised their performance targets via application for a waiver of certain ESEA requirements (also known as “ESEA Flexibility”). These revised targets are also referred to as revised Annual Measurable Objectives (AMOs).
25
Check-in Review your most recent MEP Evaluation report and respond to the following questions: To what extent are migrant students reaching the state’s performance targets under ESEA Goals 1 and 5, or any other state performance targets included in the MEP evaluation? Does the evaluation show the performance of priority for services (PFS) students relative to these targets? See QRRS 8.2 – State Performance Targets for ESEA Goals 1 and 5
26
Government Performance and Results Act (GPRA) Measures
In compliance with the Government Performance and Results Act (GPRA), ED/OME has adopted four GPRA measures for monitoring progress and maintaining accountability for the MEP on a national level. The GPRA measures present a national picture of the program, to which each state contributes. Therefore, GPRA measures should be part of the MEP evaluation.
27
Government Performance and Results Act (GPRA) Measures
Percentage of MEP students who scored at or above proficient on their state’s annual reading/language arts assessments in grades 3-8 and high school. Percentage of MEP students who scored at or above proficient on their state’s annual mathematics assessments in grades 3-8 and high school. Percentage of MEP students who were enrolled in grades 7-12 and graduated or were promoted to the next grade level. Percentage of MEP students who entered 11th grade who had received full credit for Algebra I or a higher mathematics course.
28
Program Evaluation as Part of the Continuous Program Improvement Cycle
In This Section: Continuous Improvement Cycle; Making Connections in the Planning Process
29
Continuous Improvement Cycle
30
Making Connections in the Planning Process
Program planning involves a continuous cycle of needs assessment, planning services, implementation, and evaluation. The MEP evaluation determines the degree to which the services identified in the Service Delivery Plan (SDP) (1) are implemented as planned, (2) truly meet the needs identified in the needs assessment, and (3) result in improved performance of migrant students as measured against the state’s performance targets/annual measurable objectives (AMOs) and the GPRA measures. The evaluation will inform updates of the needs assessment and changes in service delivery to improve the state MEP and the services it provides to migrant children.
31
Setting the Context for the MEP Evaluation
In This Section: Purpose of the MEP Evaluation; Evaluation for Implementation and Results; Responsibilities of Local Operating Agencies for the Migrant Education Program Evaluation; Frequency of Evaluation
32
Purpose of the Migrant Education Program Evaluation
Program evaluation involves systematically collecting information about a program, or some aspect of a program, in order to determine the effectiveness of the program and to improve it. MEP Guidance, Chapter VIII, A1
33
Purpose of the Migrant Education Program Evaluation
The MEP evaluation allows SEAs and LOAs to: Determine whether the program is effective and document impact on migrant children, Improve program planning, Determine the degree to which services are implemented as planned and identify implementation problems, and Identify areas in which migrant children may need different MEP services. MEP Guidance, Chapter VIII
34
Evaluation for Implementation and Results
States should develop methods of disaggregating state assessment data and data on measurable outcomes in order to determine the impact of the MEP on all migrant children – and in particular those who have a priority for services (PFS). MEP Guidance, Chapter VII, C8
Data sets: data on all migrant students, and data on priority for services (PFS) students.
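As a concrete illustration of this kind of disaggregation, here is a minimal sketch using pandas; the column names (is_pfs, reading_proficient) and the sample records are assumptions for illustration, not a prescribed data format:

```python
import pandas as pd

# Hypothetical student-level records: one row per migrant student, with a flag
# for priority for services (PFS) status and an indicator for scoring at or
# above proficient on the state reading/language arts assessment.
students = pd.DataFrame({
    "student_id": [101, 102, 103, 104, 105, 106],
    "is_pfs": [True, False, True, False, True, False],
    "reading_proficient": [False, True, False, True, True, True],
})

# Proficiency rate for all migrant students.
all_rate = students["reading_proficient"].mean()

# Proficiency rate disaggregated for PFS students only.
pfs_rate = students.loc[students["is_pfs"], "reading_proficient"].mean()

print(f"All migrant students proficient: {all_rate:.0%}")
print(f"PFS students proficient: {pfs_rate:.0%}")
```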
35
Responsibilities of Local Operating Agencies for the Migrant Education Program Evaluation
So that its evaluation of the MEP is statewide, each SEA must ensure that its LOAs, as needed, conduct a local project evaluation that measures the implementation of the project and student performance against measurable outcomes (which are aligned to the state MEP’s measurable outcomes, as specified in the service delivery plan). MEP Guidance, Chapter VIII, C3
36
Frequency of Evaluation
Evaluation of Overall Implementation of the State MEP An SEA should conduct an evaluation of the implementation of MEP services on a 2-3 year cycle to determine whether any improvements are needed. Evaluation of Program Results SEAs and LOAs should evaluate the results of the program (e.g., the degree to which the program has met the measurable state and local outcomes) on an annual basis. MEP Guidance, Chapter VIII, C5
37
What Do You Think? Why do you think there is a difference in the recommended time frame for evaluating program implementation and program results?
38
What Do You Think? - Reflection
Did your response address the following points? Overall Program Implementation Evaluation – 2-3 year cycle Implementation of new programs or strategies usually takes two to three years to get staff up to speed, adapt to the context, and make mid-course adjustments for full implementation. Evaluation of Program Results – annually A results-based evaluation is necessary for monitoring student progress toward established goals; information that is collected annually will help determine if academic progress is occurring and will lead to exploration of interventions to foster progress in the cycle of continuous program improvement.
39
The Program Evaluation Plan
In This Section: Measurable Program Outcomes (MPOs); Evaluation Questions; The Data Collection Plan; The Data Collection Task and Timeline; The Evaluation Matrix
40
Measurable Program Outcomes (MPOs)
The MEP evaluation builds on the MPOs developed in the process of developing the SDP. (See Module 7.) MPOs are the backbone of the SDP and MEP evaluation.
41
Measurable Program Outcomes
Strong MPOs define: What services will be provided, What is expected to happen as a result of the MEP services, Which students will directly benefit from the services, and The time frame for the services.
42
Pop Quiz! To what extent do you think the following is a strong MPO?
Migrant children in grades 3-5, whose parents attend two reading and homework support workshops during SY , will increase their performance on the state reading assessment by at least 10%.
43
Pop Quiz! - Response You probably noticed that the sample MPO included each of the components of a strong MPO: Migrant children in grades 3-5 [which students], whose parents attend two reading and homework support workshops [what services will be provided] during SY [time frame], will increase their performance on the state reading assessment by at least 10% [what will happen as a result of the services].
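To make the “by at least 10%” criterion concrete, here is a minimal sketch that checks a gain of this kind against hypothetical prior- and current-year reading scores; the scores and the reading of “10%” as a gain over each student’s prior score are assumptions for illustration:

```python
# Hypothetical state reading assessment scores for migrant students in grades
# 3-5 whose parents attended both reading and homework support workshops.
scores = {
    "Student A": (300, 340),  # (prior-year score, current-year score)
    "Student B": (320, 345),
    "Student C": (280, 330),
}

for student, (before, after) in scores.items():
    gain = (after - before) / before
    status = "met" if gain >= 0.10 else "not met"
    print(f"{student}: gain of {gain:.1%} -> 10% target {status}")
```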
44
Evaluation Questions Strong MPOs lead to evaluation questions that can: Measure the effectiveness of the strategies included in the SDP and Maintain alignment with state performance goals, indicators, and targets. A Program Alignment Chart is a useful tool to show the connection between all parts of the service delivery plan, culminating in evaluation questions that will show the effectiveness of the program. See QRRS 8.3 – Example of a Program Alignment Chart, also included in the Service Delivery Plan Toolkit, Table E.5.
45
Evaluation Questions Review the Alignment Chart (QRRS 8.3).
Identify the following: The state performance target related to reading and language proficiency, The concern regarding the barrier to migrant students meeting the state performance target, Strategies selected to address this concern, MPOs for the strategies, and Evaluation questions.
46
Evaluation Questions In reviewing the Alignment Chart in QRRS 8.3, note: The evaluation questions directly relate to the MPOs; they are designed to determine to what extent each MPO’s depiction of program success was achieved. The MPOs relate to the proposed strategies and are measurable ways to determine if the strategies produced the desired results and/or were implemented as described. The strategies address the needs identified in the Need Statements, which were developed based on data to support the concern about barriers to migrant students attaining the state performance target in reading/language arts.
47
What Do You Think? The Alignment Chart in QRRS 8.3 contains several evaluation questions for each of the MPOs. What are three benefits of having several evaluation questions related to each MPO? 1. 2. 3.
48
What Do You Think? - Reflection
Did your responses include any of the following reasons for multiple evaluation questions for each MPO? Can capture possible impacts on results, such as student attrition, or skill level upon entering the program. Can capture possible impacts on implementation, such as teacher level of experience, type of instruction, or frequency of supplemental instruction. Can triangulate results for greater validity. Can raise questions for further examination when discrepancies in data exist.
49
The Data Collection Plan
A well thought-out data collection plan will ensure that data collection is efficient, cost-effective, and systematic. It should specify what type of data is to be collected, from what sources, by whom, and in what time frame. It will ensure that the type of data collected is appropriate to measure the effectiveness of each MPO and to demonstrate migrant student performance relative to state performance targets.
50
The Data Collection Plan
Good questions to ask: 1. What existing data can be utilized? For example, CSPR data (summary data) and data from migrant student databases (child-specific data). 2. What data collection strategies can be utilized across MPOs?
51
The Data Collection Plan
Good questions to ask (cont.): 3. If funds and resources are limited, what are the data collection priorities? Possible responses: Collect all required data (e.g., data related to state performance targets and GPRA measures). Collect data on the most critical needs that the MEP must address and data that will show effectiveness in addressing these needs. When possible, collect data that are the easiest to obtain (e.g., existing data from the CSPR and migrant student databases). Conduct data collection activities that can address several MPOs (e.g., parent interviews, teacher survey).
52
The Data Collection Plan
Good questions to ask (cont.) 4. How will the evaluator ensure that data are collected consistently at the state and local levels? Possible responses: Provide training and technical assistance to LOAs on their responsibilities for the MEP Evaluation. Develop a common Evaluation and Data Collection Plan for all LOAs. Include program evaluation as a condition for a subgrant award. Include a monitoring indicator for LOAs on program evaluation.
53
Data Collection Plan A Data Collection Task and Timeline is a tool which ensures that: Data collection is planned for each evaluation question, Tasks are identified, Evaluator and staff responsibilities are defined, and Deadlines are clear.
54
Sample Data Collection Plan
MPO: LOAs that conduct a school enrollment fair for migrant parents of kindergarten-aged students for SY 2013 will show a 25% increase in migrant student kindergarten enrollment from SY 2012.
Evaluation Questions: How many parents participated in the school enrollment fair? How effective did participating parents find the fair in terms of information provided, friendliness, and convenience? What was the percent change in migrant student kindergarten enrollment from SY 2012 to SY 2013?
Data Sources: Attendance records and parent interviews; school enrollment records for SY 2012 and SY 2013.
Persons Responsible: Evaluator (protocol development); LOA staff (attendance records and interviews); clerical staff (transcription); LOA staff and LEA data staff.
Deadlines: August 1, October 1, December 1.
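One way to keep a plan like this actionable is to hold each task as a structured record so deadlines and responsibilities can be sorted and tracked; the sketch below is a hypothetical representation of two rows of the sample plan (the field names and the years attached to the deadlines are assumptions):

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class DataCollectionTask:
    evaluation_question: str
    data_sources: str
    persons_responsible: str
    deadline: date

plan = [
    DataCollectionTask(
        evaluation_question="How many parents participated in the school enrollment fair?",
        data_sources="Attendance records and parent interviews",
        persons_responsible="Evaluator (protocol); LOA staff (records, interviews); clerical staff (transcription)",
        deadline=date(2012, 8, 1),  # year assumed for illustration
    ),
    DataCollectionTask(
        evaluation_question="What was the percent change in migrant kindergarten enrollment from SY 2012 to SY 2013?",
        data_sources="School enrollment records for SY 2012 and SY 2013",
        persons_responsible="LOA staff; LEA data staff",
        deadline=date(2013, 12, 1),  # year assumed for illustration
    ),
]

# Review tasks in deadline order so no collection window is missed.
for task in sorted(plan, key=lambda t: t.deadline):
    print(task.deadline.isoformat(), "-", task.evaluation_question)
```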
55
The Evaluation Matrix An evaluation matrix is a useful tool for keeping track of the methods you are considering for collecting data. An evaluation matrix displays evaluation questions in alignment with the data collection strategies that will be used to answer them. The Evaluation Toolkit developed by OME, Section D, D.5 provides examples of an Evaluation Matrix.
56
Conducting the Evaluation
In This Section Evaluation Expertise Data Analysis Reviewing the Data and Discussing Implications
57
Evaluation Expertise The MEP Evaluation should be conducted by someone with specific expertise in evaluation. Sources for expertise may include: Internal resources (SEA program evaluation staff); External resources (evaluation resources at colleges and universities); and Other (evaluation consultants).
58
Evaluation Expertise Tips on working with your evaluator:
Become familiar with the evaluator’s background, and ensure it is a match for the skills required for the MEP Evaluation. Discuss expectations, responsibilities, deadlines, and accountability. Involve the evaluator in the development of the CNA and SDP, if possible, especially in the development of the evaluation questions.
59
Evaluation Expertise Ensure that the evaluator knows the audience with whom the MEP Evaluation will be shared so that he/she may develop the written report appropriately. Interact frequently with the evaluator so that you can determine if the goal of measuring MEP effectiveness will be achieved and so that you can make recommendations for changes during the evaluation process if needed. Provide OME resources and guidelines to the evaluator.
60
Data Analysis Once the data have been collected, the evaluator will need to organize the data in a way to enable you and the program planning team to: Determine the extent to which migrant students are achieving the state performance targets, especially PFS students; Answer specific questions about the effectiveness and quality of the MEP; Identify program accomplishments; and Identify areas for program improvement.
61
Data Analysis The data should be presented in ways that are: Visual, Succinct, Comprehensive, Organized, and Targeted toward the audience’s level of knowledge of data analysis. The Program Evaluation Toolkit developed by OME, Section E, provides background information on analyzing and interpreting data.
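As one example of a visual, audience-friendly display, here is a minimal matplotlib sketch comparing reading proficiency rates for all migrant students and PFS students against a state performance target; every number in it is hypothetical:

```python
import matplotlib.pyplot as plt

groups = ["All migrant students", "PFS students"]
percent_proficient = [48, 37]  # hypothetical percentages proficient or above
state_target = 60              # hypothetical state performance target / AMO

plt.bar(groups, percent_proficient, color=["steelblue", "indianred"])
plt.axhline(state_target, linestyle="--", color="gray",
            label=f"State target ({state_target}%)")
plt.ylabel("Percent proficient or above (reading/language arts)")
plt.title("Hypothetical MEP reading results vs. state target")
plt.legend()
plt.tight_layout()
plt.savefig("mep_reading_results.png")
```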
62
Reviewing the Data and Discussing Implications
Reviewing the data and discussing their implications are critical activities that connect the evaluation with program improvement. Once the data are analyzed and compiled for review, the program planning team should be convened to discuss the data. Adding data analysis to the Alignment Chart is a good way to review the connection between the data analysis, MPOs, strategies, needs, and state performance targets.
63
Reviewing the Data and Discussing Implications
The evaluator should be part of the discussion to answer questions and ensure the team is interpreting the data correctly. The evaluation should show the correlation between MPO results and state performance targets; if a high correlation does not exist, the team should discuss how to ensure that services are resulting in improved migrant student performance. The team should develop a set of recommendations that will be included in the written report and can be incorporated into the CNA and SDP.
64
What Do You Think? - Scenario
Review the following scenario and data analysis, and determine program implications. In order to increase the number of migrant students who applied to attend college, an LOA offered a four-hour Saturday morning information workshop to a group of 55 migrant students in grades 10-12 in two high schools to help them learn about preparing for college.
65
What Do You Think? Data Analysis
Migrant students who attended the workshop were given a pre-workshop survey asking if they intended to apply to college; 53% said they did. In a post-workshop survey, 82% said they intended to apply to college; 72% of the seniors who attended said they intended to apply to college. A sample of migrant students in grades 10-12 who did not attend the workshop was given a survey asking if they intended to apply to college; 18% said they intended to apply to college. Of all the migrant students invited to attend the workshop, only 34% attended. School graduation data for the year in which the workshop was conducted showed that of the 19 migrant students who graduated and had attended the workshop, only 10 (52%) applied to college.
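The comparisons in the reflection slides that follow rest on simple arithmetic with these figures; here is a minimal sketch of that arithmetic, using the percentages reported in the scenario (which is itself illustrative):

```python
# Percentages reported in the illustrative workshop scenario.
pre_intent = 53        # attendees intending to apply, pre-workshop survey
post_intent = 82       # attendees intending to apply, post-workshop survey
senior_intent = 72     # attending seniors who intended to apply
senior_applied = 52    # attending seniors who actually applied
attendance_rate = 34   # invited students who actually attended

# Percentage-point change in stated intent from pre- to post-survey.
print("Change in intent:", post_intent - pre_intent, "percentage points")        # 29

# Gap between seniors' stated intent and actual applications.
print("Intent vs. application gap:", senior_intent - senior_applied, "points")   # 20

# Share of invited students who did not attend the workshop.
print("Did not attend:", 100 - attendance_rate, "percent")                       # 66
```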
66
What Do You Think? Data Analysis
Interviews with a sample of the students who did not attend the workshop indicated the following reasons.
Reason for Not Attending (n=46):
Can’t afford to go to college: 92%
Don’t have good grades: 87%
Don’t like school: 41%
Feel like I don’t belong in college: 85%
Had to work on the day of the workshop: 15%
Other: 20%
67
What Do You Think? Implications
Based on the data analysis related to the workshop for high school students, what are three implications for improving the college application rate for migrant students?
1.
2.
3.
68
What Do You Think? - Reflection
Following are some observations on the data: Based on the increased percentage of students who indicated their intent to apply to college in the pre- and post-surveys, there appeared to be increased interest in college attendance. Note: You can identify a correlation between the workshop attendance and increased interest in college attendance, but you cannot assume that the increased interest in college attendance is a direct result of the workshop. Other factors could have impacted the post-survey results.
69
What Do You Think? - Reflection
More observations on the data: Only a third (34%) of the migrant students who were eligible to attend the workshop actually attended. A small percentage (18%) of the sample of the migrant students who did not attend the workshop indicated that they planned to apply to college. The most significant reasons why students in the sample indicated that they would not apply to college were lack of finances, poor grades, and a sense they did not belong. A large discrepancy exists between the percentage of seniors who attended the workshop and indicated an intention to apply to college (72%), and the percentage of seniors who attended the workshop and actually applied to college (52%).
70
What Do You Think? - Reflection
Implications: Based on the percentage of invited students who did not attend the workshop, there needs to be a more targeted effort to increase attendance at a workshop of this kind. Based on reasons for non-attendance, the program needs to help migrant students become more aware of financial aid options, improve their grades, and develop confidence that they can attend college. The discrepancy between the data on the percentage of migrant seniors who attended the workshop and who indicated an intent to apply to college and the percentage of those who actually applied to college should be explored further. Additional data could help to determine the merit and content of a future workshop, and/or the need for additional follow-up.
71
Reviewing the Data and Discussing Implications
Considerations: Implications derive directly from the data. Be cautious not to attribute causality to correlational data. Multiple data sources (state and local student data, surveys, interviews) can provide greater insight on the effectiveness of a program or strategy. Implications can lead to recommending strategies to improve progress toward achieving MPOs and student results.
72
The Written Evaluation Report
In This Section Components of the Written Report Utilizing the Report for Program Improvement
73
Components of the Written Report
The MEP Guidance recommends the following components of the written report: 1. Purpose – Why did the SEA conduct the evaluation? What is required? Who is the audience? How is the audience expected to utilize the report? MEP Guidance, Chapter VIII, D2
74
Components of the Written Report
2. Methodology – What evaluation process was used? What were the evaluation questions? What basic evaluation designs and data collection methods did the SEA use? What data were collected? What instruments were used? How were instruments tested for reliability and validity and relevance to the program? What was the timeframe? What methodological limitations existed?
75
Components of the Written Report
3. Results – What were the findings of the evaluation? Program Implementation – Did the agency implement the program as described in the program plan or application? Were there any discrepancies? What implementation barriers were encountered? Program Results – How did migrant students perform in relation to the state performance targets? How did priority for services students perform? How did the results compare to what might have been expected? What special factors were considered (e.g., student attrition)?
76
Components of the Written Report
4. Implications – What did the SEA and LOAs learn from the evaluation and how will what they learned be applied to improving the program? How effective is the state’s MEP in improving student outcomes? Are there projects or particular strategies that the SEA should continue, expand, or discontinue? What changes in MEP project implementation should take place? What new services should be considered? When will the SDP be revised to reflect these changes?
77
Check-in Review the current MEP Evaluation report for your state and consider the following. On the whole, does the report provide a clear picture of the effectiveness of the MEP? Does it show whether and how migrant students, and priority for services (PFS) students in particular, are progressing toward meeting the state performance targets? Does it show the extent to which specific strategies in the service delivery plan are working to address the needs identified in the needs assessment?
78
Check-in Does the report meet the requirements in the statute and regulations? If not, why not? Does it include the components recommended in the MEP Guidance? If not, why not? See QRRS 8.4 – Reviewing the MEP Evaluation Report
79
Utilizing the Report for Program Improvement
A program evaluation report is only useful to the extent that it is read and used to improve outcomes. The evaluation report should be: Visually appealing, Appropriate for multiple audiences, Specific in its implications for program improvement, Well-organized, Clearly written, and Used.
80
Utilizing the Report for Program Improvement
Communicating the Report Consider to whom the report should be sent and for what purpose. Disseminate the report to stakeholders with a customized cover letter that explains what a particular stakeholder should learn or do as a result of reading the report. LOAs are key stakeholders; consider conducting a training session or webinar to review the report and discuss ways LOAs can improve their services.
81
Wrapping Up
In This Section: Key Points; Action Planning; Resources
82
Key Points The MEP Evaluation:
Is part of the cycle of continuous program improvement, Measures both implementation and results of the MEP, Builds on MPOs that define MEP success, Must include particular focus on migrant students who have priority for services (PFS), and Must be used for program improvement.
83
Action Planning Consider the following questions:
1. When should you conduct the next MEP Evaluation? (When was the last Evaluation conducted?) 2. Does the SDP include an evaluation plan? Does the SDP include strong MPOs? Does the SDP include evaluation questions that address both implementation and results? Is the evaluation plan aligned with the needs, strategies, and MPOs?
84
Action Planning 3. Who will serve as the MEP evaluator? How will this person be involved in the program planning process? 4. What processes are in place to obtain data from LOAs? 5. How will you customize the planning process for the size of your state and number of migrant students? See QRRS 8.5 – Program Evaluation Action Planning Add any actionable items to your MEP planning calendar.
85
Resources for the Migrant Education Program Evaluation
MEP Guidance on Education of Migratory Children under Title I, Part C, of the Elementary and Secondary Education Act of 1965 – explanation of guidelines to implement the laws and regulations related to the MEP Evaluation
Program Evaluation Toolkit developed by OME – suggested step-by-step guide with tools and templates to develop the Program Evaluation
MEP Officers – list of OME contact information
Glossary of Terms – alphabetical listing of key terms applicable to migrant education (see Module 1)
86
New State Directors’ Orientation Tutorial
This tutorial was developed by The SERVE Center at The University of North Carolina at Greensboro under contract number ED-08-CO-0111. Content for this tutorial was developed through a review, compilation, and synthesis of Authorizing statutes and regulatory guidance, Information and resources obtained from relevant websites, Other documents shared by the Office of Migrant Education, State Migrant Education Program websites and related documents, and Other websites supporting the educational welfare of migrant children and youth. Note: Some links in this tutorial take the user to external websites provided by other organizations. The U.S. Department of Education cannot guarantee the accuracy of the information at these sites. The inclusion of these links is not intended to reflect their importance, nor is it intended to endorse any views or products of these organizations. No official endorsement by the U.S. Department of Education of any product, commodity, service, or enterprise mentioned in this publication is intended or should be inferred. Note: All images included in this tutorial are used with appropriate licensing agreements, or are copyright cleared or open source.