Evaluation: A focus on Title I, Part A Julie E. McLeod


1 Evaluation: A focus on Title I, Part A Julie E. McLeod
Supervisor of Federal Program Evaluation, Hillsborough County Public Schools
Courtney Zmach, Ph.D., Coordinator, Research and Program Evaluation, Collier County Public Schools

2 Evaluation
Overview of Program Evaluation
Evaluating Title I, Part A
Ideas/sample template
Courtney

3 “Everything that can be counted does not necessarily count; everything that counts cannot necessarily be counted.” --Albert Einstein
Courtney

4 What is Program Evaluation?
Process to assess the implementation and effects of a program. Involves:
Carefully collecting information
Analyzing data
Drawing conclusions about implementation and/or effects of the program
Courtney

5 Evaluation vs. Needs Assessment
Needs Assessment is “a systematic process for determining goals, identifying discrepancies between desired and actual performance, and establishing priorities for action.” The evaluation of the implementation and effectiveness of last year’s program can also function as the upcoming year’s needs assessment. Courtney

6 Guiding Principles
Evaluation is not just about demonstrating success; it is also about learning why things don't work.
Evaluation is not about finding out everything, but about finding the things that matter.
Evaluation allows you to continually improve your project, both during its implementation and when planning new projects.
Courtney
Determine the overall effectiveness of the program (did it meet its intended goals?)
Implementation: identify changes to improve the program
Increase the understanding of specific strategies and help schools determine the usefulness of activities they have undertaken
Educate stakeholders
Meet accountability and compliance requirements

7 Conducting a Successful Evaluation
Take time to plan
Set aside adequate resources
Begin the evaluation during the initial stages of program implementation
Promote participation of stakeholders in the evaluation
Ensure confidentiality of responses
Consider cultural issues
Courtney
Invest in planning: Before you begin, develop a plan that details what you are planning to evaluate, the timeframe for conducting the evaluation, who will do the evaluation, what resources are available, and what you plan to do with the findings. Ideally, the evaluation will be planned in conjunction with program development. If possible, involve the evaluator/evaluation team in writing the program objectives.
Set aside adequate resources for the evaluation: Resources include not only funds to support the evaluation, but also staff time to complete evaluation activities.
Begin the evaluation during the initial stages of program implementation: It is best to begin the evaluation even before the program is implemented. Having information about the program from the very beginning enables you to make modifications if you determine that any aspect of the program is not working. It is far better to make changes to a planned program early in the implementation phase than to carry out a program knowing that some aspect of it is not working well. For example, an LEA might include the implementation of a new reading program as one of the strategies in the schoolwide program plan and/or Title I program application. If, after the first two administrations of progress monitoring, the LEA and/or school determines that a large number of students are not meeting expectations, then the LEA and/or school would need to determine the cause and provide additional support.
Changes in the plan may include providing additional support in the form of coaching and/or professional development to teachers, or reducing the group size to ensure that the students struggling the most receive individual attention as needed. This is an example of changing a program design in its early stages, when it becomes apparent that student academic success is likely to be much lower than expected. If this change in program design had not been made early in the implementation phase, the program might not have produced the desired outcomes.
Promote participation of all stakeholders in the evaluation: Participation of all stakeholders is critical to the evaluation's success. One way to encourage stakeholder ownership of and responsibility for the evaluation is to involve stakeholders in it. This can be accomplished during the planning and/or data collection phase of the program evaluation.
Be realistic about the burden of an evaluation: Evaluations require work. Even if an outside evaluator is used, the evaluation process will require a time commitment from the members of the evaluation team. Depending on the type of data collection planned, there may be a need to interview staff, review records, or distribute and collect surveys. The time commitment is another reason it is important to explain to staff and other stakeholders why you need an evaluation and how it will benefit them.
Ensure confidentiality of responses: Obtaining data about the feelings of individuals can be sensitive, especially if they think their jobs or children will be negatively impacted. When promising confidentiality of responses to survey instruments, it is important to adhere to this commitment through all evaluation activities.
Consider cultural issues: Ensure that the evaluation is relevant to and respectful of the cultural backgrounds and individuality of program participants.

8 Program Evaluation Framework
Steps: Engage Stakeholders; Describe the Program; Focus the Evaluation Design; Gather Credible Evidence; Justify Conclusions; Ensure Use and Share Lessons Learned
Standards: Utility; Feasibility; Propriety; Accuracy
Courtney
Steps
Engage stakeholders, including those involved in program operations, those served or affected by the program, and primary users of the evaluation.
Describe the program, including the need, expected effects, activities, resources, stage, context, and logic model.
Focus the evaluation design to assess the issues of greatest concern to stakeholders while using time and resources as efficiently as possible. Consider the purpose, users, uses, questions, methods, and agreements.
Gather credible evidence to strengthen evaluation judgments and the recommendations that follow. These aspects of evidence gathering typically affect perceptions of credibility: indicators, sources, quality, quantity, and logistics.
Justify conclusions by linking them to the evidence gathered and judging them against agreed-upon values or standards set by the stakeholders, using these five elements: standards, analysis/synthesis, interpretation, judgment, and recommendations.
Ensure use and share lessons learned with these steps: design, preparation, feedback, follow-up, and dissemination.
Standards
Utility standards ensure that an evaluation will serve the information needs of intended users.
Feasibility standards ensure that an evaluation will be realistic, prudent, diplomatic, and frugal.
Propriety standards ensure that an evaluation will be conducted legally, ethically, and with due regard for the welfare of those involved in the evaluation, as well as those affected by its results.
Accuracy standards ensure that an evaluation will reveal and convey technically adequate information about the features that determine the worth or merit of the program being evaluated.
Source:

9 Define the Program & Develop the Evaluation Plan
What are you trying to evaluate?
What questions do you want answered?
What information/data need to be collected to answer these questions?
When/how will collection of information occur?
Courtney

10 Courtney

11 Types of Evaluations
Implementation: examines the delivery of the program, the quality of its implementation, the organizational context, procedures, personnel, and so forth
Outcome: examines the effects or outcomes of the areas of interest; summarizes data by describing them, and can include cost-benefit analysis
Courtney
Implementation: under what conditions does a program work?
Outcome (summative) evaluations examine the effects or outcomes of some object. They summarize it by describing what happens subsequent to delivery of the program or technology; assessing whether the object can be said to have caused the outcome; determining the overall impact of the causal factor beyond only the immediate target outcomes; and estimating the relative costs associated with the object.

12 Types of Data
Quantitative: numerical (test scores, tallies, etc.)
Qualitative: text-based (open-ended survey responses, focus groups, interviews)
Courtney
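The distinction above can be made concrete with a minimal sketch: quantitative data are summarized numerically, while qualitative data are typically coded into recurring themes. All scores, responses, and theme labels below are invented for illustration only.

```python
# Illustrative sketch only: scores, responses, and themes are invented data.

# Quantitative data: numerical values summarized with statistics.
test_scores = [78, 85, 62, 90, 71]
mean_score = sum(test_scores) / len(test_scores)
print(f"Mean test score: {mean_score:.1f}")

# Qualitative data: text-based responses coded into recurring themes.
responses = [
    "Tutoring after school helped my child",
    "More tutoring time is needed",
    "Parent workshops were useful",
]
theme_counts = {"tutoring": 0, "workshops": 0}
for response in responses:
    for theme in theme_counts:
        if theme in response.lower():
            theme_counts[theme] += 1
print(theme_counts)
```

In practice, qualitative coding is done by trained readers against a codebook; the keyword tally here only illustrates the idea of turning text into countable themes.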

13 Framing Your Questions
Focus of Evaluation | Evaluation Questions
Process | How well was the project designed and implemented?
Outcome | To what extent did the project meet the overall needs? Was there any significant change, and to what extent was it attributable to the project?
Learnings | What worked and what did not? What were the unintended consequences?
Investment | Was the project cost-effective? Was there another alternative that may have represented a better investment?
What next | Can the project be scaled up? Can the project be replicated elsewhere? Is the change self-sustaining, or does it require continued intervention?
Courtney

14 Evaluating Title I, Part A
Why do we do it?
How do we do it?
Sample Template
Recommended Reading: Guide for Evaluating Title I, Part A Programs (FLDOE Bureau of Federal Education Programs)
Julie

15 “Title I Evaluation” in a Nutshell
Annual review of strategies in the Schoolwide (or Targeted Assistance) Plan to determine whether the program is directly or indirectly contributing to the desired outcome (increased student achievement)
Julie

16 NCLB Federal Regulations for Schools
Title 34, Part 200, Section (c) – Core elements of a schoolwide program:
(c) Evaluation. A school operating a schoolwide program must—
(1) Annually evaluate the implementation of, and results achieved by, the schoolwide program, using data from the State's annual assessments and other indicators of academic achievement;
(2) Determine whether the schoolwide program has been effective in increasing the achievement of students in meeting the State's academic standards, particularly for those students who had been furthest from achieving the standards; and
(3) Revise the plan, as necessary, based on the results of the evaluation, to ensure continuous improvement of students in the schoolwide program.
Julie
Implementation = strategies from SIP/SWP
Results = goals/objectives from SIP/SWP
FCAT/FSA NGSSS FCAT

17 NCLB Federal Regulations for LEAs
Title 34, Part 200, Section —Local Review
(a) Each LEA receiving funds under subpart A of this part must:
Use the results of the state assessment system to review annually the progress of each school served to determine whether the school is making AYP;
Publicize and disseminate the results of its annual progress review to parents, teachers, principals, schools, and the community; and
Review the effectiveness of actions and activities that schools are carrying out under subpart A of this part with respect to parental involvement, professional development, and other activities assisted under subpart A of this part.
Julie
AYP is calculated by the state.
The NCLB SPAR report is created by the state and posted on the web; schools/districts should disseminate it.
Actions/activities = strategies from SIP/SWP/project application

18 Relevant FLDOE Compliance Items
Compliance Item AIA-3: The Local Educational Agency (LEA) shall ensure that schools implementing schoolwide programs conduct a comprehensive needs assessment (CNA) of the entire school, while taking into account the needs of migratory children, which is based on student achievement data related to the state academic content standards and the state academic achievement standards. Section 1114(b)(1)(A), P.L ; 34 CFR, Part 200, Section (a)(1)
Compliance Item HIA-1: The Local Educational Agency (LEA) shall annually evaluate the Title I program and report the results in the following areas: the LEA's progress in achieving the objectives in its approved application; the effectiveness of the project in meeting the purposes of the program; and the effect of the project on students being served by the project. EDGAR 34 CFR Part 75 Section
Julie
Did I get them all?

19 Benefits of Title I Evaluation
Provide information to administrators, project staff, school personnel, and parents
Examine the operation of the school (as detailed in the program plans):
Implementation of instructional strategies
Degree of parental involvement
Other elements that support increased student achievement
Assist district and school personnel in making decisions about the program to best meet the needs of students
Determine whether the school/program has met its objectives as stated in the SWP (school level) or Project Application (district level)
Julie

20 Engage the Stakeholders
Individuals involved in day-to-day operations: teachers, administrators, support staff
Individuals and groups served by the program: students, parents, community members
Individuals in a position to make recommendations and/or decisions regarding the program: SAC, school planning team, school administrators, district personnel
Julie – do we need this slide??

21 Evaluating Title I has two parts:
Implementation
Outcomes

22 Implementation Evaluation
Have the activities/strategies been implemented, and to what degree?
SWP/SIP (school)
Project Application (LEA)
Were activities/strategies implemented with fidelity? If not, why? What were the barriers to implementation? Were staff properly trained?

23 Outcome Evaluation
Did the school (or LEA) meet the goals/objectives of its SWP/SIP (Project Application)?
Did the students meet the state academic achievement standards?
Did achievement increase (especially for those who were furthest from meeting standards)?
For those students who are not proficient, which specific content areas are the most troublesome?
Determining the overall impact of the program or causal factor is difficult with Title I because of the lack of a comparison group.
What percentage of students, as a whole and in disaggregated groups, has achieved proficiency relative to the state's academic content and achievement standards, and how does this compare to the percentage that achieved proficiency before schoolwide plan implementation?
What do other student achievement data indicate about student progress toward meeting the state standards, including pre- and post-test scores, grades, quarterly reading achievement results, or other diagnostic classroom or school-based results?
How do subgroups compare to each other? Are specific groups lagging behind other groups? Is the LEA/school closing the achievement gap between minority and non-minority students?
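The proficiency-percentage questions above can be sketched as a small calculation: tally the share of students at or above a proficiency cut, by subgroup, before and after plan implementation. This is a minimal illustration only; the student records, subgroup labels, and the cut score of achievement level 3 are all invented assumptions, not actual FCAT/FSA data or business rules.

```python
# Hypothetical sketch: percent proficient by subgroup, baseline vs. current.
# All records, subgroup labels, and the cut score below are invented.
from collections import defaultdict

PROFICIENT_CUT = 3  # assumption: achievement level 3 or above counts as proficient

def percent_proficient(records):
    """records: list of (subgroup, achievement_level) tuples.
    Returns {subgroup: percent of students at or above the cut}."""
    counts = defaultdict(lambda: [0, 0])  # subgroup -> [proficient, total]
    for subgroup, level in records:
        counts[subgroup][1] += 1
        if level >= PROFICIENT_CUT:
            counts[subgroup][0] += 1
    return {g: round(100 * p / t, 1) for g, (p, t) in counts.items()}

baseline = [("ELL", 2), ("ELL", 3), ("SWD", 1), ("SWD", 2), ("All", 3), ("All", 4)]
current  = [("ELL", 3), ("ELL", 3), ("SWD", 2), ("SWD", 3), ("All", 4), ("All", 4)]

before = percent_proficient(baseline)
after = percent_proficient(current)
for group in before:
    print(f"{group}: {before[group]}% -> {after[group]}%")
```

Comparing the two dictionaries subgroup by subgroup answers the gap questions descriptively; as the slide notes, attributing the change to the program remains difficult without a comparison group.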

24 Potential Data Sources
Progress monitoring data
Graduation rate
State CRT (FCAT 2.0/FSA)
Attendance
End-of-Course (EOC) Exams
Promotion/retention rates
State reading assessment (FAIR)
Demographic data
Discipline data
Kindergarten Readiness
Professional Development participation
Classroom walk-through data
Evaluations/Feedback from workshops and meetings
Surveys/Focus Groups
Courtney

25 Where to find the data:
Florida School Accountability Reports:
FCAT Demographic Reports:
PK-12 Public School Data Publications and Reports: services/pk-12-public-school-data-pubs-reports/index.stml
ACT-SAT-AP Reports:
Education, Information, and Accountability Services Publications and Reports:
Performance Profiles:
School Environmental Safety Incident Reporting (SESIR) System Reports:
School Public Accountability Reports (SPAR): prd.doe.state.fl.us/eds/nclbspar/index.cfm
It is important to keep in mind that some of these websites are updated sporadically. Be sure to check the dates for all of the data you are pulling from these sites.
Julie

26 General Pointers
Weave federal and state monitoring requirements with what is wanted/needed at the local level
Determine what you want to learn about your program or what would help most
Incorporate the project application goals, objectives, and strategies
Include tables and graphs to illustrate data
Craft appropriate recommendations for program improvement

27 Ideas and Examples
www.ectacfl.net/Evaluators.html
The ECTAC Evaluation Team has created “templates” to provide a structure for evaluating the implementation of the program
Need 3 Sample

28 Resources on the ECTAC Site

29 Evaluation Matrices

30 Evaluation Template

31 Sample

32 Your Turn…

33 Questions??????

34 Thank you!
Julie McLeod
Dr. Rich Janiak richard.janiak@yourcharlotteschools.net x258
Dr. Courtney Zmach
Jan Mahowski (727) X2020

