Program Evaluation
Evaluation
Systematic investigation of merit or worth, using information gathered to make that decision (Guskey, 2000)
Needed in physical education to:
Keep the program current and dynamic
Inform curricular change decisions
Should evaluation strengthen ends or means?
Purpose of Program Evaluation
To determine a new program plan
To document the validity and/or importance of the expectations
To document the way in which the program is being implemented
To determine the effect of the program on participants
To provide recommendations for revisions based on identified weaknesses
Curriculum Evaluation
Examine curricular goals
Student performance assessments
Views of stakeholders
Teacher evaluations
Facilities assessments
Defensible Data
Considers:
Reliability (see the reliability sketch below)
Validity
Findings are replicable
Appropriateness of measures
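One way to make the reliability claim concrete is to correlate two administrations of the same test. The sketch below is a minimal illustration, assuming hypothetical motor skill scores and an arbitrary 0.80 acceptability threshold; none of these values come from the slides.

```python
# Hypothetical test-retest reliability check for a motor skill test.
# Scores and the 0.80 threshold are illustrative assumptions only.
from statistics import correlation  # Python 3.10+

# Same 8 students tested twice, one week apart (invented scores).
trial_1 = [12, 15, 9, 18, 14, 11, 16, 13]
trial_2 = [13, 14, 10, 17, 15, 10, 16, 12]

r = correlation(trial_1, trial_2)  # Pearson r between the two trials
print(f"Test-retest reliability (Pearson r): {r:.2f}")

# A common rule of thumb treats r >= 0.80 as acceptable for group-level decisions.
print("Acceptable for program-level decisions" if r >= 0.80 else "Reconsider the measure")
```

A similar correlation against an established criterion measure could be used to examine validity evidence.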
Program Implementation
Are the students enrolled in the program representative of the type of students for whom the program was planned?
Is the program being implemented by representative teachers at the planned teacher-student ratios?
Has the content planned for inclusion been taught?
Program Effectiveness
Program evaluation seeks to describe the number of students who are making gains on the program objectives
Evaluation of the program is merely an extension of the evaluation of individual students
Program Effectiveness
Did change occur?
Was the change statistically significant?
Was the effect educationally significant? (see the effect-size sketch below)
Can the effects be replicated?
Did the observed effects result from the program?
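The distinction between statistical and educational significance can be made concrete with a small pre/post comparison. The sketch below is a hypothetical illustration using invented curl-up scores, SciPy's paired t-test, and Cohen's d as one possible effect-size index; the 0.5 benchmark is a common convention, not a requirement from the slides.

```python
# Hypothetical pre/post check of one program objective (e.g., curl-up scores).
# The data, the SciPy dependency, and the 0.5 effect-size benchmark are
# illustrative assumptions, not part of the original slides.
from statistics import mean, stdev
from scipy.stats import ttest_rel

pre  = [22, 30, 18, 25, 27, 20, 24, 29, 21, 26]   # invented baseline scores
post = [26, 33, 21, 27, 31, 22, 27, 32, 23, 30]   # invented end-of-unit scores

# Did change occur, and was it statistically significant?
t_stat, p_value = ttest_rel(post, pre)

# Was it educationally significant? One common index is Cohen's d for paired data.
diffs = [b - a for a, b in zip(pre, post)]
cohens_d = mean(diffs) / stdev(diffs)

print(f"Mean gain: {mean(diffs):.1f}, t = {t_stat:.2f}, p = {p_value:.4f}")
print(f"Cohen's d: {cohens_d:.2f} (>= 0.5 is often read as a moderate, meaningful effect)")
```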
If students don't meet the program outcomes, one must consider:
Characteristics of the teacher
Characteristics of the students
Characteristics of the instructional setting or context
Characteristics of program implementation
The strength of these relationships often provides insight into potential program revisions
Program Improvement
Document individual student achievement and assess the nature and impact of the hidden curriculum as well as the intended outcomes
Consider possible changes in program objectives or modifications of the existing program standards
Determining Needed Changes
Knowledge of 'what' to improve must be supplemented with information suggesting 'why' the weakness exists
Weakness observed in program implementation usually results from a lack of knowledge about the processes involved in planning, implementing, or evaluating; therefore, in-service training is indicated
Evaluation Models: Desired Outcome Model
Primary focus is student achievement
Evaluation is limited to those outcomes that can be precisely stated and for which objective measures can be developed
Insensitive to the 'process' and humanistic aspects of education
Evaluation Models: Goal-Free Model
Attention goes beyond stated outcomes to all that is relevant
Follow a checklist
Use a wide variety of techniques
Product tests (e.g., fitness tests, motor skill tests)
Self-reports may be utilized
Use dress-outs, absences, assignments
Evaluation Models: Goal-Free Model (continued)
Primary value: evaluation is more complete and representative
Disadvantage: may rely too heavily on subjective information; may not look at the full range of evaluative needs
Developing the Evaluation Plan
Look at the total picture rather than isolated "units"
Plan to evaluate effects of the program that do not easily lend themselves to measurement (e.g., affective development)
If state-mandated standards are in the curriculum, the evaluation must be structured with those mandates in mind
Selection of Evaluation Instruments
Outcomes-based evaluation uses quantitative data from objective tests to assess changes in students
These data can provide both formative and summative information
Check individual curriculum models for examples of evaluation tools
Qualitative vs. Quantitative Evaluation
Qualitative evaluation relies on description, disclosure of meaning, and judgment; used when the perspective of the curriculum does not specify mastery
Quantitative evaluation draws inferences from tests of statistical significance on the most easily observed characteristics of the environment; most often used when content mastery is expected
Quantitative vs. Qualitative
Curriculum evaluation will generally use both quantitative and qualitative types of evaluation
The types of instruments suggested for program needs assessment may also be used for program evaluation
The Instructional Process
Qualitative evaluations tend to be used
Study student-teacher interactions:
the target of the teacher's attention
verbal interactions
the nature of discipline
classroom climate
Preformative Evaluation
Occurs prior to the activity, program, or project
Identifies goals
Estimates impact
Analyzes program implementation
Helps to avoid costly mistakes
Formative Evaluation
Occurs during the activity
Helps to redirect time, money, personnel, and resources
Proactive
Occurs multiple times
Summative Evaluation
Occurs at the conclusion of the project
Determines what was accomplished
Used for accountability
Frequently uses quantitative data
Indirect Measures
Afterschool program participation
Non-school program participation
Student readiness
Enrollment in elective classes
Attendance, dress, and participation
Systematic Model
A performance assessment based on authentic assessment
The Work Sampling System assesses and documents a full range of skills, behaviors, and values
Components: developmental checklists, portfolio collection, summary reports
Student Fitness Levels
Many schools choose to focus on how to get fit or on devising personal plans
Caution about:
Expecting all students to achieve a certain level
Setting criteria for particular tests (e.g., a 6-minute mile)
Alignment of the curriculum with fitness goals
NASPE STARS
Time
Teacher qualifications
Professional development
Professional involvement
Teacher-student ratio
Student health and safety
Facilities and equipment
Program mission
Curriculum
Instructional practices
Student assessment
Inclusion
Communication
Program evaluation
PECAT: Physical Education Curriculum Analysis Tool
Based on NASPE standards
Developed by the CDC in partnership with experts
Student Assessment: Portfolio Use
Keeps track of student progress
Provides students an opportunity to assess their own accomplishments
Determines the extent to which learning objectives have been mastered
Helps parents understand their child's effort and progress
Serves as a basis for program evaluation
Portfolio Evaluation Methods
Reflection: by students, by parents, by peers; all should compare the entries to the standards for the evaluation
Conferences: meetings with individuals or small groups to discuss individual growth and achievement compared to the teacher's judgment
Progress report: look at the portfolio holistically, create rubrics (see the rubric sketch below)
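As a rough illustration of turning a holistic rubric into a progress-report judgment, the sketch below uses invented criteria, an assumed 0-4 scale, and the "developing as expected"/"needs improvement" labels from the summary-report slide; all weights and cutoffs are assumptions, not part of the original deck.

```python
# Hypothetical holistic rubric for judging a portfolio in a progress report.
# Criteria, the 0-4 scale, and the 75% cutoff are invented for illustration.
RUBRIC_SCORES = {
    "Entries meet the stated standards": 3,
    "Evidence of growth over time": 4,
    "Quality of student reflection": 2,
}
MAX_PER_CRITERION = 4

total = sum(RUBRIC_SCORES.values())
possible = MAX_PER_CRITERION * len(RUBRIC_SCORES)

# Convert the rubric scores into a holistic judgment for the progress report.
judgment = "developing as expected" if total / possible >= 0.75 else "needs improvement"
print(f"Rubric score: {total}/{possible} -> {judgment}")
```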
Developmental Checklists
Used for observing, recording, and evaluating behaviors
The performance indicators reflect expectations for developmentally appropriate activities; rated as "not yet", "in process", or "proficient" (see the checklist sketch below)
Examples: uses strength and control to perform fine motor tasks; uses eye-hand control to perform fine motor tasks
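A developmental checklist can be recorded very simply. The sketch below assumes the three rating levels named above; the third indicator and the student's ratings are invented for illustration.

```python
# Minimal sketch of one student's developmental checklist record.
# Rating levels come from the slide; indicators and ratings are hypothetical.
from collections import Counter

RATINGS = ("not yet", "in process", "proficient")

checklist = {
    "Uses strength and control to perform fine motor tasks": "in process",
    "Uses eye-hand control to perform fine motor tasks": "proficient",
    "Manipulates small objects with ease": "not yet",  # invented third indicator
}

# Validate the ratings and summarize the checklist for a progress report.
assert all(r in RATINGS for r in checklist.values()), "unknown rating"
summary = Counter(checklist.values())
for level in RATINGS:
    print(f"{level}: {summary[level]} indicator(s)")
```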
Portfolio Collection
Samples are selected that are common to all learners
Other items that capture the uniqueness of individual learners may also be chosen
The learner is allowed to be involved in the selection process and may judge the quality of his or her own work
Summary Report
The checklists and portfolios are reviewed and judged
Judgments are made in terms of "developing as expected" or "needs improvement"; progress is "as expected" or "not as expected"
The report gives comments on strengths and weaknesses as well as steps to support the learner's academic growth
Evaluation Summary
Good evaluation:
Informs programmatic change
Occurs on a regular basis
Is planned
Is based on multiple data sources
Data should inform the decision, not make it
How do you look at instructional effectiveness, course productivity, and program effectiveness with regard to the curriculum that you are mapping?