Formative and Summative Evaluations
Instructional Design For Multimedia
Evaluation Phases
[Diagram: formative evaluation accompanies the Analysis, Design, and Development phases; summative evaluation follows Implementation.]
Formative Evaluation
Occurs before implementation
Determines the weaknesses in the instruction so that revisions can be made
Makes instruction more effective and efficient
Formative Evaluation Is Especially Important When…
Designer is a novice
Content area is new
Technology is new
Audience is unfamiliar
Task performance is critical
Accountability is high
Client requests/expects evaluation
Instruction will be disseminated widely
Opportunities for later revision are slim
Formative Evaluation Phases
Design Reviews
Expert Reviews
Learner Validation: One-to-One Evaluation, Small-Group Evaluation, Field Trials
Ongoing Evaluation
Design Reviews Should take place after each step of the design process
Goal Review
Review of Environment and Learner Analysis
Review of Task Analysis
Review of Assessment Specifications
Design Reviews Goal Review
Question: Does the instructional goal reflect a satisfactory response to the problems identified in the needs assessment?
Possible Methods: Have the client review and approve the learning goals
Design Reviews Environment and Learner Analysis Review
Question: Do the environment and learner analyses accurately portray the instructional setting and the learners?
Possible Methods:
Collect survey or aptitude data
Give a reading test to sample learners
Survey managers to confirm attitudes
Design Reviews Task Analysis Review
Question: Does the task analysis include all of the prerequisite skills and knowledge needed to perform the learning goal, and is the prerequisite nature of these skills and knowledge accurately represented?
Possible Methods:
Test groups of learners with and without the prerequisite skills
Give a pretest on the skills to be learned to a sample of the target audience
Design Reviews Assessment Specification Review
Question: Do the test items and the resulting test blueprint reflect reliable and valid measures of the instructional objectives?
Possible Methods:
Have experts review assessment items
Administer assessment instruments to skilled learners to determine practicality (a simple item-analysis sketch follows)
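To make the "administer to skilled learners" check concrete, here is a minimal sketch of a classical item analysis in Python (the items and score matrix are hypothetical; statistics.correlation requires Python 3.10+). Difficulty is the proportion of learners answering an item correctly; discrimination is the correlation between an item score and the total score.

```python
# Minimal sketch of a classical item analysis; the 0/1 score matrix
# and item labels are hypothetical. Requires Python 3.10+.
import statistics

# Rows = learners, columns = items Q1..Q4 (1 = correct, 0 = incorrect)
scores = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 0],
    [0, 0, 0, 1],
]

totals = [sum(row) for row in scores]  # total score per learner

for item in range(len(scores[0])):
    item_scores = [row[item] for row in scores]
    # Difficulty: proportion of learners who answered correctly
    difficulty = sum(item_scores) / len(item_scores)
    # Discrimination: correlation between item score and total score
    discrimination = statistics.correlation(item_scores, totals)
    print(f"Q{item + 1}: difficulty={difficulty:.2f}, "
          f"discrimination={discrimination:.2f}")
```

Items with very high or very low difficulty, or near-zero discrimination, are candidates for revision before the test blueprint is finalized.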
Expert Reviews
Should take place when instructional materials are in draft form
Experts include:
Content Experts
Instructional Design Experts
Content-Specific Educational Experts
Learner Experts
Expert Reviews Content Experts
Subject matter experts (SMEs) review for accuracy and completeness
Is the content accurate and up to date?
Does the content present a consistent perspective?
Example: a physics expert
Expert Reviews Instructional Design Experts
Reviews the instructional strategy in light of instructional theory
Are the instructional strategies consistent with principles of instructional theory?
Example: an instructional designer
Expert Reviews Content-Specific Educational Expert
Reviews the pedagogical approach in the content area
Is the pedagogical approach consistent with current instructional theory in the content area?
Example: a science education specialist
Expert Reviews Learner Expert
Reviews appropriateness for the target learners, including vocabulary, examples, and illustrations
Are the examples, practice exercises, and feedback realistic and accurate?
Is the instruction appropriate for the target learners?
Example: a 6th-grade teacher
Expert Reviews Process
Distribute draft materials to the experts
Collect comments and prioritize them into categories (see the sketch below), such as:
Critical: revisions should be made immediately
Non-critical: disregard or address at a later date
More info: find more data or information
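As a minimal sketch of the triage step above (the reviewers and comment text are hypothetical), the categories can be tracked in a simple structure and reported critical-first:

```python
# Minimal sketch for triaging expert-review comments into the
# priority categories above; reviewers and comments are hypothetical.
from dataclasses import dataclass

@dataclass
class ReviewComment:
    reviewer: str   # e.g., "SME" or "6th-grade teacher"
    text: str
    category: str   # "critical", "more-info", or "non-critical"

comments = [
    ReviewComment("SME", "Unit 2 formula is out of date", "critical"),
    ReviewComment("6th-grade teacher", "Vocabulary too advanced", "critical"),
    ReviewComment("ID expert", "Add practice before the quiz", "more-info"),
    ReviewComment("SME", "Prefers a serif font", "non-critical"),
]

# Critical comments first, so those revisions can be made immediately
for category in ("critical", "more-info", "non-critical"):
    print(f"--- {category} ---")
    for c in comments:
        if c.category == category:
            print(f"[{c.reviewer}] {c.text}")
```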
Learner Validation
Try the instruction with representative learners to see how well they learn and what problems arise as they engage with it
One-to-One Evaluations
Small-Group Evaluation
Field Trials
Learner Validation One-to-One Evaluation
Present materials to one learner at a time
Typical problems that might arise:
Typographical errors
Unclear sentences
Poor or missing directions
Inappropriate examples
Unfamiliar vocabulary
Mislabeled pages or illustrations
Make revisions to the instruction
Conduct more evaluations if necessary
Learner Validation One-to-One Evaluation Process
Present materials to the student
Watch the student interact with the material
Employ the “read-think-aloud” method: continually query the student about problems they face and what they are thinking
Assure the student that problems in the instruction are not their fault
Tape-record or take notes during the session
Reward participation
Learner Validation Small Group Evaluation
Present materials to 8–12 learners
Administer a questionnaire to obtain general demographic data and attitudes or experiences
Problems that might arise:
Students have more or fewer entry-level skills than anticipated
Course was too long or too short
Learners react negatively to the instruction
Make revisions to the instruction
Conduct more evaluations if necessary
Learner Validation Small-Group Evaluation Process
Administer entry-level tests and pretests to the students
Present the instruction to the students in a natural setting
Observe the students interacting with the materials
Take notes and/or videotape the session
Intervene only when the instruction cannot proceed without assistance
Administer a posttest
Administer an attitude survey or hold a discussion
Reward participation
Learner Validation Field Trials Evaluation
Administer the instruction to 30 or more students
Problems that might arise:
Instruction is not implemented as designed
Students have more or fewer entry-level skills than anticipated
Assessments are too easy or too difficult
Course is too long or too short
Students react negatively to the instruction
Make revisions
Conduct more field trials if necessary
Learner Validation Field Trials Evaluation Process
Administer the instruction to students in a normal setting, in various regions and with varying socioeconomic status
Collect and analyze data from pretests and posttests (a sketch of such an analysis follows)
Conduct follow-up interviews if necessary
Administer a questionnaire to the instructors who deliver the training
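As a minimal sketch of that pretest/posttest analysis (the scores are hypothetical, and SciPy is assumed to be available), a paired comparison shows whether scores changed reliably from pretest to posttest:

```python
# Minimal sketch of a pretest/posttest comparison for a field trial.
# Scores are hypothetical; a real trial would include 30 or more learners.
from statistics import mean
from scipy import stats

pretest  = [52, 61, 48, 70, 55, 63, 58, 49, 66, 60]
posttest = [68, 75, 60, 82, 71, 70, 74, 61, 80, 69]

# Average improvement from pretest to posttest
gains = [post - pre for pre, post in zip(pretest, posttest)]
print(f"Mean gain: {mean(gains):.1f} points")

# Paired t-test: did scores change reliably across the same learners?
t_stat, p_value = stats.ttest_rel(posttest, pretest)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```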
Formative Evaluation Ongoing Evaluation
Continue to collect and analyze data
Collect all comments and changes made by teachers who deliver the instruction
Keep track of changes in the learner population
Revise the instruction, or produce new material to accompany it, as needed
Formative Evaluation Summary
Conduct design reviews after each stage of design, covering goals, environment and learner analysis, task analysis, and assessment specifications
Conduct expert reviews with content, instructional design, content-specific education, and learner experts
Conduct one-to-one evaluations with students
Conduct small-group evaluations with 8–12 students
Conduct field trials with 30 or more students
Conduct ongoing evaluations
Summative Evaluation
Occurs after implementation (after the program has completed a full cycle)
Determines the effectiveness, appeal, and efficiency of the instruction
Assesses whether the instruction adequately solves the “problem” identified in the needs assessment
Summative Evaluation Phases
Determine Goals
Select Orientation
Select Design
Design/Select Evaluation Measures
Collect Data
Analyze Data
Report Results
Summative Evaluation Determine Goals
Identify questions that should be answered as a result of the evaluation:
Does implementation of the instruction solve the problem identified in the needs assessment?
Do the learners achieve the goals of the instruction?
How do the learners feel about the instruction?
What are the costs of the instruction, and what is the return on investment (ROI)? (see the sketch after this list)
How much time does it take for learners to complete the instruction?
Is the instruction implemented as designed?
What unexpected outcomes result from the instruction?
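The ROI question above is just arithmetic once costs and benefits are estimated; here is a minimal sketch with hypothetical dollar figures:

```python
# Minimal sketch of a return-on-investment (ROI) calculation for a
# training program. All figures are hypothetical.
development_cost = 40_000   # analysis, design, and development
delivery_cost = 15_000      # facilitators, facilities, materials
total_cost = development_cost + delivery_cost

# Estimated annual benefit, e.g., from reduced errors or time saved
annual_benefit = 90_000

roi_percent = (annual_benefit - total_cost) / total_cost * 100
print(f"ROI = {roi_percent:.0f}%")  # (90000 - 55000) / 55000 -> ~64%
```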
Summative Evaluation Determine Goals
Select indicators of success
If the program is successful, what changes will we observe in:
Instructional materials?
Learners’ activities?
Teachers’ knowledge, practice, and attitudes?
Learners’ understanding, processes, skills, and attitudes?
Summative Evaluation Select Orientation
Come to an agreement with the client on the most appropriate orientation for the evaluation
Objectivism: observation and quantitative data are used to determine the degree to which the goals of the instruction have been met
Subjectivism: expert judgment and qualitative data, not tied to the stated instructional goals
Summative Evaluation Select Design of Evaluation
What data will be collected, when, and under what conditions?
Instruction, Posttest
Pretest, Instruction, Posttest
Pretest, Instruction, Posttest, Posttest, Posttest (repeated posttests to gauge retention over time)
Summative Evaluation Design or Select Evaluation Measures
Payoff Outcomes: review statistics that may have changed after the instruction was implemented
Learning Outcomes: measure for an increase in test scores
Attitudes: conduct interviews, questionnaires, and observations
Level of Implementation: compare the design of the program to how it is actually implemented
Costs: examine the costs to implement and continue the program, including personnel, facilities, equipment, and materials
Summative Evaluation Collect Data
Devise a plan for the collection of data that includes a schedule of data-collection periods (a sketch of such a schedule follows)
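One way to make that plan concrete is a simple table of collection periods; a minimal sketch, with hypothetical instruments, data sources, and dates:

```python
# Minimal sketch of a data-collection plan with scheduled periods.
# Instruments, data sources, and dates are hypothetical.
from datetime import date

collection_plan = [
    # (collection date, instrument, data source)
    (date(2024, 9, 2), "entry-skills pretest", "learners"),
    (date(2024, 10, 18), "posttest", "learners"),
    (date(2024, 10, 21), "attitude questionnaire", "learners"),
    (date(2024, 10, 25), "implementation checklist", "instructors"),
    (date(2025, 1, 13), "retention posttest", "learners"),
]

# Print the schedule in chronological order
for when, instrument, source in sorted(collection_plan):
    print(f"{when:%Y-%m-%d}: {instrument} ({source})")
```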
Summative Evaluation Analyze Data
Analyze the data so that it is easy for the client to see how the instructional program affected the problem presented in the needs assessment.
Summative Evaluation Report Results
Prepare a report of the summative evaluation findings that includes:
Summary
Background
Description of the Evaluation Study
Results
Discussion
Conclusions and Recommendations
Summative Evaluation Summary
Determine the goals of the evaluation
Select an objective or subjective orientation
Select the design of the evaluation plan
Design or select evaluation measures
Collect the data
Analyze the data
Report the results