Published by Chloe Garrett. Modified over 9 years ago.
1
Dr. Jeff M. Allen, Department of Technology and Cognition. Assessment of Courses: An Open Forum through the University Forum on Teaching, Learning and Assessment. Sponsored by the Office of the Vice-President of Academic Affairs. Can be downloaded at: www.coe.unt.edu/Allen/download.htm
2
Session Objectives: Discuss general classroom assessment. Discuss outcome-oriented evaluation. Facilitate an open discussion about assessment.
3
Needs Assessment: What do you know about assessment or evaluation? What are your expectations of this session?
4
What Is the Difference Between Education and Training?
5
Consider your reaction if your child came home and told you: "We attended sex education today!" versus "We attended sex training today!"
6
Evaluation Myth: "I can't measure the results of my program."
10
Consider using different "levels" of evaluation: Comfort (environment), Formative (progress), Summative (learning outcomes), Transfer (application of learning).
11
Contextual Learning. Consider the learner's questions: Why do I have to learn this? Where am I going to use it? How does this relate to where I'm going in life?
12
Evaluation Myth: "I don't need to justify my existence; I have a proven track record."
13
Evaluation Myth: "My ___ doesn't require evaluation, so why should I do it?"
14
Consider: What if ___ is no longer in that position? What about tough economic times? Budget cuts? What if someone doesn't have the same amount of faith?
15
Why Evaluate?
16
Goals of Assessment (Classroom): Provide a basis for planning and directing. Determine whether training/education makes a difference. Assess progress and achievement. Support continuous improvement.
17
Goals of Assessment (Programs): Provide a basis for planning and directing. Assess progress and achievement. Evaluate curriculum, programs, staff, and facilities. Certify, rate, and accredit (for example, SACS and THECB). Monitor expenditures.
18
Goals of Assessment (Programs, continued): Justify existence. Accountability. Pressure from other organizations. Marketing. Continuous improvement.
19
Please remember: Not everything is quantifiable. Consider evaluation EARLY. Some benefits cannot be measured immediately.
20
A Complete Results-Oriented Model. Source: Phillips, J. (1991). Handbook of Training Evaluation and Measurement Methods (2nd ed.). Houston, TX: Gulf. Latest edition: Phillips, J. (1997), 3rd ed., ISBN 0-88415-387-8.
25
Results-Oriented Model (a continuous-improvement cycle)
BEFORE:
1. Conduct a needs analysis
2. Develop tentative objectives
3. Establish baseline data
4. Select evaluation methods/designs
5. Finalize program objectives
7. Estimate program cost/benefits
8. Prepare and present proposal
9. Design evaluation instruments
10. Determine and develop content
11. Design or select training and development methods
12. Test program and make revisions
DURING:
13. Implement and conduct program
14. Evaluate/collect data at proper stages
15. Analyze and interpret data (improvement notes)
AFTER:
16. Make adjustments
17. Calculate return on investment
18. Communicate results
The AFTER stage feeds back into continuous improvement.
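The return-on-investment step of this model is commonly computed as net program benefits divided by program costs, expressed as a percentage. A minimal sketch of that calculation is below; the function name and the dollar figures are illustrative assumptions, not values from the presentation.

```python
def training_roi(benefits, costs):
    """Return-on-investment as a percentage: net benefits relative to costs.

    benefits: total monetary benefits attributed to the program
    costs:    total program costs (must be positive)
    """
    if costs <= 0:
        raise ValueError("costs must be positive")
    return (benefits - costs) / costs * 100

# Illustrative figures only (hypothetical program):
# $150,000 in benefits against $100,000 in costs.
print(training_roi(benefits=150_000, costs=100_000))  # -> 50.0
```

A program that merely recovers its costs yields 0% ROI under this formula, which is why baseline data (step 3) and cost/benefit estimates (step 7) are collected before the program runs.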
26
Limitations of Evaluation and Assessment: Evaluation and assessment generally do not solve problems; they serve to identify them. Improvement is the next step!
27
Dr. Jeff M. Allen
University of North Texas, Department of Technology and Cognition
P.O. Box 311337, Denton, Texas 76203-1337
Email: Jallen@unt.edu | Phone: (940) 565-4918
To download this presentation: http://www.coe.unt.edu/Allen/download.htm (available until February 2000)