SCIS ID Workshop: Learning Objectives & Exam Development
John Ollerenshaw & Marguerite Koole
Educational Media Development
Athabasca University

Outline
1. Introduction
2. Sample unit
3. What are learning objectives?
4. Advice for writing learning objectives
5. Review the objectives in the sample unit
6. Exams & the AU mandate
7. What makes a fair exam?
8. Review the sample exam

Sample Unit
Browse through the sample unit.
Pay attention to the learning objectives.
What would you correct?

Learning Objectives
"A learning objective is a statement that tells what learners should be able to do when they have completed a segment of instruction" (Smith & Ragan, 1999, p. 84).
For example: "By the end of this unit, learners will be able to distinguish criterion-referenced from norm-referenced exams."

Exam Development
University mandate
Norm-referenced exams
Criterion-referenced exams
Validity, reliability & practicality

AU Mandate
Athabasca University, Canada's Open University, is dedicated to the removal of barriers that restrict access to, and success in, university-level studies and to increasing equality of educational opportunity for adult learners worldwide.

Types of Exams
Criterion-referenced:
- Determine competency
- Identify gaps or areas of remediation
- Based on unit objectives
- Not concerned with curves
- Fits with the AU mandate
Norm-referenced:
- Used for ranking and comparison of students in order to restrict entry into programs or courses
- Not helpful in determining gaps
- Use curves
- Does not fit with the AU mandate
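To make the contrast concrete, here is a minimal, hypothetical sketch (not part of the workshop materials): a criterion-referenced decision compares each score to a fixed mastery cut-off derived from the objectives, while a norm-referenced decision ranks students against one another. The cut-off, scores, and student labels below are invented for illustration.

```python
# Hypothetical data: four students' exam scores (percentages).
scores = {"A": 82, "B": 74, "C": 91, "D": 68}

# Criterion-referenced: mastery is defined by the objectives, e.g. a 75% cut-off.
cut_off = 75
mastery = {student: score >= cut_off for student, score in scores.items()}

# Norm-referenced: each student's percentile rank relative to the group.
ordered = sorted(scores.values())
percentile = {
    student: 100 * sum(v <= score for v in ordered) / len(ordered)
    for student, score in scores.items()
}

print(mastery)     # {'A': True, 'B': False, 'C': True, 'D': False}
print(percentile)  # {'A': 75.0, 'B': 50.0, 'C': 100.0, 'D': 25.0}
```

The first result says whether each learner has met the objectives; the second only says where each learner stands in the group, which is why it does not help identify gaps.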

What makes a good exam?
Review the sample exam.
What's wrong with the exam?

Validity
Does the question/exam measure what it is supposed to measure?
Are the questions consistent with the objectives?
Are the questions representative of the range of objectives?
Are the objectives adequately sampled?
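One practical way to check the last two questions is a simple exam blueprint that tallies how many questions address each objective. The sketch below is an illustration only; the objective IDs and the question-to-objective mapping are hypothetical.

```python
from collections import Counter

# Hypothetical blueprint: which objective each exam question targets.
objectives = ["LO1", "LO2", "LO3", "LO4"]
question_to_objective = {
    "Q1": "LO1", "Q2": "LO1", "Q3": "LO2",
    "Q4": "LO2", "Q5": "LO2", "Q6": "LO4",
}

# Count questions per objective and flag objectives the exam never samples.
counts = Counter(question_to_objective.values())
for lo in objectives:
    print(f"{lo}: {counts.get(lo, 0)} question(s)")

missing = [lo for lo in objectives if counts.get(lo, 0) == 0]
if missing:
    print("Not sampled by the exam:", ", ".join(missing))
```

Here LO3 would be flagged as unsampled, and LO2's three questions versus LO1's two show at a glance how evenly the objectives are represented.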

Reliability
Does the exam consistently measure what it is supposed to measure?
Eliminate guessing.
More items per objective means higher reliability.
Use an odd number of questions per objective.
Statistics: Cronbach's alpha
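Cronbach's alpha can be computed directly from an item-score matrix. The sketch below is a minimal illustration, assuming item scores are stored as a students-by-items NumPy array; the data are invented, not from the workshop.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = scores.shape[1]                          # number of items
    item_variances = scores.var(axis=0, ddof=1)  # variance of each item across students
    total_variance = scores.sum(axis=1).var(ddof=1)  # variance of students' total scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical data: 5 students answering 4 items scored 0/1.
scores = np.array([
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [0, 1, 1, 1],
    [1, 0, 1, 1],
    [0, 0, 0, 0],
])
print(round(cronbach_alpha(scores), 3))
```

Adding well-aligned items per objective tends to raise alpha, which is the statistical counterpart of the "more items per objective" advice above.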

Practicality
Conditions & scenarios close to real life or those presented in the unit of learning
Considerations: cost, marking time, lag time
Does the student have enough time to complete the exam?

Other stuff...
Is the value of each question clearly indicated?
Are the instructions clear?

Sample Exam
Review the sample exam.
Use the exam checklist.

Things to remember...
All readings, commentary, exercises, activities, and exam questions should be based on the learning objectives!

More information:
In Moodle, you will find annotated copies of the sample unit and exam as well as other related resources. Please follow the instructions below.
1. Log into Moodle.
2. Select zCourse COMP ID.
3. Create a new account.
4. Enrolment key: SCISid