Teaching, Learning, and Testing: Finding Congruence

Presentation transcript:

Teaching, Learning, and Testing: Finding Congruence BILC Professional Seminar Zagreb, Croatia 16 October 2018 Ray Clifford

An old proverb says, "Time is the best teacher."

But it has also been noted that: "Time may be the best teacher, but it eventually kills all of its students."

A seminal work in Education is Bloom’s Taxonomy of Educational Objectives. Note: the author called this book "the most-quoted, least-read book ever published." There is a revised version titled A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives, Abridged Edition. Editors, Lorin W. Anderson, et al. Longman, New York, 2001. www.celt.iastate.edu/teaching/RevisedBlooms1.html

The revised version divides the original taxonomy into two dimensions: 1. Ways of thinking and 2. Types of knowledge. This revision is summarized in the chart on the next slide. Thinking: the columns represent the six levels of the Cognitive Process Dimension, ranging from lower-order thinking skills at the left to higher-order thinking skills at the right. Knowing: the rows represent the Knowledge Dimension, ranging from concrete knowledge at the bottom to abstract knowledge at the top.

Bloom’s Revised Taxonomy (with added highlighting and annotation showing common combinations)
Cognitive Process Dimension (columns, left to right): Remember, Understand, Apply, Analyze, Evaluate, Create
Knowledge Dimension (rows, top to bottom): Metacognitive, Procedural, Conceptual, Factual

Revised Bloom’s Taxonomy: Greater Accuracy, but Greater Complexity 4 levels of knowledge. 6 levels of cognition. 4 x 6 equals 24 interactions (or combinations). Is this increased accuracy useful? How much complexity can humans handle?
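To make the 4 x 6 count concrete, here is a minimal sketch (Python; purely illustrative and not part of the original presentation) that enumerates the cells of the revised taxonomy grid. The category names follow Anderson et al.; the code itself is an assumption about how one might model the grid.

```python
# Illustrative sketch: the revised taxonomy as a 4 x 6 grid of
# (knowledge level, cognitive process) pairs.
knowledge_levels = ["Factual", "Conceptual", "Procedural", "Metacognitive"]
cognitive_processes = ["Remember", "Understand", "Apply",
                       "Analyze", "Evaluate", "Create"]

# Every pairing of a knowledge level with a cognitive process is one cell.
combinations = [(k, c) for k in knowledge_levels for c in cognitive_processes]
print(len(combinations))  # 24 possible interactions (4 x 6)
```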

“How Many Variables Can Humans Process?” Graeme S. Halford, et al. Psychological Science, Vol. 16, No. 1, pp. 70-76, American Psychological Society, 2005. Page 70. n = 30. Three quotes from the research findings: “As the order [the number] of the interaction[s] increases, the number of variables increases.” “Results showed a significant decline in accuracy and speed of [problem] resolution from three-way to four-way interactions.” “Furthermore, performance on a five-way interaction was at [the] chance level.”

“How Many Variables Can Humans Process?” Graeme S. Halford, et al. Psychological Science, Vol. 16, No. 1, pp. 70-76, American Psychological Society, 2005. n = 30 (trained individuals).
Problem Complexity: 2-way / 3-way / 4-way / 5-way*
2 Problems Right: 30 / 26 / 13 / NA
1 Problem Right: 4 / 12
0 Problems Right: 10
My interpretation: Good / OK / Poor / Terrible
* One problem, n = 22. Chance responses, consistent with results extrapolated from the other study.

Can Bloom’s revised taxonomy be simplified to make it more usable? The Knowledge and Cognitive Dimensions can be (re)combined into three levels: Direct application of factual knowledge. Near transfer of conceptual knowledge. Far transfer of abstract knowledge. These three levels can be applied to learning and to testing activities.

The expected level of learning (3 Levels of Learning Outcomes): Direct Application, Near Transfer, Far Transfer.

The 1st Level of Learning Outcomes: With direct application learning, students… Memorize and practice specific responses. Focus on the content of a specific course, textbook, or curriculum. Learn only what is taught.

The 2nd Level of Learning Outcomes: With near transfer learning, students… Go beyond rote responses and use rehearsed and semi-rehearsed responses. Focus on a specific set of tasks and communicative settings and respond within those domains. Apply what they learn within a range of familiar, predictable settings.

The 3rd Level of Learning Outcomes: With far transfer learning, students… Develop the ability to transfer what is learned from one context to another. Acquire the knowledge and skills needed to respond spontaneously to new, unknown, or unpredictable situations. Learn how to continue learning and to become independent learners.

The testing method used (3 Types of Tests): Achievement, Performance, Proficiency.

The 1st Type of Test: Achievement tests measure… Practiced, memorized responses. Acquisition of what was taught. The content of a specific textbook or curriculum.

The 2nd Type of Test: Performance tests measure… Rehearsed and semi-rehearsed responses. One’s ability to respond in constrained, familiar, and predictable settings. The transfer of learning to similar situations.

The 3rd Type of Test: Proficiency tests measure… Whether skills are transferable to new tasks. One’s spontaneous, unrehearsed abilities. The general ability to accomplish tasks across a wide variety of real-world settings.

Advanced Proficiency Requires Far Transfer Learning
A By-Level Proficiency Summary with Text Types (Green = Far Transfer, Blue = Near Transfer, Red = Direct Application)
NATO Level 5: Function/Tasks: All expected of an educated NS [Books]. Context/Topics: All subjects. Accuracy: Accepted as a well-educated NS.
NATO Level 4: Function/Tasks: Tailor language, counsel, motivate, persuade, negotiate [Chapters]. Context/Topics: Wide range of professional needs. Accuracy: Extensive, precise, and appropriate.
NATO Level 3: Function/Tasks: Support opinions, hypothesize, explain, deal with unfamiliar topics [Multiple pages]. Context/Topics: Practical, abstract, special interests. Accuracy: Errors never interfere with communication & rarely disturb.
NATO Level 2: Function/Tasks: Narrate, describe, give directions [Multiple paragraphs]. Context/Topics: Concrete, real-world, factual. Accuracy: Intelligible even if not used to dealing with non-NS.
NATO Level 1: Function/Tasks: Q & A, create with the language [Multiple sentences]. Context/Topics: Everyday survival. Accuracy: Intelligible with effort or practice.
Below Level 1: Function/Tasks: Memorized [Words and Phrases]. Context/Topics: Random. Accuracy: Unintelligible.

A Personal Observation

Tests Define Instructional Expectations Students (especially adult learners) don’t want to waste their time studying what is not going to be “needed”. For students (and often for commanders, teachers, and administrators), it is the tests used, not a course’s stated learning objectives, that define what is “needed”. Therefore, tests can have a negative or a positive impact on learning.

“Washback” Effects of Testing Testing has a negative impact when: Educational goals are reduced to those that are most easily measured. Testing procedures do not reflect course goals, for instance… Giving multiple-choice tests in speaking classes. Using grammar tests as a measure of general proficiency.

“Washback” Effects of Testing Testing has a positive impact when: Tests reinforce course objectives. Tests act as change agents for improving teaching and learning. The test results are useful for students, teachers, and administrators.

Aligning Teaching and Testing Promotes Learning
Direct Application <=> Achievement: Memorized responses using the content of a specific textbook or curriculum.
Near Transfer <=> Performance: Rehearsed ability to communicate in specific, familiar settings.
Far Transfer <=> Proficiency: Unrehearsed general ability to accomplish real-world communication tasks across a wide range of topics and settings.
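As an aside, this alignment rule can be written as a simple lookup. The sketch below (Python; hypothetical and not part of the presentation) pairs each learning-outcome level named on the slide with its matching test type and checks whether a given combination is aligned.

```python
# Hypothetical illustration of the alignment rule above; the labels come
# from the slides, the function itself is only a sketch.
MATCHING_TEST = {
    "direct application": "achievement",   # memorized, textbook-specific content
    "near transfer":      "performance",   # rehearsed tasks in familiar settings
    "far transfer":       "proficiency",   # unrehearsed, real-world tasks
}

def is_aligned(learning_outcome: str, test_type: str) -> bool:
    """Return True when the test type matches the intended learning outcome."""
    return MATCHING_TEST.get(learning_outcome.lower()) == test_type.lower()

print(is_aligned("Far Transfer", "Proficiency"))  # True: aligned
print(is_aligned("Far Transfer", "Achievement"))  # False: learning suffers
```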

When teaching and testing are not aligned, learning suffers.
Direct Application Teaching + Proficiency Testing => Learners will fail the tests. Learners won’t be prepared for the tests. Motivation will be reduced.
Far Transfer Teaching + Achievement Testing => You’ll get Direct Application learning. Students will adjust their learning down to the level of the tests.

When teaching and testing are not aligned, learning suffers.
Direct Application Teaching + Proficiency Testing => Learners will fail the tests.
Far Transfer Teaching + Achievement Testing => You’ll get Direct Application learning.
Note # 1: The type of test used can limit the level of students’ learning.

Curriculum, Instructional, and Testing Constraints Can Also Reduce the Scope of Learning…
1. Instructional Goals and Learning Outcomes: High academic goals are set and learner outcomes are defined.
2. Textbook: Developers include only examples of the most frequently occurring or important goals in a textbook.
3. Teaching: Teachers present as much of the textbook as time allows.
4. Test: Students are tested on a sample of items drawn from the textbook.

Curriculum, Instructional, and Testing Constraints Can Reduce the Scope of Learning…
1. Instructional Goals and Learning Outcomes: High academic goals are set and learner outcomes are defined.
2. Textbook: Developers include only examples of the most frequently occurring or important goals in a textbook.
3. Teaching: Teachers present as much of the textbook as time allows.
4. Test: Students are tested on a sample of items drawn from the textbook.
Note # 2: The reduced content of the tests used can limit the breadth of students’ learning.

An Alternative Teaching and Testing Model
1. Real-world Instructional Domains (cognitive understanding, psychomotor skills, and affective insights): Set instructional goals and define expected learner outcomes.
2a. Textbook: Course developers sample from the real-world domain areas to create a textbook.
2b. Test: Test developers use an independent sample of the real-world domain areas to create proficiency tests that are not based on the textbook.
3. Teacher: Teachers adapt text materials to learners’ abilities, diagnose learning difficulties, adjust activities and add supplemental materials to help students apply new knowledge and skills in constrained achievement and performance areas, and then in real-world proficiency settings.
4. Students: Students practice, expand, and then demonstrate their unrehearsed extemporaneous proficiency across a broad range of real-world settings that are not in the textbook.

Simply stated, “good” teaching (reinforced by “good” testing) can expand the scope of learning.
1. Instructional Goals and Learning Outcomes (STANAG 6001?): High academic goals are set and learner outcomes are defined.
2. Textbook: Developers include only examples of the most frequently occurring or important goals in a textbook.
3. Teaching: Teachers present as much of the textbook as time allows.
4. Test: Students are tested on a sample of items drawn from the textbook.

So, what constitutes “good” teaching and testing? Making your teaching/learning activities match the level of your desired learner outcomes. Using the kinds of tests that match the level of those learning outcomes: Achievement tests for direct application objectives. Performance tests for near transfer objectives. Proficiency tests for far transfer objectives.

If those suggestions for “good” teaching are followed, a teaching and testing model will emerge that: Will not be based on successively derived, reduced subsets of the original objectives. Will maintain the students’ focus on the course’s true learning objectives. Will reinforce the teacher’s role as a “facilitator” of learning, rather than as a “presenter” of information.

Keep it simple! Work with a maximum of three variables.
3 x 3 x 3 = 27 combinations doesn’t work: the mind can only handle 3 interactions at a time.
Desired Learning Outcomes: a. Direct application, b. Near transfer, c. Far transfer
Teaching Activities: a. Presentation, b. Application, c. Problem solving
Testing: a. Achievement, b. Performance, c. Proficiency
Reduce complexity by aligning levels; i.e., use all “a”, all “b”, or all “c” levels.
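A short sketch of the counting argument (Python; illustrative only, the level labels come from the slide): enumerating all 3 x 3 x 3 combinations gives 27, while requiring aligned levels leaves only 3.

```python
# Illustrative sketch of "27 combinations vs. 3 aligned levels".
from itertools import product

outcomes   = ["a. Direct application", "b. Near transfer", "c. Far transfer"]
activities = ["a. Presentation",       "b. Application",   "c. Problem solving"]
tests      = ["a. Achievement",        "b. Performance",   "c. Proficiency"]

all_combinations = list(product(outcomes, activities, tests))

# Aligned = all three items share the same "a"/"b"/"c" level prefix.
aligned = [combo for combo in all_combinations
           if len({item[0] for item in combo}) == 1]

print(len(all_combinations))  # 27 -- more interactions than the mind can manage
print(len(aligned))           # 3  -- all-"a", all-"b", or all-"c"
```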

And once again… If things align, everything’s fine! Thank you.