Objectives: Identify the role of thinking in learning


Objectives
- Identify the role of thinking in learning
- Use a 'model of thinking' to 'model good thinking'
- Identify the components of a Thinking Curriculum
- Evaluate a range of active learning strategies for promoting types of thinking
- Produce real-world performance tasks

Good thinking, what's that?
"I want good thinking on this." This involves critical thinking: have I seen this problem before? What are the likely causes? What information do I need to clearly interpret what's occurring?

Problems of Definition
"But the heart of the problem is our failure to define such terms as critical thinking, problem solving, metacognition, reasoning, and abstract thinking. Without adequate definition and training, teachers lack the knowledge and skills to teach and test for these desirable but elusive human qualities." (Haladyna, 1997)

Thinking – a major key for effective learning
"The best thing we can do, from the point of view of the brain and learning, is to teach our learners how to think." (Jensen, 1996, p.163)
"Thought is the key to knowledge. Knowledge is discovered by thinking, analyzed by thinking, organized by thinking, transformed by thinking, assessed by thinking, and, most importantly, acquired by thinking." (Paul, 1993, p. vii)
Thinking is the key cognitive process that builds understanding.

Knowledge and rote learning, as well as thinking, are important in effective learning
Debates about the relative merits of teaching content vs process, transmission of knowledge vs discovery learning, thinking vs rote learning, etc., only cloud rather than help effective pedagogy. For example, there is now virtual agreement among cognitive psychologists that effective thinking - however defined - needs an extensive and well-organized knowledge base. As Resnick (1989) summarizes: "Study after study shows that people who know more about a topic reason more profoundly about that topic than people who know little about it." (p.4)
Similarly, Satinover (2001), drawing on recent brain research, makes the case for the importance of repetition in the learning process: "…these mundane chores are precisely what turns the fourth brain from a mass of randomness into an intellect of dazzling capacity. 'Genius,' according to Thomas Edison, 'is one percent inspiration and ninety-nine percent perspiration.' Of 'critical thinking skills,' he had nothing to say." (p.49)

Diploma in Chemical Engineering CP5033 Plant Safety & Loss Prevention
What is thinking?
Thinking is the conscious and goal-directed mental activity we do in order to solve problems.
Copyright 2010: D. Sale & SM Cheah. All Rights Reserved

In a perfect world, we would not have to think, because we would never have to solve any problems.

Example problems: "Find me a girlfriend – potential wife"; "Wife leaves me for Brad Pitt – what to do, lah?"

A Model of Thinking
The model comprises six types of thinking: Generating Possibilities, Comparison & Contrast, Analysis, Inference & Interpretation, Evaluation, and Meta-cognition.

Generating Possibilities
What do we do when we generate possibilities?
- Generate many possibilities
- Generate different types of possibilities
- Generate novel possibilities
All creative products involve the combining of old ideas or elements in new ways.

Comparison and Contrast
What do we do when we compare and contrast?
- Identify what is similar between things (objects/options/ideas, etc.)
- Identify what is different between things
- Identify and consider what is important about both the similarities and differences
- Identify a range of situations in which the different features are applicable

Analysis
What do we do when we analyse?
- Identify the relationship of the parts to the whole in a system/structure/model
- Identify the functions of each part
- Identify the consequences to the whole if a part were missing
- Identify what collections of parts form important sub-systems of the whole
- Identify if and how certain parts have a synergistic effect

Inference and Interpretation
What do we do when we make inferences and interpretations?
- Identify intentions and assumptions in data
- Separate fact from opinion in data
- Identify key points, connections, and contradictions in data
- Make meaning of the data/information available
- Establish a best picture to make predictions

Evaluation
What do we do when we evaluate?
- Decide on what is to be evaluated
- Identify appropriate criteria from which an evaluation can be made
- Prioritize the importance of the criteria
- Apply the criteria and make a decision

Meta-cognition
What are we doing when we are meta-cognitive?
- Being aware that we can think in an organized manner
- Actively thinking about the ways in which we are thinking
- Monitoring and evaluating how effectively we are thinking
- Seeking to make more effective use of the different ways of thinking and any supporting learning/thinking strategies and tools

Thinking about your thinking
What assumptions did I make? How can I spot an error if I make one? Do I know what I need to know?
Put simply, meta-cognition is being aware of one's thinking, evaluating how well we are using the range of specific types of thinking, and taking the necessary corrective action. It involves commitment to the task, willingness to exert self-control over learning, and being aware of the level of attention needed for a task and adjusting one's focus accordingly.
Evaluation, planning, and regulation help students gain executive control of behaviour. These processes are the primary focus of many definitions of metacognition. Evaluation refers to students' ongoing assessment of their knowledge or understanding, resources, tasks, and goals. Planning involves the purposeful selection of strategies for specific tasks and is dependent on declarative and conditional knowledge. Regulation includes the monitoring and revision of progress toward goals. Evaluation, planning, and regulation should take place before, during, and after tasks.

Pedagogic Themes of a Thinking Curriculum
- Thinking as the 'tool' for understanding
- Integration of teaching, learning and assessment
- Real-world competency and problem-solving
- Performance-based assessment
- Positive dispositions and beliefs that support effective learning
- A science of learning approach to pedagogic design
This slide identifies the key underpinning pedagogical assumptions of the Thinking Curriculum. The various components are derived from established learning theories and recent brain research, as well as our own professional knowledge of how meaningful learning is best achieved.

Systematic infusion of types of thinking in the curriculum
Learning Outcomes – Types of Thinking – Instructional Strategies – Assessment System
This slide identifies the three most important components of the curriculum (learning outcomes, instructional methods/strategies, and assessment), which need to be planned in unison to achieve consistency of purpose. For example, the assessment strategy must validly assess the learning outcomes. Similarly, the instructional methods and strategies must be those which offer the best opportunities for facilitating the types of learning identified in the learning outcomes. In basic terms, this means that the types of thinking incorporated in the learning outcomes must be effectively taught through the instructional strategies used and accurately measured in the assessment system (aligned curriculum design).

Infusion Approach 1
Thinking skills infused across the curriculum: Compare & Contrast, Analysis, Inference & Interpretation, Evaluation, Generating Possibilities, Metacognition.
This method of incorporating thinking into the curriculum is advocated in the work of Professor Robert Swartz, who has written extensively in this area. In his model, he identifies a range of thinking skills which are to be appropriately and systematically infused into the content of a curriculum. He argues that thinking is best taught - and learned - in the context of the content of the curriculum.

Infusion Approach 2
Start from real-world applications of the subject content, then identify the specific types of thinking that underpin competent performance.
This method of infusion was derived from my own work and advocates a more 'bottom-up' approach than Swartz's top-down method. Rather than starting with a menu of thinking skills, begin instead with a consideration of what you want from the curriculum. I suggest that most curricula should attempt to provide a competency focus, whereby learning resembles what occurs in the real world. Having identified the real-world activities that students will be expected to perform, the types of thinking are derived by examining what highly competent professionals do in performing these activities. The following slide demonstrates this process with an example.

Identifying the Types of Thinking
Step 1: Refocus the curriculum towards real-world activities or competency.
Step 2: Identify the types of thinking that underpin competent performance in these real-world activities through COGNITIVE MODELLING. In doing this, it is useful to start by asking the question: how does a highly competent person think in the effective execution of this activity?
Example from a Business Law module – Predict possible legal outcomes in the event of a breach of contract:
- Analyse the components of a contract
- Compare and contrast the expected and the actual behaviour of defendants
- Make inferences and interpretations concerning the behaviour
- Evaluate the possibility of specific outcomes

Writing learning outcomes
Write in direct performance terms, focusing on the type of thinking or the product outcome:
- Analyse the impact of pollution on water quality
- Compare and contrast a range of retaining structures
- Generate new design options for marketing a health food
- Predict the outcomes of specified legal scenarios
- Conduct product packaging tests for a specified product
- Prepare a voyage passage plan
- Write a program in JavaScript to animate a range of figures
- Prepare a tender report
Having identified the areas of competence and the underpinning types of thinking in your curriculum, you can now construct the appropriate learning outcomes. Whether the learning outcomes focus on the types of thinking or on product outcomes does not matter. What is essential is that the learning outcomes embody the essential content knowledge, types of thinking, and other process or attitudinal components felt appropriate for effective learning in this curriculum.

Promoting thinking – general instructional principles
- Systematically teach and model the types of thinking, taking students through the range of cognitive operations for each type of thinking
- Use appropriate language to direct and reinforce types of thinking (e.g., "Let's compare and contrast these two reactors")
- Use structured questions to promote specific types of thinking (e.g., "What inferences and interpretations can we make about cloning from this data?")
- Involve students in real-world learning tasks which necessitate direct use of the types of thinking
- Consistently promote values and dispositions conducive to good thinking and effective learning (e.g., persistence, flexibility, attention to detail)
We are now quite clear on the types of instructional methods and strategies that help to promote thinking and meaningful learning. The use of active and collaborative methods, real-world learning tasks, and a classroom climate of inquiry and mutual support is backed by a wide range of recent research on how we learn and on related brain behaviour.

Instructional methods and strategies that provide opportunities for thinking
- Questioning
- Small-group activities that involve specific types of thinking (e.g., buzz groups, rounds, poster board tours)
- Co-operative learning structures
- Case studies
- Projects
- Role play
- Performance tasks that involve specific types of thinking
- Discussion/debates
- Thinking tools, e.g., mind mapping, 'Thinking Hats', Plus-Minus-Interesting, Forced Associations

The Power of Questions
"Questions are the primary way we learn virtually everything."
"Thinking itself is nothing but the process of asking and answering questions."
"Questions immediately change what we focus on and, therefore, how we feel."
(Anthony Robbins, 2001, pp.179-8)

Using Questions
The effective use of questions is a powerful means of promoting specific types of thinking, for example:
- What are the similarities and differences between Hepatitis A and HIV?
- In what ways are these differences significant?
- What inferences and interpretations can be drawn from the data on HIV infection in Asia?
- How might we evaluate the effectiveness of the present HIV prevention programme?
- What is the relationship between HIV infection and poverty?
- What other ways might we make people more aware of HIV infection?

Ways in which meta-cognitive thinking can be developed and enhanced:
- Make students aware of this distinctively human capability and how it works
- Explain and demonstrate how metacognition works
- Illustrate with a range of examples why metacognition is so important in learning and personal success
- Build metacognitive thinking into specific learning activities (e.g., project work)
- Get students to reflect on and document the quality of their thinking, identifying challenges faced in their learning and how they have gone about tackling these challenges
- Facilitate and reinforce metacognition through other 'teachable moments', whenever metacognitive thinking would be valuable for enhancing thinking and learning

What do we mean by Cooperative Learning Structures?
A Structure is a content-free way (tool) of organising social interaction in the classroom. Content is placed into a Structure to create an Activity which necessitates cooperative learning. Activities are then designed into lessons to meet specific learning outcomes (e.g., activating prior learning, promoting types of thinking, reinforcing key content understanding, developing social skills).

Timed Pair Share
Basic Theme: In pairs, students share with a partner for a predetermined time while the partner listens carefully. Then partners switch roles.
Steps:
1. Teacher announces a topic and states the question/problem each student will have to share on
2. Teacher provides instructions on how to select a partner and allocates time for the task
3. In pairs, Partner A speaks; Partner B listens
4. Partner B acknowledges what was learned (e.g., "One thing I learned as I listened to you was…")
5. Pairs switch roles: Partner B speaks; Partner A listens
6. Partner A acknowledges learning
A useful adaptation is to allow THINK time before the sharing – known as Think-Pair-Share.

Numbered Heads Together
Basic Theme: Students are presented with a question or problem; they "put their heads together", generate and explore possible answers/solutions.
Steps:
1. The teacher has students number off within groups, so that each student has a number: 1, 2, 3, 4.
2. The teacher asks a question or presents a problem and gives 'think time' for students individually.
3. The teacher tells the students to "put their heads together", discuss their possible answers, agree on their best answer, and make sure that all group members know the 'correct' answer.
4. After a defined period of time (or when the students indicate they are ready), the teacher calls a number (1, 2, 3, or 4), and all students with that number can raise their hands to respond.

Circle the Sage
Basic Theme: Each team-mate gathers around a different "Sage" to learn the content; they then return to compare notes.
Steps:
1. Teacher identifies "Sages".
2. "Sages" spread out around the room and stand.
3. Each member of each team gathers around a different Sage, to become a "Disciple".
4. Sages teach; Disciples take notes.
5. Disciples return to their teams and compare notes with team-mates.

Thinking Tools and Techniques
- Mind mapping (a learning and thinking tool)
- Thinking Hats (a thought-management tool)
- Plus-Minus-Interesting (a simple practical tool for identifying the positive, negative and unsure elements in a situation)
- Force-Field Analysis (a critical and creative thinking tool for managing change)
- Forced Associations (a creative thinking technique for breaking out of traditional patterns of perception and thinking)
- PO (a creative thinking technique)
- SCAMPER (a creative thinking tool)
- Morphological Matrix (a creative thinking tool for creating multiple combinations)
Note: thinking tools and techniques don't do the thinking; they only provide a means of organizing your thinking.

Mind Map of Edward De Bono's Thinking Hats
- White Hat: facts only, no opinions
- Blue Hat: metacognition, overview
- Red Hat: feelings, own view
- Green Hat: creative, new ideas
- Black Hat: negative, logical
- Yellow Hat: positive, optimistic
Mind Maps can promote all types of thinking as well as aid memory and learning.

Plus-Minus-Interesting

Forced Associations (Random Triggers)
Forced Associations is a technique for linking another thinking pattern into the one we are presently using. We do this by selecting a random concrete noun from a different field and combining it with the problem under consideration. For example, we might be looking at ways to make lifts quicker. Forcing an association with the random word 'mirror' could lead to the idea of installing mirrors by the lifts. As we know, this is a popular solution for 'slow lifts': the lift doesn't go faster, but the people waiting don't notice, because they are looking in the mirror.
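The mechanics of the technique, pairing the problem with a randomly drawn trigger noun, can be sketched in a few lines. This is a minimal illustration, not part of the original material; the trigger-word list and function name are hypothetical.

```python
import random

# Hypothetical pool of concrete nouns from unrelated fields (illustrative only)
TRIGGERS = ["mirror", "umbrella", "clock", "garden", "bridge"]

def forced_association(problem, rng=random):
    """Pair the problem with a random trigger noun to provoke a new idea path."""
    word = rng.choice(TRIGGERS)
    return f"How could '{word}' relate to: {problem}?"

print(forced_association("make lifts seem quicker"))
```

The point of the randomness is precisely that the trigger has no logical link to the problem; the thinker supplies the connection.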

PO (Provocative Operation)
PO involves making deliberately provocative statements which seek to force thinking out of established patterns. Examples: "Everybody should go to prison"; "Let's abolish schools".
Having made a provocative statement, it is then necessary to suspend judgement and use the statement to generate ideas. For example, you can generate ideas by examining:
- The consequences of the statement
- What the benefits could be
- What would need to change in order to make it a sensible statement
- What would happen if a sequence of events changed

SCAMPER
SCAMPER is a checklist that helps you think of ways to improve existing products or create new ones:
- Substitute
- Combine
- Adapt
- Magnify, Minify, Modify
- Put to other use
- Eliminate
- Reverse

Morphological Matrix
This tool encourages new possibilities by combining options: the attributes of a problem are listed in a matrix, each with its possible options, and new combinations are generated by crossing options against options.
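The combinatorial idea behind the matrix can be sketched in code: enumerate every combination of one option per attribute. This is an illustrative sketch, not from the original slides; the attributes and options below are hypothetical values loosely inspired by the food-package design task.

```python
from itertools import product

# Hypothetical attribute/option lists for a food-package design (illustrative only)
matrix = {
    "material": ["glass", "PET plastic", "cardboard", "aluminium"],
    "shape": ["bottle", "pouch", "box"],
    "closure": ["screw cap", "zip-lock", "tear strip"],
}

def combinations(matrix):
    """Yield every combination of one option per attribute."""
    names = list(matrix)
    for values in product(*matrix.values()):
        yield dict(zip(names, values))

combos = list(combinations(matrix))
print(len(combos))  # 4 * 3 * 3 = 36 combinations
print(combos[0])    # {'material': 'glass', 'shape': 'bottle', 'closure': 'screw cap'}
```

Even a small matrix yields dozens of candidate designs, many of which (e.g., a glass pouch) would never arise from habitual thinking; evaluating the odd combinations is where the creative value lies.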

Force-Field Analysis
Forces driving change are set against forces resisting change, each rated for potency on a scale of 1–7, with the current situation held at the equilibrium between them and the desired situation lying beyond it. The objective is to move the balance towards the desired situation, which can be achieved by:
- identifying the forces, their causes and their strength
- planning and acting to assist the driving forces
- planning and acting to reduce the resisting forces
- using some of the resisting forces against each other, if possible
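The arithmetic of a force-field diagram is simply a comparison of summed potency ratings. The sketch below is illustrative only; the force names and ratings are hypothetical.

```python
# Hypothetical forces with potency ratings on the 1-7 scale (illustrative only)
driving = {"management support": 6, "staff enthusiasm": 4, "new funding": 3}
resisting = {"time pressure": 5, "exam-focused culture": 6, "lack of training": 4}

def net_force(driving, resisting):
    """Positive result: driving forces outweigh resisting ones; change is favoured."""
    return sum(driving.values()) - sum(resisting.values())

print(net_force(driving, resisting))  # 13 - 15 = -2: change is currently blocked
```

A negative balance points the planner at the second and third bullets above: strengthen a driving force or weaken a resisting one until the balance tips.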

What are 'Real World' Learning Tasks?
"Central to a pedagogy that seeks to promote the development of good thinking is the systematic use of well-constructed and managed learning tasks that reflect real-world activity and involve the use of specific types of thinking." (Wasserman, 1993, p.20)
Such tasks are often referred to as Performance Tasks, as they concentrate on the thoughtful application of knowledge in real-life contexts.

Rationale for using Real World Tasks
"Methods which are permanently successful in formal education … go back to the types of situation which causes reflection out of school in ordinary life. They give pupils something to do, not something to learn; and the doing is of such a nature as to demand thinking, or the intentional noting of connections; learning naturally results." (Dewey, 1916)
"Real-world learning has a backbone of problem-solving, production of work-authentic products, and investigation and research, in which all knowledge, processes, and techniques connect and are used. Most people are motivated to learn when engaged in a problem or project they care about." (Glasgow, 1997)

Types of Real World Tasks
- Real work projects and tasks
- Simulations
- Problem solving through case studies
- Problem-based learning (PBL) activities
- Presentations
- Any activity that essentially models what would be done by people in the world of work

Example 1: Design and conduct a small experiment to test the Halo Effect
In groups of 3-4, design and conduct a small experiment to test the Halo Effect in person perception. You may choose the particular focus for the experiment, but it must:
- Clearly test the Halo Effect in person perception
- Be viable in terms of accessing relevant data
- Meet ethical standards for conducting experiments with persons
- Follow an established method and procedure
- Produce results that support or refute the hypothesis
Once completed, the experiment should be written up in an appropriate format of approximately 2,000 words. It should document the important stages of the experiment and compare and contrast the data found with existing findings on the Halo Effect.
This is the learning task/performance test item used to assess the learning outcomes identified in the previous slide.

Example 2: Design a Food Package
Select a food product and design the packaging that you think will give it the best marketability. You must be able to identify the product attributes, the protection and enhancement needed to satisfy the functional and marketing requirements, and suitable packaging material(s) and package type. The work produced should reflect the quality of your thinking in the following areas:
- Identify the criteria for evaluating the marketability of a product
- Analyze the components of a product that constitute an effective design
- Generate new ways of viewing a product design beyond existing standard forms
- Predict potential clients' response to the product, given the information you have
- Monitor the development of the group's progress and revise strategy where necessary
This is an example of a task scenario from an engineering module. The key types of thinking have been cued for the students. It is important to note that cuing the types of thinking does not do the work for them. This is no different from knowing what you have to do in a driving test: it does not make the test any easier, as you must still have the competence to meet the driving standards. Once students become familiar with the types of thinking and develop competence in using them, it becomes less necessary to provide these cues.

Steps in designing performance tasks
Step 1: Identify clearly the knowledge, skills and processes to be incorporated into the task
For this step it is important to:
- Choose specific topic areas in your curriculum that encompass key underpinning knowledge (e.g., central concepts, principles, procedures) and skills essential for understanding and performance in real-world applications
- Identify the types of thinking that are important for promoting student understanding and subsequent competence in these topic areas (e.g., generating possibilities, analysis, comparison and contrast, inference and interpretation, evaluation)
- Identify other process skills (e.g., communication, team-working, managing learning) that are important for competent performance in the identified areas
In a Thinking Curriculum, much of the focus is placed on students being actively and collaboratively involved in real-world problem-solving. The use of real-life performance tasks, projects, case studies and other simulated real-world activity is essential. However, it is important that these tasks are carefully designed to effectively promote the learning outcomes of the curriculum. In this and the following slide, the key steps and notes of guidance are provided to help you design these tasks. Remember, from the first set of slides ("Underpinning model of learning…"), competent performance involves the dynamic use of knowledge, thinking, doing and desire. You will probably need to do a fair bit of thinking and doing in order to produce good learning tasks. However, the effort put in will be worth the benefits gained in terms of supporting instruction and helping students to learn effectively. Also, interesting and challenging tasks usually motivate students much more than traditional classroom learning activities.

Steps in designing performance tasks
Step 2: Produce the learning task
It is important that the task:
- Clearly involves the application of the knowledge, skills and processes identified in Step 1.
- Is sufficiently challenging, but realistically achievable given students' prior competence, access to resources, and the time frames allocated.
- Admits more than one correct answer, or more than one correct way of arriving at the answer.
Clear notes of guidance are provided, which:
- Identify the products of the task and the acceptable formats of presentation (e.g., written report, learning materials, portfolio, oral presentation)
- Specify the parameters of the activity (e.g., time, length, areas to incorporate, individual/collaborative work, how much choice is permitted, support provided)
- Cue the types of thinking and other desired process skills
- Spell out all aspects of the assessment process and criteria.

Key considerations in producing a marking scheme
- Performance areas assessed, reflecting the learning objectives
- Performance criteria for each performance area
- Marks weighting for each performance area, reflecting the table of specifications/assessment blueprint
- Sources of performance evidence to be used (e.g., written/oral questioning, product, observation)
- Format for the marking scheme – checklist, or rating scale/scoring rubric

Marking formats for performance assessments
A marking scheme takes one of two formats: a checklist, or a rubric (analytic or holistic). Decide on the basis of the level of inference involved in making the assessment decision.
Analytic or holistic rubric – what's the difference, and on what basis would you decide?

Decide the format on the basis of whether the item involves high or low inference
- Low-inference items are those where the performances being tested are clearly visible and there is a widely established correct answer (e.g., conducting a fire drill, setting up an experiment). Here a checklist is most appropriate.
- High-inference items involve performances that are less directly visible and/or more open to subjective judgement (e.g., creative writing, managing a team). Here a rating scale/scoring rubric is most appropriate.
A major challenge in test design is to produce tasks that require low-inference scoring systems. Unfortunately, many worthwhile student outcomes reflecting higher-order thinking lend themselves more to high-inference scoring.

Scoring rubrics (rating scales)
A scoring rubric is a prepared scoring system for assessing performance in activities where professional judgement is involved in the assessment decision. There are two main types of rubric:
- Holistic: focuses on an overall assessment of a product, process or performance, without judging the component parts separately
- Analytic: assesses (scores) each individual part of an assessment activity and then totals an overall score
There are benefits and limitations to each – what do you think they are?

Holistic versus analytic rubrics
Holistic rubrics enable a focus on the overall performance and are more economical in terms of assessment time. They are typically used for summative assessment, and where some variation in the reliability of individual assessment components can be accepted provided the overall assessment decision has good validity and reliability. In contrast, analytic rubrics enable a greater focus on the specific elements of the areas of learning involved and make much better use of formative assessment possible. This has considerable benefits, as Gibbs (2008) highlights:
"Research in schools has identified that the way that teachers provide and use feedback, and engage students with feedback, makes more difference to student performance than anything else that they can do in the classroom." (p. 6)
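The scoring difference between the two rubric types can be sketched in a few lines. This is a minimal illustration only; the criteria names, weights and scores below are hypothetical and not taken from the source.

```python
# Analytic rubric: each criterion is scored separately (1-5), then the
# criterion scores are weighted and summed into an overall total.
analytic_scores = {"hypothesis": 4, "method": 3, "interpretation": 5}
weights = {"hypothesis": 0.3, "method": 0.3, "interpretation": 0.4}  # sum to 1.0

analytic_total = sum(analytic_scores[c] * weights[c] for c in analytic_scores)

# Holistic rubric: one overall judgement (1-5) on the whole performance,
# with no separate per-criterion scores to feed back to the student.
holistic_score = 4

print(f"Analytic weighted total: {analytic_total:.1f} / 5")  # 4.1 / 5
print(f"Holistic score: {holistic_score} / 5")
```

The per-criterion breakdown in the analytic case is what supports formative feedback: the student can see that "method" was the weak area, which a single holistic score cannot convey.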

What rubrics can and cannot do…
It is important to remember that the rubric does not make the assessment decision; that is the responsibility of the assessing lecturer. Rubrics provide a guiding frame for focusing attention on the key elements/constructs (performance criteria) of the assessment area, together with summary descriptors of a range of performances.

Developing a checklist
1. Identify the important components – procedures, processes or operations – in an assessment activity. For example, in conducting an experiment, one important operation is likely to be the generation of a viable hypothesis.
2. For each component, write a statement that identifies competent performance for that procedure, process or operation. In the above example, the following may be pertinent: "A clear viable hypothesis is described."
3. Allocate a mark distribution to each component; where appropriate, this should reflect its importance or level of complexity.
If you follow this process carefully, you will produce a valid and user-friendly checklist. Checklists are most useful in situations where the assessment decision seeks a competent/not-competent judgement on performance.
Note: Checklists are most useful for low-inference items, where the performance evidence is clearly agreed and there is little disagreement about what constitutes effective or ineffective performance (e.g., observable steps).
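The three steps above can be sketched as a simple data structure plus a tally. The component statements and mark allocations here are hypothetical (loosely echoing the experiment example), not prescribed by the source.

```python
# Checklist marking scheme: each component carries a mark allocation
# reflecting its importance, and a competent/not-competent judgement
# earns all or none of those marks.
checklist = [
    ("A clear viable hypothesis is described", 3),
    ("The method/procedure is appropriate",    4),
    ("Findings are clearly collated",          3),
]

# Assessor's judgements for one student (True = competent on that component).
judgements = [True, True, False]

awarded = sum(marks for (_, marks), ok in zip(checklist, judgements) if ok)
total = sum(marks for _, marks in checklist)
print(f"{awarded} / {total} marks")  # 7 / 10 marks
```

Note the all-or-nothing scoring per component: this is exactly the low-inference, competent/not-competent judgement the slide describes, with no graded middle ground.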

Assessment checklist for Assignment 1: Design and conduct a small experiment to test the Halo Effect
Performance areas/criteria:
1. The context of the experiment is accurately described
2. A clear viable hypothesis is presented
3. The method/procedure is appropriate
4. There is no infringement on persons
5. Findings are clearly collated and presented
6. Valid inferences and interpretations are drawn from the data, and comparison is made with existing data
7. The write-up of the experiment meets required conventions
These are the main operations and processes to be assessed from the item. Note that the performance criteria for each of the seven areas are not provided here. The allocation of marks for each performance area will reflect the weighting allocated in the Table of Specifications.

Developing a scoring rubric
1. Define the performance areas/learning targets for an assessment (these must relate to the learning outcomes). For example: "Valid inferences and interpretations are drawn from the data and comparison is made with existing data."
2. Identify and describe the key attributes that underpin competence for each performance area (preferably observable and measurable). Using the above example, the attributes (concepts and types of thinking) are: validity; inference and interpretation; comparison and contrast.
3. Write a concise description of performance at a range of levels, from very good to very poor (for example, 5 = very good; 1 = very poor).
The rating scale/scoring rubric is an adaptation of the checklist system in that it produces a range of descriptions of performance, typically five levels from very poor to very good. This enables qualitative levels of decision making in the assessment process. The following four slides show the relationship between a set of learning outcomes, a performance-based learning task/assessment item and these two formats for a marking scheme.
Note: Rating scales/scoring rubrics are most useful for high-inference items, where the performance evidence requires considerable professional judgement in making an assessment decision.

Scoring rubric for Example 1: Valid inferences and interpretations are drawn; comparison with existing data is made
Score 5: All valid inferences are derived from the data. Interpretations are consistently logical given the data obtained. All essential similarities and differences with existing data are identified and their significance fully emphasised.
Score 4: Most valid inferences are derived from the data. Interpretations are mainly logical given the data obtained. Most essential similarities and differences with existing data are identified and their main significance emphasised.
Score 3: Some valid inferences are derived from the data. Some logical interpretations are made from the data obtained. Some essential similarities and differences with existing data are identified and their significance partly established.
Score 2: Few valid inferences are derived. Interpretations of the findings are limited. Comparison and contrast with existing data is partial and its significance not established.
Score 1: Failure to make valid inferences and interpretations.
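A rubric like the one above is naturally a level-to-descriptor mapping: the assessor records a level as a professional judgement, and the descriptor documents what that level means. This is an illustrative sketch only; the descriptor wording is abridged from the rubric, and the recorded level is a made-up example.

```python
# Scoring rubric stored as a level -> descriptor mapping (abridged wording).
rubric = {
    5: "All valid inferences derived; interpretations consistently logical; "
       "all essential comparisons made and significance fully emphasised.",
    4: "Most valid inferences derived; interpretations mainly logical; "
       "most essential comparisons made and main significance emphasised.",
    3: "Some valid inferences; some logical interpretations; "
       "some comparisons made, significance partly established.",
    2: "Few valid inferences; limited interpretation; partial comparison, "
       "significance not established.",
    1: "Failure to make valid inferences and interpretations.",
}

level = 4  # assessor's professional judgement for this performance area
print(f"Score {level}: {rubric[level]}")
```

Storing the descriptors alongside the recorded level keeps the evidence for the judgement attached to the mark, which supports the feedback role of analytic rubrics discussed earlier.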