Designing a Work Integrated Assessment
Collaborate Project: bringing together staff, students and employers to create employability focused assessments enhanced by technology

Presentation transcript:

Module:    Co-ordinator:    Year:

The concept of a work-integrated assessment is an assessment where the tasks and conditions are more closely aligned to what you would experience within employment.

‘Real World’ Problem / Data
Set an overall real-world problem, supported by real-world data.
Purely academic learning might require a theoretical problem in order to test a theoretical understanding. In employment, though, problems tend to be very real, and the data you need to work with rarely comes in coherent, standardised forms. It is usually in 'messier' formats that need to be interpreted to be of use. Using a real-world problem and real-world data helps to develop skills in analysis, interpretation and evaluation.

Multiple Assessment Points
Move to a more distributed pattern of assessment; consider introducing ‘surprise’ points.
Assessments are often delivered as one summative assessment, e.g. an exam or essay, at the end of a period of formal learning. In employment, however, ‘assessment’ or evaluation points tend to occur frequently. In addition, their timing is often outside individual control, so it can be necessary to juggle competing tasks at short notice. Using multiple assessment points helps to develop reflective thinking, whilst ‘surprise’ points support task prioritisation.

Varied Audiences
Aim to set explicit audiences for each assessment point.
In higher education the audience for an assessment is implicitly the academic who sets it, who will naturally already be aligned in some way with the course and/or module. This contrasts with employment, where the audience can be peers, but is more often the client or another external third party, with different values, priorities and expectations. Having to think for a different audience provokes greater reflective thinking and requires new types of synthesis.

Light Structure
Lightly structure the overall assessment; reward student approaches.
Most thinking on assessment suggests that there should be explicit guidance to students concerning how and where marks are attained. In employment, however, part of the challenge for the individual and/or team is structuring the work that needs to be completed: tasks need to be identified, processes decided, and priorities allocated. A light-structure approach encourages students to plan tasks and goals in order to solve a bigger problem, strengthening their project management and prioritisation skills.

Peer / Self Review
Include peer and/or self review explicitly in the assessment process.
Typically, the review of assessments (i.e. feedback) in formal education is provided only by teaching staff. In employment, review comes in multiple forms, e.g. informal peer feedback from colleagues, formal and informal reviews from clients, and self-review of personal performance. Including peer and/or self review explicitly within an assessment helps students to develop critical thinking skills, and encourages articulation and evidencing.

Collaborative Working
Create teams of students from the outset; encourage collaboration.
Many forms of assessment require working alone, yet employment invariably requires some form of collaboration and teamwork, often with unknown and perhaps even challenging individuals. Encouraging students to work collaboratively and in teams improves their ability to negotiate and discuss, and develops their understanding of team roles and role flexibility.

For more information contact Richard Osborne. Project blog available at

Overview
This example shows a blank sheet, designed to be used by collaborative teams of staff, students and employers who have already expressed a desire to create a work-integrated assessment. Six ‘dimensions’ are shown which are the hallmarks of a work-integrated assessment. The sheet is a thinking tool to help the design team reflect on how a current assessment might be marked on the six dimensions of the radar chart, and what changes might be made to move along these six dimensions. The model is designed to be used in three stages: (1) an analysis stage, to understand how the assessments for a module currently map to the dimensions, followed immediately by (2) a design stage, where changes to the assessments are proposed that will create movement along the dimensions, followed much later by (3) an evaluation stage, after the assessments have run, to assess the impact of the designed changes.
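To make the radar-chart idea concrete, the sketch below plots a module's current (analysis stage) and proposed (design stage) ratings on the six dimensions as two overlaid polygons. It is a minimal illustration only, assuming Python with matplotlib: the dimension labels are taken from the slide headings above, while the numeric ratings are invented placeholders (the model itself deliberately leaves the axis scales unquantified).

```python
import numpy as np
import matplotlib.pyplot as plt

# The six dimensions, labelled with the slide headings above.
DIMENSIONS = ["'Real World' Problem / Data", "Multiple Assessment Points",
              "Varied Audiences", "Light Structure",
              "Peer / Self Review", "Collaborative Working"]

def radar(ax, ratings, label):
    """Draw one closed polygon of ratings (values here are purely illustrative)."""
    angles = np.linspace(0, 2 * np.pi, len(ratings), endpoint=False)
    # Close the polygon by repeating the first point.
    angles = np.concatenate([angles, angles[:1]])
    values = np.concatenate([ratings, ratings[:1]])
    ax.plot(angles, values, label=label)
    ax.fill(angles, values, alpha=0.15)

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
radar(ax, np.array([3, 2, 1, 4, 2, 5]), "Current (analysis stage)")   # placeholder ratings
radar(ax, np.array([6, 5, 4, 6, 5, 7]), "Proposed (design stage)")    # placeholder ratings
ax.set_xticks(np.linspace(0, 2 * np.pi, len(DIMENSIONS), endpoint=False))
ax.set_xticklabels(DIMENSIONS, fontsize=7)
ax.legend(loc="lower right")
plt.show()
```

The evaluation stage described later would add a third polygon to the same axes in the same way.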

Analysis Stage
The analysis stage is an opportunity to think about how the assessments for the module under discussion currently map to the dimensions model. It may well be the case that some aspects of the current assessments are already well developed in terms of these dimensions. The purpose of the tool is to help the collaborative team visualise this, and to highlight areas which are perhaps not as well developed as others. No explicit attempt has been made to quantify the scales of the axes. Teams are encouraged to use the boundaries implied by the top and bottom of the scales, together with their interpretation of the best and worst aspects of their current assessments, as overall guides for rating within the model.

Design Stage
The design stage is where changes to the assessments are proposed that will create movement along the axes. The team discusses adding (or potentially removing) components that will help to create movement along the axes as appropriate. Technology ‘Top Trumps’ have been designed to be used in conjunction with this model, which suggest digital technologies that could support the proposed changes. Examples might be a wiki to support the Collaboration dimension, a project management tool to support the Structure dimension, or perhaps a blog to support the Time dimension.
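As a rough illustration of how this step could be supported in software, the hypothetical helper below ranks dimensions by the gap between proposed and current ratings and looks up candidate technologies for each. It is a sketch, not part of the project's tooling: the only technology-to-dimension pairings taken from the text are wiki/Collaboration, project management tool/Structure and blog/Time, and the short dimension names and ratings are invented for the example.

```python
# Hypothetical design-stage helper: suggest technologies for the dimensions
# that need the most movement. Only the three pairings below come from the
# text; the short dimension names are assumptions.
TECH_SUGGESTIONS = {
    "Collaboration": ["wiki"],
    "Structure": ["project management tool"],
    "Time": ["blog"],
}

def suggest(current: dict, proposed: dict, top_n: int = 3) -> list:
    """Rank dimensions by the gap between proposed and current ratings,
    attaching any known technology suggestions."""
    gaps = sorted(current, key=lambda d: proposed[d] - current[d], reverse=True)
    return [(d, proposed[d] - current[d], TECH_SUGGESTIONS.get(d, []))
            for d in gaps[:top_n]]

# Placeholder ratings for one module (invented values).
current = {"Collaboration": 2, "Structure": 4, "Time": 1}
proposed = {"Collaboration": 6, "Structure": 5, "Time": 5}
print(suggest(current, proposed))
# [('Collaboration', 4, ['wiki']), ('Time', 4, ['blog']),
#  ('Structure', 1, ['project management tool'])]
```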

Evaluation Stage
The final stage in using the dimensions model is an evaluation stage after the assessments have run, which evaluates the overall impact of the designed changes. A separate evaluation tool has been designed for use with the model, which creates a third polygon to illustrate visually how those involved with the assessment think they have developed in terms of the six dimensions. The dimensions themselves are not directly referenced; instead, 48 skill cards are rated by participants on a seven-point Likert-type scale. The cards are pre-coded with one of the six dimensions, and hence a score for each dimension can be extracted from the data by the evaluators and applied to the model.
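A minimal sketch of how the card ratings might be turned into dimension scores for that third polygon, assuming Python: the card wordings and their codings are invented placeholders, and averaging is one plausible way to "extract" a score per dimension, while the structure (48 cards, a seven-point scale, each card pre-coded with one of six dimensions) follows the description above.

```python
from collections import defaultdict
from statistics import mean

# Each of the 48 cards is pre-coded with one of the six dimensions.
# Card texts and codings here are invented placeholders.
CARD_DIMENSION = {
    "interpreting messy data": "'Real World' Problem / Data",
    "juggling competing deadlines": "Multiple Assessment Points",
    "presenting to a client": "Varied Audiences",
    # ... remaining cards, 48 in total, each mapped to one dimension
}

def dimension_scores(ratings: dict) -> dict:
    """Average each participant's 1-7 Likert ratings of the cards under
    each dimension, giving one score per dimension for the polygon."""
    by_dimension = defaultdict(list)
    for card, rating in ratings.items():
        assert 1 <= rating <= 7, "ratings use a seven-point Likert scale"
        by_dimension[CARD_DIMENSION[card]].append(rating)
    return {dim: mean(vals) for dim, vals in by_dimension.items()}

# One participant's (invented) ratings:
print(dimension_scores({"interpreting messy data": 6,
                        "juggling competing deadlines": 4,
                        "presenting to a client": 5}))
```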