David Millard Karen Fill Hugh Davis Lester Gilbert Gary Wills Learning Societies Lab, University of Southampton, UK.

FREMA: e-Learning Framework Reference Model for Assessment
Presentation transcript:

Introduction
What is PeerPigeon?
– A JISC-funded 6-month project
– Based at Southampton
– 12 person-months
– Builds a set of services to support the process of peer review
Objectives:
– Develop a Peer Assessment SUM for the e-Framework
– Develop service descriptions for services that distribute resources in a peer assessment scenario (hence the "Pigeon" part!)
– Develop a simple web client to demonstrate those services
– Investigate REST services within the e-Framework

Peer Assessment
Peer assessment has many advantages:
– Giving students a sense of ownership of the assessment process, improving motivation
– Encouraging students to take responsibility for their own learning
– Treating assessment as part of learning, so that mistakes are opportunities rather than failures
– Practising the transferable skills needed for life-long learning, especially evaluation skills
– Using external evaluation to provide a model for internal self-assessment of their own learning
– Encouraging deep rather than surface learning
Bostock, S., "Student Peer Assessment", Higher Education Academy article, 16 Mar 2001.

Case Studies
– Simple: the simplest form of peer review, where authors and reviewers are paired together.
– Round Robin: participants are grouped, and each participant reviews the work of each other participant in their group.
– Group Activity: a group of authors work together to produce an artefact, which is then reviewed by a third party.
– Group Review: a group of authors work together to produce an artefact, and then individually review the efforts of their group.
– Committee Review: a group of reviewers act together and look at several different artefacts in order to produce one review. In the research community we are familiar with this as the conference committee stage of peer review.
– Multiplicity: multiple authors create multiple artefacts, which are then independently reviewed by multiple reviewers. For example, students give a presentation and answer questions, and are assessed by their classmates on both.
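The Round Robin case, for example, can be sketched as a simple assignment function (a minimal illustration in Python; the group members shown are hypothetical and not part of the PeerPigeon services):

```python
def round_robin_assignments(group):
    """Each participant reviews the work of every other participant in their group."""
    return [(reviewer, author)
            for reviewer in group
            for author in group
            if reviewer != author]

# A group of three yields 3 * 2 = 6 review tasks, and nobody reviews themselves.
assignments = round_robin_assignments(["alice", "bob", "carol"])
```

The Simple (paired) case is just this function restricted to groups of two.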

Common Review Cycle
All of these cases can be thought of as being built from common review cycles:
1. The cycle can be started in any one of its three states. For example, to begin an activity the student may be asked to Generate an artefact or to Submit an existing artefact, or the tutor may provide it, in which case the first task is to Distribute it.
2. The cycles can be interleaved, and can occur in parallel as well as in sequence.
3. Each stage within the process may involve 1...n participants (authors/tutors/reviewers), producing 1...m resources (artefacts/reviews/marks).
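A minimal sketch of point 1 in Python (assuming, per the slide, that the three states run Generate → Submit → Distribute, and that entering at a later state skips the earlier stages):

```python
# Assumed ordering of the three cycle states; the slide names the states
# but this particular sequence is our reading of it.
CYCLE_STATES = ["Generate", "Submit", "Distribute"]

def run_cycle(entry_state):
    """Return the stages performed in one cycle, starting from any of its three states."""
    if entry_state not in CYCLE_STATES:
        raise ValueError(f"unknown state: {entry_state}")
    return CYCLE_STATES[CYCLE_STATES.index(entry_state):]

# A tutor-provided artefact enters at Distribute and skips Generate and Submit.
run_cycle("Distribute")  # ["Distribute"]
run_cycle("Generate")    # ["Generate", "Submit", "Distribute"]
```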

Multiplicity: n students, m tutors
– Each student delivers a presentation and answers questions (i.e. two artefacts)
– Students and tutors review/mark the presentations
– Only tutors review/mark the answers

Cycle | Creators (Authors/Reviewers) | Resources (Artefacts/Reviews) | Receivers (Reviewers/Authors)
1     | 1                            | 1                             | n+1
2     | n                            | m                             | 2
3     | 1                            | m                             |

Multiplicity: n students, m tutors
– Each student delivers a presentation and answers questions (i.e. two artefacts)
– Students and tutors review/mark the presentations
– Only tutors review/mark the answers
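The total number of review/marking tasks in the multiplicity scenario can be worked out directly (a sketch only; it assumes each presentation is marked by the other n−1 students plus all m tutors, which is one reading of the slide):

```python
def multiplicity_review_count(n, m):
    """Review tasks for n students and m tutors in the multiplicity case."""
    presentation_reviews = n * (n - 1 + m)  # each presentation: the other students plus all tutors
    answer_reviews = n * m                  # each set of answers: tutors only
    return presentation_reviews + answer_reviews

multiplicity_review_count(10, 2)  # 10*(9+2) + 10*2 = 130
```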

Use Case for PeerPigeon What is PeerPigeon?

PeerPigeon in FREMA

Future Plans
Develop a definition of the Assessment Plan, which should contain:
– A Peer Review Pattern: an ordered description of the cycles of peer review and the roles of the participants in each cycle
– A number of actual Participants (possibly arranged into Groups) that populate the roles in the plan
– A Schedule of upcoming dates and times that ties the pattern to a real timescale
Possible standards: IMS LD, QTI
Factor a set of services from the Use Case
– Author a Peer Assessment SUM in FREMA
Build and test
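The proposed Assessment Plan could be sketched as a data structure like the following (illustrative only: the class and field names, and the example participants and date, are assumptions, not the actual PeerPigeon or FREMA schema):

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Cycle:
    """One cycle in the Peer Review Pattern: its order and the roles involved."""
    order: int
    creator_roles: list
    receiver_roles: list

@dataclass
class AssessmentPlan:
    pattern: list       # ordered Cycles describing the peer review process
    participants: dict  # role name -> participants (possibly arranged into groups)
    schedule: dict      # stage name -> deadline, tying the pattern to a real timescale

plan = AssessmentPlan(
    pattern=[Cycle(1, ["author"], ["reviewer"])],
    participants={"author": ["student A"], "reviewer": ["student B"]},
    schedule={"submit": datetime(2007, 6, 1)},
)
```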