1
Making effective use of student feedback on innovative practice to improve educational outcomes
SIT Tertiary Learning and Teaching Conference – Te Ao Hou, 3 October 2014
Peter Coolbear
2
Plan of presentation
Planning interventions to improve teaching and learning and demonstrating they work
Types of evidence
Some examples
Fitting it into the larger picture of student evaluation and self-assessment
Clarity of purpose; obligations to learners; it's a cyclical process
3
Planning innovation
Identify problem / need / opportunity
Identify possible intervention
Implement
4
Planning innovation
Identify problem / need / opportunity
Identify possible intervention
Implement
Does it feel good?
5
Planning innovation
Identify problem / need / opportunity
Identify possible intervention
Identify what success looks like
Implement
Does it feel good?
Measure
6
Planning innovation
Identify problem / need / opportunity
Identify possible intervention
Identify what success looks like
Implement
Does it feel good?
Dialogue with students
Measure
7
Planning innovation
Identify problem / need / opportunity
Identify possible intervention
Identify what success looks like
Implement
Measure
Dialogue with students
Risk – processes not inclusive of all students
8
Collecting evidence to inform practice improvement
Anne Alkema, Heathrose Research
9
Choosing the right measures
Measures must be fit for purpose:
What do you really want to know?
How does your model of success inform that decision?
How precisely do you want to know?
How representative do your data need to be?
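The questions about precision and representativeness can be made concrete with a rough calculation. The sketch below is a minimal illustration, not material from the presentation: the class size, response count and confidence level are assumed figures, used only to show how the margin of error on a feedback percentage might be estimated.

```python
# Illustrative sketch only: rough 95% margin of error for a proportion
# (e.g. "% of respondents agreeing") from a partial class survey.
# Class size, response count and confidence level are hypothetical.
import math

def margin_of_error(responses, class_size, proportion=0.5, z=1.96):
    """Margin of error for a proportion, with finite population correction."""
    se = math.sqrt(proportion * (1 - proportion) / responses)
    fpc = math.sqrt((class_size - responses) / (class_size - 1))
    return z * se * fpc

# Example: 40 responses from a class of 120.
moe = margin_of_error(responses=40, class_size=120)
print(f"Margin of error: +/- {moe * 100:.1f} percentage points")
```

On figures like these the uncertainty is roughly plus or minus 13 percentage points – a quick way of asking whether the data are precise and representative enough for the decision at hand.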
10
What is the nature of the evidence on which self-assessment is based?
Types of evidence:
Formal / informal
Benchmarking
Data / information
Process / output / outcome
Timeliness: historical trends, retrospective, real time
Validity
Analytical capability
11
Examples: the good and the not so good
What do we want to know – and do we really want to know it?
How sophisticated do we want to get in our analysis?
Informing action
12
Enhancing success for Māori Learners in the Health Sciences: critical incident methodologies
Dr Elana Taipapaki Curtis, University of Auckland – Equity Award 2012
13
Active engagement with students about their learning: from post-it notes to flipped classrooms to problem-based learning to knowledge maps
14
Dedicated education units for nursing – MIT and CMDHB
Employer is keen to do more
Students complain when they can't participate
15
Michael Mintrom (University of Auckland) – Creating team spirit and a culture of excellence among course participants (Politics 767: Managing Research Projects): triangulated evidence that this course supported a 1.0 shift in GPA
Caro McCaw (Otago Polytechnic) – The Path of Nexus: Māori student success in a design school context: the students tell the story about their enhanced learning environment
16
Examples – and one not so good …
Intervention: developing an assessment online. Staff were really excited and put in a lot of work.
Staff claimed to see improved quality of work, but there was no evidence of improvement in overall grades.
There was evidence that some students found the online intervention intimidating.
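A finding of "no evidence of improvement in overall grades" implies some comparison across cohorts. The sketch below is a minimal illustration, not the analysis used in this example: the grade lists are invented, and the Welch two-sample t-test is an assumed choice of method.

```python
# Illustrative sketch only: comparing overall grades before and after an
# intervention. The grade lists are hypothetical, not data from the
# intervention described in the presentation.
from scipy import stats

grades_before = [62, 58, 71, 65, 55, 68, 74, 60, 66, 59]  # prior cohort (%)
grades_after  = [64, 61, 70, 67, 58, 69, 73, 63, 65, 62]  # post-intervention cohort (%)

mean_shift = (sum(grades_after) / len(grades_after)
              - sum(grades_before) / len(grades_before))
t_stat, p_value = stats.ttest_ind(grades_after, grades_before, equal_var=False)

print(f"Mean shift: {mean_shift:+.1f} marks, p = {p_value:.2f}")
# A shift this small, on cohorts this small, does not approach significance -
# the kind of result behind "no evidence of improvement in overall grades".
```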
17
It's not all straightforward
What happens if the data is ambiguous? … and it often is.
How do we document for internal and external purposes?
How do we disentangle what we want to know from other evidence collection within the organisation?
Moral obligations
The problem of causality
And finally … we've got a great intervention: how do we sustain it?
18
The disjointed nature of evaluation: pockets of earnest endeavour
Formal programme evaluation
Engagement surveys
Institutional satisfaction surveys
Formal teacher evaluation processes
Programme review
Real-time discussion with students
19
This is mainly compliance stuff
This really helps me with my teaching / practice
This also works to help my students learn how to learn
This is a fundamental part of organisational development
How do you see the student evaluation processes in your organisation in the context of your job as a teaching professional? Why?
20
Sarah Stein, Dorothy Spiller, Stuart Terry, Trudy Harris, Lynley Deaker and Jo Kennedy
University of Otago, University of Waikato, Otago Polytechnic
"Closing the loop"
21
Case Study – Paul
Paul has been teaching in the Sciences for 30 years. He thinks teaching is important and aims to explain core concepts to his students in a clear and accessible manner. He always tries to find fresh ways of making the material engaging for his students. He is interested in student feedback gathered through formal appraisal, but feels that often students cannot make objective judgements because their understanding of the subject is inevitably partial. He does not refer to student appraisals to inform his teaching unless he gets an unusually low score. In that instance, he may go back to the comments to try to find out what has happened. At the same time, he is not keen on 'knee-jerk reactions' in response to student appraisals as there is a curriculum that must be covered. Paul does not talk with colleagues or students about his appraisals. (Stein et al., 2012)
22
Case Study – Mere
Mere is an educator on a degree programme. She sees her role as prompting students to think about, and engage with, social justice issues. She talks about transformative learning and her hope is that the learning experiences she provides will be transformative for her students. She is an avid collector of student feedback and is committed to closing the feedback loop. She believes that students need to be listened to and shown that their views matter. She does have some concerns about the quality and usefulness of the questions on the standard formal evaluation questionnaires and tries to collect feedback throughout the course and discuss it with students. She feels that the institution is too focussed on the quality dimensions of the evaluation and that it does not promote and support the professional development benefits of the instrument strongly enough. (Stein et al., 2012)
23
The purpose of collecting evidence
Informing reflective practice (personal evaluative self-assessment)
Team evaluative self-assessment
Organisation evaluative self-assessment
Assisting student learning
Assisting students in learning about being effective learners
Support for promotion
Support for organisational decision-making
Public relations and marketing
25
Responsibilities to students
The data and information we collect are largely about students, or about things that matter to students.
What information do students get back?
What information about actions do students get back?
Is the information used to prompt any actions by students in support of their own learning?
26
And finally, the problem of causality
Intervention put in place → effect observed
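A toy illustration of the point (all numbers invented, not from the presentation): a gain observed after an intervention can shrink or disappear once a comparable group that did not receive the intervention is taken into account.

```python
# Illustrative sketch only: an effect observed after an intervention is not,
# by itself, evidence that the intervention caused it. Numbers are invented.
from statistics import mean

# Course that received the intervention (grades from two cohorts).
intervention_before = [61, 64, 58, 66, 63]
intervention_after  = [66, 69, 64, 71, 68]

# Comparable course that did not receive it.
comparison_before = [60, 63, 59, 65, 62]
comparison_after  = [65, 68, 63, 70, 67]

naive_effect = mean(intervention_after) - mean(intervention_before)
adjusted_effect = naive_effect - (mean(comparison_after) - mean(comparison_before))

print(f"Naive before/after shift: {naive_effect:+.1f} marks")
print(f"Shift relative to the comparison group: {adjusted_effect:+.1f} marks")
# The comparison group improved by almost as much, so most of the apparent
# gain would likely have happened anyway; the naive before/after figure
# overstates the intervention's effect.
```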
27
In the end it's about closing the loop
28
Thank you
www.akoaotearoa.ac.nz