Sarah Stein, Jo Kennedy, Trudy Harris, Stuart Terry, Lynley Deaker, Dorothy Spiller Presentation at the Higher Education Research and Development Society of Australasia (HERDSA) conference, Hobart, 2012.

- Title: “Unlocking the Impact of Tertiary Teachers’ Perceptions of Student Evaluations of Teaching”
- Ako Aotearoa (National Centre for Tertiary Teaching Excellence) National Project Fund grant ($150k)

From left (top): Stuart, Sarah, Lynley; (front): Jo, Dorothy, Trudy
Otago Polytechnic: Stuart Terry
University of Waikato: Dorothy Spiller, Trudy Harris
University of Otago: Sarah Stein, Jo Kennedy, Lynley Deaker

- academics’ hostility towards student evaluations
- academics are resigned to the notion of student evaluations (Beran & Rokosh, 2009)
- ≠ improvements in teaching (Kember, Leung & Kwan, 2002) or serious engagement for development (Beran & Rokosh, 2009; Burden, 2008)
- two (competing?) purposes for student evaluations:
  1. audit (monitoring, gauging teaching effectiveness)
  2. development of teaching and courses
- claims that the two purposes are complementary (e.g., Ramsden, 1992) BUT perceptions are very important

Interrelated factors (contextual, philosophical, practical and personal) influence academics’ perceptions of evaluations and their use of them:
- validity and reliability concerns are still cited, despite the evidence (e.g., Benton & Cashin, 2012; Marsh, 1987; Theall & Franklin, 2001)
- institutional expectations and community norms (e.g., Nasser & Fresko, 2002)
- limitations of student judgement (Aleamoni, 1981)
- quality of the institution’s evaluation/appraisal instruments (e.g., Ballantyne, Borthwick & Packer, 2000; Penny & Coe, 2004)
- institutional ownership and use of evaluations (e.g., Edström, 2008; McKeachie, 1997; Nasser & Fresko, 2002)
- individual teachers’ teaching beliefs (Hendry, Lyon & Henderson-Smart, 2007) and emotions (e.g., Moore & Kuol, 2005)

Perceptions do matter:
“it doesn’t matter much what the institution’s intended purpose is [or what research evidence suggests]. What is important is what the individual teachers perceive to be the purpose” (Edström, 2008, p. 100, emphasis in original)

- How do current formal student evaluation/appraisal processes and practices influence teachers’ thinking and behaviours in relation to student learning at all stages of the teaching and learning cycle?
- What are the perceptions that tertiary teachers hold about student evaluation/appraisal?
- What factors (causes, influences) affect these views?
- How do tertiary teachers engage with student evaluations/appraisals?

- overarching interpretivist research approach (Erikson, 1998)
- combination of quantitative and qualitative data:
  - online questionnaire
  - semi-structured interviews
- key ideas and issues identified through the literature review contributed to the development of the data-gathering tools and to the analysis of the resulting data

- Likert-scale and open-response questions
- 4 parts:
  - Section A - explored current practices (Q1-8)
  - Section B - explored perceptions of student evaluation data and influence on practice (Q9-22)
  - Section C - demographic information (Q23-32)
  - Section D - interview availability (Q33)
- 2,426 teaching staff invited
- 1,065 responses received (44%)
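As a quick arithmetic check, the 44% response rate follows directly from the two figures on the slide:

```python
# Figures from the slide: invitations sent and responses received.
invited = 2426
responses = 1065

rate = responses / invited
print(f"{rate:.1%}")  # → 43.9%, reported as 44% on the slide
```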

- 20 from each institution, purposively selected (Patton, 1990) from volunteers (Q33)
- range of academic disciplines, career stages and levels of seniority
- core interview questions based on the key themes identified in the questionnaire responses:
  - teaching and learning beliefs
  - students’ capacity to make judgements
  - personal, emotional factors
  - other factors, e.g., timing, promotion, engagement with evaluation

- questionnaire comments & interviews: thematic analysis, involving searching for themes using a constant comparative technique (Dye, Schatz, Rosenberg & Coleman, 2000; Silverman, 2001)
- Likert-scale questionnaire questions: ANOVA, Kruskal–Wallis test
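The Kruskal–Wallis test is a rank-based comparison of two or more groups, well suited to ordinal Likert data. As an illustration only (this is not the project’s analysis code, and the sample ratings below are hypothetical), the H statistic can be computed as:

```python
def kruskal_h(groups):
    """Kruskal-Wallis H statistic (with tie correction) for ordinal data.

    groups: list of lists of scores, e.g. Likert responses per cohort.
    """
    pooled = sorted(x for g in groups for x in g)
    n = len(pooled)
    ranks, tie_sum, i = {}, 0, 0
    while i < n:  # walk runs of equal values in the sorted pool
        j = i
        while j < n and pooled[j] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + 1 + j) / 2  # average rank for the tied run
        t = j - i
        tie_sum += t**3 - t
        i = j
    # H = 12/(N(N+1)) * sum(R_i^2 / n_i) - 3(N+1)
    h = 12 / (n * (n + 1)) * sum(
        sum(ranks[x] for x in g) ** 2 / len(g) for g in groups
    ) - 3 * (n + 1)
    # divide by the tie-correction factor when ties are present
    return h / (1 - tie_sum / (n**3 - n)) if tie_sum else h

# hypothetical Likert ratings (1-5) from two groups of respondents
h = kruskal_h([[4, 5, 4, 3, 5], [2, 3, 2, 4, 3]])
```

Under the null hypothesis of identical distributions, H is compared against a chi-square distribution with k−1 degrees of freedom (here k = 2 groups); in practice a library routine such as `scipy.stats.kruskal` would be used.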

Widespread recognition in the questionnaire that collecting evaluation/appraisal data was worthwhile (Q17: Do you personally think it is worthwhile to gather student evaluation/appraisal data about teaching and courses/papers?)
[charts: combined institutions; by institution]

Over half the respondents found their centralised system effective at gathering useful/meaningful data for them (1 or 2 rating); sixteen percent, on average, found it not effective (4 or 5 rating) (Q19: How effective is your institution’s centralised evaluation/appraisal system in gathering useful/meaningful student data for you?)
[charts: combined institutions; by institution]

Q18 Please explain your answer to Q17

- enhancing role of evaluations:
“[Evaluation] is great way to learn about what is good and what is bad - hard to be totally objective about your methods etc and students are great at honesty in this forum!” (Q18 sub-theme 1b, OP, lecturer teaching position, 0-5 years’ tertiary teaching experience, permanent, Sciences)

- limiting role of evaluations:
“What is not good in our system is the standard format which is unsuited to so many diverse courses.” (Q18 sub-theme 2c, WU, lecturer teaching position, years’ tertiary teaching experience, continuing, Humanities)
“I do not however, approve of the institution's tendency to use them [evaluations] as weapons against staff. A heavy-handed hierarchical approach from academics with little knowledge of the course or the students and even less interest in teaching or its context, is counter-productive.” (Q18 sub-theme 2a, OU, lecturer teaching position, years’ tertiary teaching experience, permanent, Humanities)

- mistrust about students’ reliability:
“You should treat them [student evaluations] with a pinch of salt. Students don’t have a long term perspective. Personality is a big factor.” (OU, interview)
“I used to believe that [students can make judgements about teaching], but now I no longer believe that. I think in terms of how…students are believing they are buying a qualification..it’s more like purchasing their degree.” (WU, interview)
“Students have bullied staff and they use evaluations as an opportunity to dump on staff.” (OP, interview)

- preferences for other forms of student feedback:
“The rating questions are rather useless but perhaps useful for a promotion committee to make broad judgements. That is their sole value, nothing else. The reason for that is that they do not specifically tell you what is wrong or what is right. The comments do that best. Also, as mentioned earlier, the statistical rigour in many of these appraisals would make a real statistician seriously question their meaning.” (Q18 sub-theme 2a, WU, lecturer teaching position, years’ tertiary teaching experience, continuing, Sciences)

- suspicion of institutional use of student evaluations:
“It is valuable, but only if taken in context! One negative comment and 100 positive ones is a very good result. However, management have a tendency to focus on that one comment. Often there are reasons other than the quality of the teaching for negative evaluations.” (Q18 sub-theme 2a, OP, senior teaching position, 6-10 years’ tertiary teaching experience, permanent, Health Sciences)

- lack of faith in the process and instrument:
“The institution tries to do too much with this limited data.” (OU, interview)
“The student evaluations instrument is unreliable, contrived and manipulated.” (OU, interview)

- suspicion of manipulation by colleagues:
“People are more careful to choose questions that are more likely to yield a positive response.” (OU, interview)

- Teachers are generally positively disposed towards student evaluations, although not well informed about their purposes and effective use.
- Perceptions of student evaluations seem to be connected to:
  - expressed teaching beliefs and emotions;
  - concerns with the quality of student evaluation instruments;
  - misgivings about students' competency to judge;
  - disenchantment with a student evaluation system that can be manipulated easily by academics;
  - lack of institutional support for, and recognition of, teaching; and
  - preoccupation with research.
  (Context and personal experience determined the extent of these views.)
- Evaluation tends to be seen as an individual and private activity.
- Many teachers (predominantly university) have grave reservations about institutional reliance on a single evaluation instrument to measure the quality and effectiveness of teaching/courses.

For institutions:
- ensure that there is a clear alignment between institutional vision/policy statements and processes of implementation;
- recognise and acknowledge that student evaluation is first and foremost about development, and therefore the developmental and auditing purposes of student evaluation should be clarified within that frame;
- be aware that expectations about roles and responsibilities in evaluation/appraisal can be ambiguous, so connections among performance, evaluation and reward need to be clearly understood by all (teaching and administration/personnel staff, and students).

For evaluation systems:
- recognise and acknowledge that staff perceptions about evaluation vary, and provide appropriate support and resources to address teacher expectations and needs without compromising institutional intents and purposes;
- include processes and practices (including an ongoing professional development strategy) that target both developmental and auditing purposes, while recognising the complementarity of the purposes;
- recognise that multiple forms of evaluation are more likely to represent a ‘well-rounded’ description of teaching and courses.

Sarah Stein
Higher Education Development Centre (HEDC)
University of Otago
Dunedin, New Zealand

Look for the full and summary reports, soon to be on the Ako Aotearoa website.