October 2008 Using qualitative data to prove and improve quality in Australian higher education Geoff Scott, Leonid Grebennikov and Mahsood Shah Office of Planning and Quality University of Western Sydney

2 Outline
Introduction
–Limited use of qualitative data in institutional performance assessment
–Advantages and benefits of using qualitative data
–UWS experience in the systematic analysis of the qualitative data from student feedback surveys
Method
–CEQuery qualitative analysis tool
–Comparative analysis of qualitative data generated by three key UWS student surveys
Results and discussion
Implications

3 Qualitative data in institutional performance assessment
–Receive limited attention
–Cover aspects of the student experience which are untapped in existing evaluations
–Identify reasons for statistical results which may be different from what researchers assume
–Define, in students' own words, what they find important
–Should complement quantitative data

4 UWS experience in the systematic analysis of the qualitative data from student feedback surveys
Since 2006, all UWS student surveys covering
–overall experience at the University level
–a particular course or program
–specific subjects
have invited respondents to answer two questions in their own words:
–What were the best aspects of their course/unit?
–What aspects of their course/unit are most in need of improvement?

5 UWS experience in the systematic analysis of the qualitative data from student feedback surveys
–Written comments are automatically classified by the CEQuery qualitative analysis tool into five domains and 26 subdomains using a custom-tailored dictionary
–CEQuery results are integrated into Annual Course and Unit Reports to better identify key hot spots for improvement and actual solutions from the student perspective
–Actual comments can be viewed once sorted into specific CEQuery domains and subdomains
–High-importance areas are used in course accreditation and review, and to validate rating items on surveys
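
To make this concrete, below is a minimal Python sketch of the kind of dictionary-based classification CEQuery performs. The keyword lists, function name and example comment are illustrative assumptions only; they are not CEQuery's actual dictionary or code.

# Minimal sketch of dictionary-based comment classification.
# The keyword lists are illustrative assumptions, not the CEQuery dictionary,
# which is far larger and custom-tailored.
ILLUSTRATIVE_DICTIONARY = {
    ("Assessment", "Expectations"): ["criteria", "guidelines", "expectations"],
    ("Assessment", "Feedback"): ["feedback", "returned", "marker comments"],
    ("Staff", "Teaching skills"): ["explains well", "engaging lectures"],
    ("Support", "Library"): ["library", "borrow"],
}

def classify_comment(comment: str):
    """Return the (domain, subdomain) pairs whose keywords appear in the comment."""
    text = comment.lower()
    return [
        (domain, subdomain)
        for (domain, subdomain), keywords in ILLUSTRATIVE_DICTIONARY.items()
        if any(keyword in text for keyword in keywords)
    ]

# One "Needs Improvement" comment mapped to its subdomain
print(classify_comment("Tutors should give more detailed feedback on assignments"))
# -> [('Assessment', 'Feedback')]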

6 Comparative analysis of qualitative data from three key student surveys
–Survey 1: Covers total university experience; sample – 3,492 current students; 9,410 written comments
–Survey 2: The national CEQ; covers graduate experience of the course just completed; sample – 2,734 respondents; 4,213 written comments
–Survey 3: Evaluates individual subjects each time they are offered; sample – about 200,000 students each year; 94,803 written comments

7 About CEQuery
–Best Aspect (BA) and Needs Improvement (NI) hits are coded and sorted into domains, then subdomains
–5 domains (Assessment, Course Design, Outcomes, Staff and Support) and 26 subdomains
–Hit rate: 80%; allocation accuracy: 90%
–BA + NI = Importance
–BA / NI = Quality
–Custom-tailored dictionary
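
The two summary measures above can be computed directly from BA and NI hit counts. The short Python sketch below shows that arithmetic; the hit counts are invented for illustration and are not UWS survey figures.

# BA + NI = Importance (how much students comment on a subdomain at all)
# BA / NI = Quality (whether those comments are mostly positive)
# Hit counts below are invented for illustration only.
illustrative_hits = {
    ("Assessment", "Feedback"): {"BA": 120, "NI": 310},
    ("Staff", "Quality & attitude"): {"BA": 540, "NI": 180},
}

def importance(ba: int, ni: int) -> int:
    return ba + ni

def quality(ba: int, ni: int) -> float:
    return ba / ni if ni else float("inf")  # guard against zero NI hits

for (domain, sub), counts in illustrative_hits.items():
    ba, ni = counts["BA"], counts["NI"]
    print(f"{domain} / {sub}: importance={importance(ba, ni)}, quality={quality(ba, ni):.2f}")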

8 CEQuery subdomains
Assessment
–Expectations
–Feedback
–Marking
–Relevance
–Standards

9 CEQuery subdomains
Assessment: Expectations
Provision of clear assessment tasks and expectations on how to tackle and present them; clear submission deadlines, guidelines, rules and grading criteria. Provision of examples of work, to give an operational picture of different grades and quality of work in each subject.
Typical NI comments:
–"Expectations for assignments need to be clearer"
–"Lack of clear criteria for marking"
–"More explanations than just expecting us to know or guess"
–"Better description of tasks"

10 CEQuery subdomains
Assessment: Feedback
Promptness with which assignments are returned, use of staged deadlines, quality of the feedback received, including the extent to which markers comment on what was done well, explicitly identify key areas for improvement and say how improvements could have been achieved – with specific attention to the grading criteria distributed at the start of the subject.
Typical NI comments:
–"I'm still trying to get back an assignment over 5 months old"
–"When returning essays tutors should give more detailed feedback so students know exactly how to improve work"
–"We only received one assessment back before the subject finished"

11 CEQuery subdomains
Course Design
–Flexibility
–Learning methods
–Practice-theory links
–Relevance
–Structure
Outcomes
–Further learning
–Intellectual
–Interpersonal
–Personal
–Knowledge/skills
–Work application

12 CEQuery subdomains
Staff
–Accessibility
–Practical experience
–Quality & attitude
–Teaching skills
Support
–Infrastructure
–Learning resources
–Library
–Social affinity
–Student administration
–Student services
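
For reference, the sketch below simply restates the full taxonomy from slides 8 to 12 as a Python dictionary of five domains and 26 subdomains. The structure itself comes from the slides; only the representation as code is added here.

# The five CEQuery domains and 26 subdomains as listed on slides 8-12.
CEQUERY_TAXONOMY = {
    "Assessment": ["Expectations", "Feedback", "Marking", "Relevance", "Standards"],
    "Course Design": ["Flexibility", "Learning methods", "Practice-theory links",
                      "Relevance", "Structure"],
    "Outcomes": ["Further learning", "Intellectual", "Interpersonal", "Personal",
                 "Knowledge/skills", "Work application"],
    "Staff": ["Accessibility", "Practical experience", "Quality & attitude",
              "Teaching skills"],
    "Support": ["Infrastructure", "Learning resources", "Library", "Social affinity",
                "Student administration", "Student services"],
}

assert len(CEQUERY_TAXONOMY) == 5
assert sum(len(subs) for subs in CEQUERY_TAXONOMY.values()) == 26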

13 More information on CEQuery
Scott, G. (2006). Accessing the student voice: Using CEQuery to identify what retains students and promotes engagement in productive learning in Australian higher education. _resources/profiles/access_student_voice.htm

14 The CEQuery study of comments from students in 14 universities: key implications for student retention and engagement
–It is the total experience that counts
–Teaching is not learning
–Learning is a profoundly social experience
–Need for more research on how various forms of IT-enabled learning do and do not add value as part of a broader learning design
–60 learning methods, especially active and practice-oriented ones, depending on field of education (FOE) and level of study
–Traditional lectures and class-based methods must be seen as just one of the options, not the sole one

15 CEQuery Subdomain Hits and Ranks across Three Levels of Student Experience

16 CEQuery Subdomain BA/NI Ratios and Ranks across Three Levels of Student Experience

17 Discussion
Why do very important CEQuery subdomains demonstrate patchy results in terms of quality?
–Variety of factors shaping the student experience
–Extent of multi-campus university operation

18 Six areas of student experience that warrant an improvement focus
–Assessment (standards, marking, expectations management and feedback)
–Student Administration
–Course Structure

19 Six areas of student experience that warrant an improvement focus
High-hit CEQuery subdomains with low BA/NI ratios

20 Six areas of student experience that warrant an improvement focus
–Assessment (standards, marking, expectations management and feedback)
–Student Administration
–Course Structure
–Staff: Quality and Attitude (at the overall university level)
–Student Support: Infrastructure (course and subject level)
–Student Support: Learning Resources (course level)
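
One way to operationalise "high-hit subdomains with low BA/NI ratios" is to rank subdomains by total hits and flag those whose BA/NI ratio falls below a threshold. The Python sketch below illustrates this; the counts and thresholds are invented for illustration and are not the values used at UWS.

# Flag improvement "hot spots": subdomains with many hits (high importance)
# but a low BA/NI ratio (more negative than positive comments).
# Counts and thresholds are invented for illustration only.
illustrative_counts = {
    ("Assessment", "Marking"): (150, 300),             # (BA, NI)
    ("Support", "Student administration"): (90, 400),
    ("Staff", "Quality & attitude"): (600, 200),
    ("Support", "Library"): (40, 25),
}

MIN_TOTAL_HITS = 300      # "high-hit" cut-off (assumed)
MAX_QUALITY_RATIO = 1.0   # BA/NI below this means NI comments dominate

hot_spots = [
    (domain, sub)
    for (domain, sub), (ba, ni) in illustrative_counts.items()
    if ba + ni >= MIN_TOTAL_HITS
    and (ba / ni if ni else float("inf")) < MAX_QUALITY_RATIO
]
print(hot_spots)
# -> [('Assessment', 'Marking'), ('Support', 'Student administration')]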

21 UWS improvement actions based on quantitative and qualitative data (analysed via CEQuery) from student feedback surveys
–Introduction of online enrolment
–Implementation of the online complaints resolution system
–New assessment policy
–Introduction of assessment-focused self-teaching guides for each subject
–A range of new, targeted transition support programs
–A number of new, free study assistance workshops and programs
–Use of a more interactive version of the online learning system
–More opportunities for practice-based learning, e.g. through increased engagement with regional employers and industry bodies
Results: improvement in CEQ Overall Satisfaction (OS) by 10% in three years, and in retention by 4%

22 In summary, the systematic analysis of qualitative data helps:
–Generate a more focused and evidence-based set of good practice guidelines and areas for quality improvement, down to the course and unit level
–Ensure that course and subject design focus on what counts for students as courses and units are implemented and reviewed
–Inform what is and is not tracked in quantitative surveys, and validate the items in these surveys to ensure they cover what is really important to students
–Assist in making staff development programs more relevant by providing BA and NI comments regarding each course and unit to relevant teaching and administrative staff
–Complement the quantitative data that are typically used to inform decision-making for the area