Can survey data be used as an indicator of learning outcomes?
Karl Molden, Senior Planning Analyst
Veronika Hulikova, Student Analyst

University of Greenwich Context

38,372 students; 67% full time; 82% undergraduate.

In 2014 the University Student Survey (USS) was issued to:
- On-campus, first-year undergraduates
- Non-final-year undergraduates
- UK partner colleges
- Wholly overseas-based students

In 2015 we introduced surveys at module level: every student on every module is asked to answer a standard set of questions.

The university is developing its systems and policy with regard to analytics and their use.

Motivation

- Surveys are used across the sector and are typically well understood by staff and students, so there is potential to develop something clear and understandable for both groups that could easily be adapted by other institutions.
- Other work has found a small but significant relationship between measures of engagement and academic achievement*; this project would build on that to create "real-time" analytics from module-level surveys.
- Systems for collecting the data already exist and are well supported.
- We are "drowning in a sea of data" – why add more if we can leverage what is already there?

* Sheffield Hallam University, Using UKES results and institutional award marks to explore the relationship between student engagement and academic achievement.

Method

- Match University Student Survey (USS) data with academic outcomes.
- Statistical analysis to determine whether there is a correlation between responses to USS questions and outcomes and, where a correlation is found, how strong it is.
- Process developed using R, open-source statistical software. The code will be made available online to allow other institutions to repeat and/or develop the analysis (a sketch follows below).
- USS 2014 contained the UKES questions as a subset of the whole survey, which means we can try to replicate the results from Sheffield Hallam.

This material and its contents [UKES] is developed by The Higher Education Academy ("HEA"). Some of the questions in this survey are used with permission from The College Student Report, National Survey of Student Engagement, Copyright The Trustees of Indiana University. With the exception of the aforementioned, the copyright in such material and content belongs to the HEA. No reproduction, modification, or adaptation is permitted without the prior written consent of the HEA. © The Higher Education Academy 2012, Amended 2015, All Rights Reserved.
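As an illustration of the approach described above, the sketch below (in R, the tool named in the slides) matches survey responses to award marks and tests each question for a correlation with outcomes. It is a minimal sketch under assumed inputs: the file names, the student_id and mark columns, and the q1..qN question columns are hypothetical stand-ins, not the project's actual data model, and the slides do not specify which statistical test the project used.

```r
# Minimal sketch: correlate per-question survey responses with outcomes.
# Assumed (hypothetical) inputs: uss_responses.csv with student_id and
# Likert-scale question columns q1..qN; award_marks.csv with student_id
# and mark (e.g. an award mark as a percentage).

survey   <- read.csv("uss_responses.csv")
outcomes <- read.csv("award_marks.csv")

# Step 1: match survey data with academic outcomes on student ID.
matched <- merge(survey, outcomes, by = "student_id")

question_cols <- grep("^q[0-9]+$", names(matched), value = TRUE)

# Step 2: test each question for a correlation with the mark.
# Spearman's rank correlation is used here because Likert responses
# are ordinal; this is an assumption, not the project's stated choice.
results <- do.call(rbind, lapply(question_cols, function(q) {
  ct <- cor.test(matched[[q]], matched$mark, method = "spearman")
  data.frame(question = q, rho = unname(ct$estimate), p.value = ct$p.value)
}))

# Step 3: adjust p-values for testing many questions at once, then
# report which questions show a statistically significant correlation.
results$p.adjusted <- p.adjust(results$p.value, method = "BH")
print(results[results$p.adjusted < 0.05, ], row.names = FALSE)
```

The strength of any significant correlation can then be read from rho, answering the "where a correlation is found, how strong is it?" step.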

UKES Results

- Sheffield Hallam identified six UKES questions that had a statistically significant correlation with academic outcomes.
- Our analysis identified four of those six questions as having a similar correlation.
- One further USS question was also found to have a statistically significant correlation.
- Three of these five questions are thematically linked, around working with, and explaining material to, other students.

Results
(Detail slide; chart/table not captured in the transcript.)

Course Evaluation Survey Results

- In addition to the results with the UKES questions, two questions from the course evaluation surveys were found to correlate with academic outcomes.
- As these surveys are conducted every term, this could be useful in developing "real-time" interventions (a sketch follows below).
- Further work is needed to look at the potential of adding further data to the mix to develop more sophisticated models.
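Because the course evaluation surveys run every term, the same kind of matching could drive a simple per-term flag to prompt "real-time" interventions. Again a hedged sketch: the file name and the module_code, term, q_a and q_b columns (standing in for the two correlated questions) are illustrative assumptions, and the threshold is arbitrary.

```r
# Minimal sketch: per-term aggregation of the two course-evaluation
# questions found to correlate with outcomes, flagging modules whose
# mean response falls below a threshold. All column names are assumed.

evals <- read.csv("course_evaluations.csv")

# Mean response per module per term for the two questions of interest.
by_module <- aggregate(cbind(q_a, q_b) ~ module_code + term,
                       data = evals, FUN = mean)

# Flag modules scoring below the midpoint of a 1-5 Likert scale on
# either question as candidates for an in-term intervention.
by_module$flag <- by_module$q_a < 3 | by_module$q_b < 3
print(by_module[by_module$flag, ], row.names = FALSE)
```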

Course evaluations
(Detail slide; chart/table not captured in the transcript.)

Conclusion

- The evidence seems to support a small, positive correlation between certain survey question responses and academic outcomes.
- By identifying activities which may lead to improved responses to those questions, it may therefore be possible to improve these outcomes.
- Similar relationships have been identified in "in-session" course-level surveys, which means we may be able to develop other processes to improve students' academic achievement.

Questions?