Universities Scotland Retention Project
Emerging findings
Jane W. Denholm, Project Consultant
Today’s session
- Background, aims and methodology
- Outline some key findings
- Enlist help with shaping the final report
Background
- Sponsored by SFC
- Overseen and developed by a project steering committee
- Aim: to produce a robust, evidence-based analysis of current practices in defining and monitoring undergraduate student retention, and of systems of data collection and analysis
- Key feature: to offer institutions the opportunity to share and learn from each other’s experiences
- NOT directly about retention: about using data to support retention
- Three phases
Methodology
- Phase 1: scoping work. In-depth, frank, confidential interviewing informed…
- Phase 2: the main feature. Developmental workshops generating information for…
- Phase 3: the final report. Learning outcomes and case studies for dissemination across the sector; due April 2012
Emerging key findings
- Synthesis is ongoing: findings are being nuanced and tested on/by Steering Group members
- Key findings: highlights for the LfA audience:
  - Data quality and definitions
  - Retention practice trends
  - Student engagement
  - Using evaluation data
  - Mainstreaming data analysis
  - Sharing practice
Key findings 1: Data quality and definitions
- Management information and data are of good quality and in practical use supporting retention
- Terms must be defined to clarify data collection requirements: eg retention, progression, withdrawal, non-completion. These are interrelated concepts, and definition is not straightforward
- Official definitions exist for important external reporting purposes (SFC, HESA). These drive institutional processes and practices, which are in large part configured to meet HESA and SFC requirements
- But official definitions are of limited use in improving retention internally because they are:
  - high level: lacking nuanced detail
  - retrospective: too late to act on
  - highly processed: ‘management’ information, with little ownership/recognition in other parts of the institution
- Need to recognise that a range of definitions is required, each fit for purpose, and to drive data requirements accordingly (see the sketch below)
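A minimal sketch of that distinction in Python, with invented field names and deliberately simplified rules (the real HESA/SFC definitions are considerably more detailed): an external reporting definition and an internal working definition kept side by side, so neither drives out the other.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class StudentRecord:
    student_id: str
    enrolled_on: date
    withdrawn_on: Optional[date]  # None if still enrolled
    progressed: bool              # passed into the next year of study

# Illustrative external-style measure: was the student still on the books
# at the annual census date? (Invented rule; the real HESA/SFC returns
# are far more detailed.)
def retained_for_external_return(s: StudentRecord, census: date) -> bool:
    return s.withdrawn_on is None or s.withdrawn_on > census

# Illustrative internal working definition: did the student progress,
# whatever the census timing? More useful for course-level retention work.
def retained_internally(s: StudentRecord) -> bool:
    return s.progressed
```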
Key findings 2: Retention practice trends
- Project focus is on data, but current retention issues were considered
- Key issue: a trend away from simply defining ‘at risk’ students by particular characteristics (pertaining to age, gender, class, previous familial experience of HE etc)…
- …towards also considering their behaviour when studying
- Most (all?) Scottish HEIs now have some sort of system in place to track early signs of ‘disengagement’
Key findings 3: Student engagement
- Student ‘engagement/disengagement’ emerged as a concept important to retention
- Defined for these purposes as ‘contact’ with the institution, traditionally measured by:
  - attendance at classes
  - fulfilment of assignments/performance
- Engagement/disengagement can now also be measured through the electronic life of an institution. Staff can piece the student’s electronic footprint into an overview from, eg:
  - library usage
  - lab usage
  - use of the VLE
  - use of terminals around campus
  - use of shops and refectories
- Big potential here, especially because the data is in real time, so information can be communicated to the right people to ensure that the right interventions are made at the right time (see the sketch below)
- Clear benefits if the issues are overcome and systems are properly integrated
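A minimal sketch of the footprint idea, in Python. The event sources, field names and 14-day threshold are all invented for illustration; a real system would sit on the institution’s own data feeds and its own rules.

```python
from dataclasses import dataclass
from datetime import date
from typing import List, Optional

# Hypothetical event feed: one row per recorded electronic interaction
@dataclass
class EngagementEvent:
    student_id: str
    source: str       # eg "library", "vle", "lab", "refectory"
    occurred_on: date

def days_since_last_contact(events: List[EngagementEvent],
                            student_id: str,
                            today: date) -> Optional[int]:
    dates = [e.occurred_on for e in events if e.student_id == student_id]
    if not dates:
        return None  # no electronic footprint at all
    return (today - max(dates)).days

# Hypothetical rule: flag a student for follow-up if there has been
# no electronic contact of any kind for 14 days or more
def flag_possible_disengagement(events: List[EngagementEvent],
                                student_id: str,
                                today: date,
                                threshold_days: int = 14) -> bool:
    gap = days_since_last_contact(events, student_id, today)
    return gap is None or gap >= threshold_days
```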
Key findings 3: Student engagement (ctd)
- Monitoring is not without issues to be resolved:
  - ethical, regarding the role of technology: are we spying or supporting, and how far do we go?
  - ‘ideological’: how appropriate is it to make interventions into undergraduates’ lives?
  - practical: data management is massive. Counting across >200,000 students daily, with, say, 6 attendance items per student per week, gives up to 40m data items (a rough sizing follows below)
- Need for institutional (national?) policies, and for student input/ownership and understanding
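A rough sizing behind the 40m figure, assuming a teaching year of about 33 weeks (an assumption; the slide does not state the period): 200,000 students × 6 items per week × 33 weeks ≈ 39.6 million, ie roughly 40 million data items per year.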
Key findings 4: Using evaluation data
- Evaluations can be:
  - a source of useful data
  - improved by utilising existing or bespoke data
- Evaluations tend to use qualitative methodologies, which:
  - can help determine the extent of impact
  - can be immediate, eg via focus groups
  - can help explain the effects that show up in high-level data and league tables, and can corroborate and/or feed back into retention practice
- Quantitative data is often at a higher level and more useful further down the line
- HEIs are now interested in blending quantitative and qualitative methods to strengthen analyses (see the illustration below). This requires work in multidisciplinary teams: eg planners, statisticians, researchers with evaluation skills and project managers
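One possible shape for that blending, sketched in Python with pandas and entirely invented data: quantitative withdrawal rates per course joined to themes coded from qualitative evaluation work, so each figure sits alongside a candidate explanation.

```python
import pandas as pd

# Invented quantitative data: first-year withdrawal rates by course
rates = pd.DataFrame({
    "course": ["Maths", "History", "Nursing"],
    "withdrawal_rate": [0.12, 0.07, 0.15],
})

# Invented qualitative data: themes coded from focus groups/evaluations
themes = pd.DataFrame({
    "course": ["Maths", "Nursing"],
    "coded_theme": ["assessment bunching in week 6",
                    "placement travel costs"],
})

# Join the two so high rates sit alongside candidate explanations
blended = rates.merge(themes, on="course", how="left")
print(blended.sort_values("withdrawal_rate", ascending=False))
```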
Key findings 5: Mainstreaming data analysis
- Good quality retention data is available to institutions
- There is potential to do more with it, and there are potential new sources, eg the electronic footprint. But how should it be managed to optimum effect?
- Ideally:
  - acknowledge that different data is required for different purposes
  - meet HESA and SFC requirements, but ensure they don’t overshadow processes and practices that could be useful in promoting student retention within the institution in a practical sense
  - commission data at a range of levels and at appropriate times
  - get the right level of information to the right people, at the right time to act
- Actions include:
  - dialogue between different types of staff (eg planners and retention specialists, managers and academic staff) working together to examine data and data requirements: what it is telling them, and how it can be fed into the right place at the right time
  - sharing practice
Key findings 6: Data and practice sharing
- Every institution will have its own requirements, and context is crucial
- Wide, if not universal, support for sharing good practice in:
  - developing useful working definitions
  - using retention data effectively
  - better understanding processes and systems for data monitoring, collection and reporting
  - identifying how this can be done in a systematic and efficient way
  - dialogue among key staff with different job roles
- The report will form a basis for this, including case study examples
Questions for discussion groups
There is still time to influence the thinking going into the report and to help the Steering Committee with potential next steps:
- What issues would you like to see explained/emphasised/resolved?
- Do you agree that the issue is not the quality of the data but the challenge of dealing with the quantity?
- Student characteristics v behaviour/engagement: does this ring true?
- Would protocols regarding ethical use of electronic footprint data be useful and, if so, is this best done at institutional or national level?
- Are there other sector-wide issues that could be tackled centrally?
- What can the sector do next to take all of this work forward?
- What role can Universities Scotland and SFC play in taking this work forward?
Any questions?