Learning Analytics (LAs), Student Engagement and Retention

Overview of session
- Lisa – LAs, engagement & retention (15 mins)
- Brett – LA project at Deakin (15 mins)
- Questions? (5 mins)
- Your turn! (25 mins)

The connections… LAs, engagement & retention
LAs, like any data, tell us nothing on their own… What LAs do we collect, to answer what questions? For what purposes?

LAs as part of a strategic response to retention
- ‘Retention, completion and success: what do we know?’ (Deakin, 2018)
- The 20 Course Retention Project (2017)
http://dteach.deakin.edu.au/wp-content/uploads/sites/103/2019/05/Retention-What-do-we-know-2018.pdf?_ga=2.161257213.1978603766.1558911225-2073488728.1538950765

Deakin’s current context
- Retention steady for 5 years (79-81%)
- Cloud Campus rate significantly lower

Engagement & retention
- Early engagement (first 6 weeks) is key to retention
- Timely, targeted supports embedded in the curriculum
- Best approach: a coordinated, whole-of-institution response
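To make the idea concrete, here is a minimal sketch of flagging low early engagement from an LMS activity log. It is an illustration only, not Deakin's actual tooling; the event format, threshold and dates are all invented for the example.

```python
# Rough sketch (not Deakin's tooling): flag students whose LMS activity in the
# first six weeks sits well below the cohort, so unit teams can follow up with
# timely, targeted support. Event format, threshold and dates are invented.
from collections import defaultdict
from datetime import date, timedelta

def flag_low_early_engagement(events, trimester_start, threshold_ratio=0.5):
    """events: iterable of (student_id, event_date) LMS activity records."""
    cutoff = trimester_start + timedelta(weeks=6)
    counts = defaultdict(int)
    for student_id, event_date in events:
        if trimester_start <= event_date < cutoff:
            counts[student_id] += 1
    if not counts:
        return []
    cohort_median = sorted(counts.values())[len(counts) // 2]
    floor = cohort_median * threshold_ratio
    return sorted(s for s, n in counts.items() if n <= floor)

# Made-up example data
events = [
    ("s1", date(2019, 3, 5)), ("s1", date(2019, 3, 12)),
    ("s2", date(2019, 3, 6)), ("s2", date(2019, 3, 20)), ("s2", date(2019, 4, 3)),
    ("s3", date(2019, 3, 7)),
]
print(flag_low_early_engagement(events, trimester_start=date(2019, 3, 4)))  # ['s3']
```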

Measurement & metrics
- Is engagement time on task, reflecting on the task (cognitive), or investment in the task (affective)?
- Is it a product or a process?
- Is it measured at unit, course or university level?
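Even a seemingly simple behavioural metric involves design choices. Purely as an illustration, here is a minimal sketch of a time-on-task proxy computed from LMS event timestamps; the 30-minute inactivity timeout and the event format are assumptions for the example, not a definition used in the project.

```python
# Minimal sketch of one possible "time on task" proxy: sum the gaps between a
# student's successive LMS events, capping long gaps with an inactivity timeout.
# The timeout value and event format are assumptions for illustration.
from datetime import datetime, timedelta

def estimated_time_on_task(timestamps, timeout=timedelta(minutes=30)):
    """timestamps: sorted datetimes of one student's LMS events in one unit."""
    total = timedelta()
    for prev, nxt in zip(timestamps, timestamps[1:]):
        total += min(nxt - prev, timeout)  # treat long gaps as breaks, not study time
    return total

events = [datetime(2019, 3, 5, 10, 0), datetime(2019, 3, 5, 10, 20),
          datetime(2019, 3, 5, 14, 0)]
print(estimated_time_on_task(events))  # 0:50:00
```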

Measuring types of individual student engagement

Behavioral:
- Frequency of asking questions
- Frequency of group / collaborative work
- Frequency of tutoring others
- Frequency of attending events in the community related to course material

Cognitive:
- Proportion of coursework emphasizing higher-order thinking strategies
- Time spent on projects requiring integration and synthesis of ideas
- Amount of coursework requiring practical application of knowledge or skills
- Tendency to be prepared (or lack preparation) for class

Affective:
- Effort to work harder to meet the instructor's expectations
- Investment to better understand someone else's perspective
- Time investment in studying
- Frequency of discussing course material outside of class time

Butler (2011) differentiates typical assessment indicators along each of the typically studied dimensions of student engagement. This gives a good sense of the sophistication required for LAs around engagement.
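As a purely illustrative data-structure sketch (all field names are invented; this is not Butler's instrument or Deakin's system), indicators along these three dimensions could be grouped per student, with behavioural counts drawn from LMS logs and cognitive and affective items typically coming from survey instruments:

```python
# Illustrative only: group engagement indicators per student under the three
# dimensions above. Field names are invented; behavioural counts might come
# from LMS logs, while cognitive and affective items usually need survey data.
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class EngagementProfile:
    student_id: str
    behavioural: Dict[str, float] = field(default_factory=dict)  # e.g. frequency counts
    cognitive: Dict[str, float] = field(default_factory=dict)    # e.g. coursework proportions
    affective: Dict[str, float] = field(default_factory=dict)    # e.g. self-reported effort

profile = EngagementProfile(
    student_id="s1",
    behavioural={"questions_asked": 4, "group_work_sessions": 2},
    cognitive={"higher_order_coursework_share": 0.6},
    affective={"self_reported_effort": 3.5},
)
print(profile)
```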

Academic Analytics
“Academic Analytics reflects the role of data analysis at an institutional level, whereas learning analytics centers on the learning process (which includes analyzing the relationship between learner, content, institution, and educator)” – (Long, Siemens et al. 2011)

Learning Analytics
“Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs” – (Long, Siemens et al. 2011)
More critically, in today’s context, it enables the users of learning analytics (students, academics, administrators) to take action based on the data – as the learning occurs.

Three Levels of Analytics

Student
The student is categorised as an end user. Their focus is on how an LA dashboard can enable them to study and engage better. The dashboard provides insight not only into their own actions but also into the broader cohort and cross-unit interactions.
- Undergraduate / Postgraduate
- Full time / Part time
- Part of an equity/diversity cohort
- Cloud / Located
- Domestic / International

Teaching Academic
The teaching academic is at the coal face, delivering one or more units. Their focus is on how LA can be used to deliver interventions or gain insight into how their students are engaging with their units, and to take action.
- Delivering multiple units
- High volume of students
- Is a sessional or contract staff member
- Managing sessional or contract staff
- Part of an equity/diversity cohort
- Cloud / Located

Administration
The administration layer looks at more than one unit and multiple staff. This role is more interested in an organisational view and trends than in direct interactions with single students or groups of students.
- Educational/instructional designers
- Course directors
- Unit chair leading multiple staff in one or more units
- Exploring trends
- Managing moderation of assessments
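As a rough sketch of how a single engagement dataset could be scoped to these three levels (role names, record fields and unit codes are invented; this is not Deakin's implementation):

```python
# Hypothetical sketch of role-scoped dashboard views; not Deakin's implementation.
def dashboard_view(role, records, user_id=None, unit_codes=None):
    """records: list of dicts with 'student_id', 'unit_code' and 'engagement_score'."""
    if role == "student":
        # Own activity plus an anonymised cohort average for context.
        own = [r for r in records if r["student_id"] == user_id]
        cohort_avg = sum(r["engagement_score"] for r in records) / len(records)
        return {"own": own, "cohort_average": cohort_avg}
    if role == "teaching_academic":
        # Individual students, but only within the units this academic delivers.
        return [r for r in records if r["unit_code"] in set(unit_codes or [])]
    if role == "administration":
        # Organisational view: per-unit averages and trends, not individual students.
        per_unit = {}
        for r in records:
            per_unit.setdefault(r["unit_code"], []).append(r["engagement_score"])
        return {u: sum(scores) / len(scores) for u, scores in per_unit.items()}
    raise ValueError(f"unknown role: {role}")

# Made-up records and unit codes for demonstration
records = [
    {"student_id": "s1", "unit_code": "SIT101", "engagement_score": 0.8},
    {"student_id": "s2", "unit_code": "SIT101", "engagement_score": 0.4},
    {"student_id": "s2", "unit_code": "SIT202", "engagement_score": 0.6},
]
print(dashboard_view("student", records, user_id="s1"))
print(dashboard_view("administration", records))
```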

“If I had asked people what they wanted, they would have said faster horses.” – apocryphally attributed to Henry Ford
Image: Eadweard Muybridge, The Horse in Motion (1878)

Design and Consistency
- Context is critical: while data does not lie, it relies heavily on context to move from a generated data point to an actionable insight.
- Consistency is key: for learning analytics to pull and action meaningful outputs, there must be a consistent approach within the LMS and related systems. Consistency does not mean sameness, and LA should not stifle innovation.
- Clear, documented design processes are crucial: they provide not only a means to interrogate the data coming out of the system but also a tool to map and align outcomes (unit, course and graduate) to assessment and regulatory standards.
- Dashboards provide a single location to access high-level data, drill down to more granular levels, and customise user reports and interventions.
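As one small example of what a clearly documented design process enables (the outcome codes and mapping format are invented for illustration, not a Deakin data model), an outcome-to-assessment map can be checked automatically for unit learning outcomes that no assessment task covers:

```python
# Hedged sketch: check an outcome-to-assessment map for unit learning outcomes
# that no assessment task covers. Codes and the mapping format are invented.
def unassessed_outcomes(unit_outcomes, assessment_map):
    """assessment_map: {assessment task: [unit learning outcome codes it assesses]}."""
    assessed = {o for outcomes in assessment_map.values() for o in outcomes}
    return sorted(set(unit_outcomes) - assessed)

unit_outcomes = ["ULO1", "ULO2", "ULO3", "ULO4"]
assessment_map = {
    "Assignment 1": ["ULO1", "ULO2"],
    "Group project": ["ULO2", "ULO3"],
    "Exam": ["ULO1", "ULO3"],
}
print(unassessed_outcomes(unit_outcomes, assessment_map))  # ['ULO4']
```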

Over to you!
We will now spend 30 minutes workshopping your responses to a set of questions we have posed.
Your responses will be used to…?

Feedback on today’s session & next steps
Provide feedback on the project now or email: Brett.McLennan@deakin.edu.au