1
Measuring Engagement: Learning Analytics in Online Learning. Dr. Griff Richards, Thompson Rivers University Open Learning. Electronic Kazan, 19 April 2011.
2
Distance Learning in Canada (map: Kamloops, British Columbia, between the 49°N and 60°N parallels)
- Growing, especially in K-12 public schools
- 15% of 2008 high school graduates in BC took at least one online course
- Increase in "blended learning": online activities added to face-to-face (F2F) courses
3
Learning Engagement. The more a learner interacts with the content, and with their peers about the content, the more likely they are to internalize and remember it. (Richard Snow, 1980)
4
Interaction Equivalency Theory (Garrison & Anderson): the quality of the interaction matters more than its source, whether that source is the course content, the instructor, or other learners.
5
Learning Engagement (Kuh). Learning engagement promotes student retention and academic success (George Kuh). Engagement can be broadly measured for a campus using the National Survey of Student Engagement (NSSE).
6
Learning Engagement (Kuh) - Five Factors:
1. active and collaborative learning,
2. student-faculty interaction,
3. supportive campus environment,
4. enriching educational experiences, and
5. the level of academic challenge.
7
Engagement Online - the same five factors apply online:
1. active and collaborative learning,
2. student-faculty interaction,
3. supportive campus environment,
4. enriching educational experiences, and
5. the level of academic challenge.
8
Promoting Engagement = Promoting Learning
1. Collaborative learning activities:
- Small groups
- Positive interdependence
- Individual responsibility to the group
- Group reporting
- Individual reflection
- Peer formative evaluation
=> no place to hide.
9
Online Engagement
2. Student-Faculty Interaction
-> weekly pacing messages
-> rapid response to questions
-> re-broadcast of questions to the whole class
-> flexibility for individual needs
-> online "presence" in forums
10
Online Engagement
3. Supportive Online Environment
-> LMS, e.g. Moodle or Blackboard
-> Clear expectations and timelines
-> Facilitates interactions
-> Provides data on student interactions with the LMS
-> Synchronous conferencing tools
11
Online Engagement
4. Enriching educational experiences
-> Variety of learning activities
-> Project-based learning
-> Authentic assessment
-> Discovery and sharing
-> Challenge within the Zone of Proximal Development (Vygotsky)
12
Promoting Engagement
5. The level of academic challenge
-> Clear but demanding standards
-> Clear but specific instructions
-> Peer expectations
-> Building social capital in a learning community
13
Measuring Engagement
Engagement is "action based on internal motivation."
We cannot measure engagement directly; we must look for external traces: actions that indicate interest and involvement.
Kuh's NSSE samples hundreds of thousands of students every year -> it shows statistical trends, but not individual results, which arrive too late to correct problems!
14
Measuring Online Engagement
Engagement is inferred from activity.
Student interaction with online systems leaves an "exhaust trail" of their activity.
Learning analytics is not a statistical sample; it is all the data for all learners.
Questions: What patterns should we look for, and how should we interpret them?
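To make the "exhaust trail" concrete, here is a minimal sketch of a first pass over exported activity data: it tallies each learner's logged events by type. The CSV layout (timestamp, user, event columns) and the file name are assumptions for illustration only; real Moodle or Blackboard exports have their own schemas.

```python
# Minimal sketch: tally each learner's logged LMS events by type.
# The CSV columns (timestamp, user, event) are a hypothetical export
# layout, not the schema of any specific LMS.
import csv
from collections import Counter, defaultdict

def tally_activity(log_path):
    """Return {user: Counter of event types} from a simple activity log."""
    per_user = defaultdict(Counter)
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):      # expects columns: timestamp, user, event
            per_user[row["user"]][row["event"]] += 1
    return per_user

if __name__ == "__main__":
    activity = tally_activity("lms_activity_log.csv")   # hypothetical file
    for user, events in sorted(activity.items()):
        print(user, sum(events.values()), dict(events))
```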
15
LMS interaction data
e.g. a Moodle discussion: extract the linked list of who replied to whom, and plot the interactions as a star map.
Example: SNAPP
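To illustrate the "extract a linked list, plot a star map" step, here is a rough sketch of the kind of sociogram SNAPP produces, built with the networkx and matplotlib libraries. The reply pairs are invented sample data, and this is not SNAPP's own code.

```python
# Rough SNAPP-style sociogram: build a directed "who replied to whom" graph
# from forum posts and draw it as a star map. The reply pairs below are
# invented sample data, not an actual Moodle export.
import networkx as nx
import matplotlib.pyplot as plt

replies = [                         # (poster, person replied to)
    ("anna", "instructor"), ("boris", "instructor"),
    ("anna", "boris"), ("dina", "anna"), ("erik", "instructor"),
]

G = nx.DiGraph()
G.add_edges_from(replies)

pos = nx.spring_layout(G, seed=42)  # force-directed "star map" layout
nx.draw(G, pos, with_labels=True, node_size=1200,
        node_color="lightblue", arrowsize=15)
plt.title("Forum reply network (sample data)")
plt.show()
```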
16
SNAPP Visualization
17
SNAPP shot of the Welcome forum: students "engaged"
18
SNAPP shot of the Welcome forum: individuals with lower engagement, posting 3 or fewer messages
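A simple way to flag the pattern this slide points at (students with three or fewer messages) is sketched below. The post counts are invented sample data; the threshold of 3 follows the slide.

```python
# Minimal sketch: flag learners with 3 or fewer forum messages, the
# "lower engagement" pattern named on the slide. Post counts are invented.
post_counts = {"anna": 12, "boris": 2, "dina": 7, "erik": 1, "farid": 3}

LOW_ENGAGEMENT_THRESHOLD = 3
low_engagement = {name: n for name, n in post_counts.items()
                  if n <= LOW_ENGAGEMENT_THRESHOLD}

for name, n in sorted(low_engagement.items(), key=lambda kv: kv[1]):
    print(f"{name}: {n} message(s), may need a check-in")
```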
19
Does Activity = Engagement? Beer (2010) plotted total LMS interactions against academic grades. Does activity equal engagement? Is Blackboard more engaging than Moodle?
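The kind of analysis Beer (2010) describes, plotting total LMS interactions against grades, can be sketched as follows. The data points are invented for illustration, not Beer's dataset, and a correlation of this sort would not by itself show that activity equals engagement.

```python
# Sketch of the activity-versus-grade plot described on the slide.
# The (interactions, grade) pairs are invented illustration data;
# correlation here does not prove causation.
import numpy as np
import matplotlib.pyplot as plt

interactions = np.array([40, 120, 75, 200, 15, 160, 90, 55])
grades       = np.array([58,  74, 66,  88, 45,  81, 70, 62])

r = np.corrcoef(interactions, grades)[0, 1]
plt.scatter(interactions, grades)
plt.xlabel("Total LMS interactions")
plt.ylabel("Final grade (%)")
plt.title(f"Activity vs. grade (sample data, r = {r:.2f})")
plt.show()
```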
20
Activity Outside the LMS. As learners become engaged, they move to more engaging channels: email, Elluminate, Skype. This activity is not tracked by the LMS, so no data are available.
21
Interpretation of Analytics
Data patterns require verification; quantitative data requires interpretation.
--> make and test hypotheses
--> create useful models
When we measure something, we risk changing it: if learners know we count hits, they may make meaningless hits to fool the system.
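As one possible shape for "make and test hypotheses", the sketch below tests whether students who passed posted more in the forums than students who did not, using a two-sample t-test. The post counts are invented sample data and the hypothesis is only an example; it is not an analysis from the presentation.

```python
# Sketch of testing one hypothesis: do students who passed post more in
# forums than students who did not? Post counts are invented sample data.
from scipy import stats

posts_passed = [12, 9, 15, 7, 11, 14]
posts_failed = [3, 5, 2, 6, 4]

t, p = stats.ttest_ind(posts_passed, posts_failed, equal_var=False)
print(f"t = {t:.2f}, p = {p:.3f}")   # a small p suggests the groups differ
```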
22
Analytics for Learners! The same analytics should also provide easy-to-understand dashboards for students.
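As one possible shape for such a learner-facing dashboard, the sketch below prints a short, plain-language summary comparing a student's weekly activity with the class median. The metric, the numbers, and the wording are assumptions for illustration, not a feature of any particular LMS.

```python
# Minimal sketch of a learner-facing dashboard: compare one student's
# weekly logins with the class median and report it in plain language.
# Metric names and sample numbers are assumptions for illustration only.
from statistics import median

weekly_logins = {"anna": 9, "boris": 2, "dina": 6, "erik": 4, "farid": 7}

def learner_dashboard(student):
    mine = weekly_logins[student]
    med = median(weekly_logins.values())
    trend = "above" if mine > med else "below" if mine < med else "at"
    return (f"Hi {student}: you logged in {mine} times this week, "
            f"{trend} the class median of {med}.")

print(learner_dashboard("boris"))
```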
23
How can analytics improve online learning? Measuring something is the first step: "Better to measure something than to measure nothing" (Scriven). We need more data than page hits. We also need to ask learners about their experience: what worked, and what needs improvement.
24
Confused feedback systems: over time, multiple fix-it systems have crept in. (Diagram goals: steady enrolments, credible learning, happy learners.)
25
Analytics can help build quality systems: a multi-layered system with timely qualitative and quantitative data, supporting strategic course redesign. (Diagram goals: steady enrolments, credible learning, happy learners.)
26
Dynamic Evaluation Model (Richards & Devries): the phases of Preparation, Conduct, and Reflection cut across Design, Facilitation, and Learning. Learning analytics operate at the activity level.
27
Dynamic Evaluation Model (continued): across Preparation, Conduct, and Reflection (Design, Facilitation, Learning), timely feedback enables quick fixes.
28
The Goldilocks Principle of Learning Analytics (cf. the Russian tale of Masha and the Bear): too little, too much, just right. We need to find the balance.
29
Thank you! Questions?
30
Measuring Engagement: Learning Analytics in Online Learning. Dr. Griff Richards, Thompson Rivers University Open Learning. Electronic Kazan, 19 April 2011.
31
4 places to improve online learning (learner interactions with course content, the instructor, and other learners):
1. Clearer design
2. Collaborative activities
3. Less lecture, more mentoring
4. More study skills