Core Methods in Educational Data Mining


1 Core Methods in Educational Data Mining
EDUC619 Spring 2019

2 The Homework Let’s go over the homework

3 Let’s go over some of the solutions you handed in….
I will call on a small number of you

4 Let’s go over some of the solutions you handed in….
If I call on you, please come up and discuss:
What you did
(If you're not the first person I call, please focus on how your solution differed from previous students'.)
How well it worked
If you're in the audience, please ask questions. But be nice…

5 Ruikun Han

6 Michael Frisone

7 Who Who cross-validated or held out a test set? Who didn't?

8 Who Who cross-validated with a student batch? Why?

9 Different batch variables
Ruiqing, you cross-validated with Avgtimelast3SDnormed as batch. Peiying, you cross-validated with time as batch. These are unusual choices. Can you provide some justification?
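For contrast with batching on a feature like time, here is a minimal sketch of student-batched cross-validation in pure Python (the data layout and names are illustrative, not taken from anyone's homework):

```python
# Sketch: student-level ("student batch") cross-validation, so that no
# student appears in both a training fold and its paired test fold.
from collections import defaultdict

def student_folds(rows, n_folds=5):
    """Split rows (dicts with a 'student' key) into n_folds train/test
    pairs, keeping all of a student's rows inside a single fold."""
    by_student = defaultdict(list)
    for row in rows:
        by_student[row["student"]].append(row)
    students = sorted(by_student)
    folds = []
    for k in range(n_folds):
        test_students = set(students[k::n_folds])  # round-robin assignment
        test = [r for s in sorted(test_students) for r in by_student[s]]
        train = [r for s in students if s not in test_students
                 for r in by_student[s]]
        folds.append((train, test))
    return folds
```

Batching on the student, rather than on rows or on a derived feature, estimates how well the detector generalizes to entirely new students, which is usually what we care about.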

10 Who used Python? Who used RapidMiner? Who used R?

11 Anyone else? Does anyone else in the audience have:
Something clever they did and want to share?
Something clever they didn't do but want to discuss?
A concern about how to do this right?

12 What mattered? What could you do to get better model performance?
(Without cheating)

13 Questions? Comments? Concerns?

14 Ch 2, V5, start at slide 6

15 Textbook/Readings

16 What is a behavior detector?

17 What are some of the methods for collecting ground truth for complex behavior?

18 What are some of the methods for collecting ground truth for complex behavior?
What are their advantages and disadvantages?

19 What are some indicators of ground truth for student success?

20 What are some indicators of ground truth for student success?
What are their advantages and disadvantages?

21 Thoughts on the San Pedro et al. case study?

22 Thoughts on the Sao Pedro et al. paper?

23 Thoughts on the Kai et al. paper?

24 Grain-sizes Which grain-size(s) were the detection focus for each paper/case study?

25 Grain-sizes What are the advantages and disadvantages of working at these different grain-sizes?
Student-level
Action-level
Observation-level
Problem/Activity-level
Day/Session-level
Lesson-level

26 Why… Why should we not expect (or want) detectors with Kappa = 0.75
for models built from training labels with inter-rater reliability of Kappa = 0.62?
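One way to see the issue: the same Kappa statistic, observed agreement corrected for chance agreement, is used both for inter-rater reliability and for detector goodness. A minimal implementation for binary labels:

```python
# Sketch: Cohen's kappa for two binary label sequences, usable both for
# coder-vs-coder reliability and for detector-vs-coder agreement.
def cohens_kappa(labels_a, labels_b):
    n = len(labels_a)
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Chance agreement from each rater's marginal base rates
    p1_a = sum(labels_a) / n
    p1_b = sum(labels_b) / n
    chance = p1_a * p1_b + (1 - p1_a) * (1 - p1_b)
    return (observed - chance) / (1 - chance)
```

Roughly: if two trained human coders only agree at Kappa = 0.62, then a detector scoring 0.75 against one coder's labels would be agreeing with that coder better than another human does, which suggests it is fitting that coder's idiosyncrasies rather than the underlying construct.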

27 Other questions, comments, concerns about lectures?

28 Basic HW 2

29 Basic HW 2 Take a couple of models
Apply some standard metrics to them
Available on TutorShop
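As a reference point, a sketch of the kind of standard metrics involved, for binary predictions (the function name and data here are illustrative, not the homework's required metric set):

```python
# Sketch: a few standard classifier metrics computed from binary
# ground-truth labels and binary predictions.
def classification_metrics(truth, pred):
    tp = sum(t == 1 and p == 1 for t, p in zip(truth, pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(truth, pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(truth, pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(truth, pred))
    return {
        "accuracy": (tp + tn) / len(truth),
        "precision": tp / (tp + fp) if tp + fp else 0.0,
        "recall": tp / (tp + fn) if tp + fn else 0.0,
    }
```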

30 Basic HW 2 If you did not click DONE (or were unable to) in Basic HW 1,
please go back and do that now
Sorry for the glitch

31 Other questions or comments?

32 Next Class February 20 Diagnostic Metrics
Baker, R.S. (2015). Big Data and Education. Ch. 2, V1, V2, V3, V4.
Jeni, L. A., Cohn, J. F., & De La Torre, F. (2013). Facing Imbalanced Data--Recommendations for the Use of Performance Metrics. Proceedings of Affective Computing and Intelligent Interaction (ACII).
Knowles, J. E. (2014). Of needles and haystacks: Building an accurate statewide dropout early warning system in Wisconsin. Madison, WI: Wisconsin Department of Public Instruction.
Kitto, K., Shum, S. B., & Gibson, A. (2018). Embracing imperfection in learning analytics. In Proceedings of the 8th International Conference on Learning Analytics and Knowledge (pp ). ACM.
Basic HW 2 due!

33 The End

