An Empirical Study of In-Class Labs on Student Learning of Linear Data Structures
Sarah Heckman, Teaching Associate Professor
Department of Computer Science, North Carolina State University
ICER 2015
Problem
CSC116 (Lecture/Lab): sections of 33 students, 1 instructor, 2 TAs
CSC216 (Lecture): 1-2 sections, 1 instructor, 2-3 TAs
CSC316 (Lecture): 1-2 sections, 1 instructor, 2-3 TAs
Transition! Retention!
Lab? In-Class Labs? Do Nothing?
Research Goal
To increase student learning and engagement through in-class laboratories on linear data structures
Hypothesis: active learning practices that involve larger problems would increase student learning and engagement
In-class Labs > Pair & Share
Research Questions
–Do in-class laboratories on linear data structures increase student learning on linear data structures exam questions when compared to active-learning lectures?
–Do in-class laboratories on linear data structures increase student engagement when compared to active-learning lectures?
Active Learning in CSC216
“engaging the students in the process of learning through activities and/or discussion in class, as opposed to passively listening to an expert” [Freeman, et al. 2014]
Control: Active-Learning Lectures
–2-5 pair & share exercises per class
–Submitted through Google Forms
Treatment: In-Class Labs
–Lab activity for the entire lecture period
–Pre-class videos introduced the topic
Study Participants
Metric | Section 001 | Section 002
# Enrolled | 85 | 102
Participants (completed course) | 49 | 60
Dropped/Withdrawn (consenting only) | 3 | 4
Women | 9 | 10
Meeting Time | TH 2:20-3:35p | MW 2:20-3:35p
Students self-selected into sections during the standard registration period.
Populations were similar as measured by a survey on experience with tooling and self-efficacy.
Methods
Quasi-Experimental
–Counter-balanced design
–Learning measured through exams
–Engagement measured through observations of class meetings
[Timeline figure: Array Lists → Linked Lists → Iterators → Exam, with observed class meetings marked]
Replication Materials: labs/csc216_labs.html
Student Learning – Exam 1
Part 4: Method Tracing with ArrayLists
Part 5: Writing an ArrayList Method
[Table: points, Section 001 and 002 means and SDs, and p-values for the Exam 1 Part 4 items and Part 5]
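The slides do not include the exam questions themselves; as context for “writing an ArrayList method,” here is a minimal, hedged Java sketch of the kind of array-based list operation such a question targets. The class and field names (SimpleArrayList, list, size) are illustrative assumptions, not course materials.

// Illustrative array-based list fragment (hypothetical, not the exam question).
public class SimpleArrayList<E> {
    private E[] list;  // backing array
    private int size;  // number of elements currently stored

    @SuppressWarnings("unchecked")
    public SimpleArrayList() {
        list = (E[]) new Object[10];
    }

    /** Inserts element at idx, shifting later elements one slot right. */
    public void add(int idx, E element) {
        if (idx < 0 || idx > size) {
            throw new IndexOutOfBoundsException();
        }
        if (size == list.length) {  // grow the backing array when full
            list = java.util.Arrays.copyOf(list, list.length * 2);
        }
        for (int i = size; i > idx; i--) {  // shift right, back to front
            list[i] = list[i - 1];
        }
        list[idx] = element;
        size++;
    }

    /** Returns the element at idx. */
    public E get(int idx) {
        if (idx < 0 || idx >= size) {
            throw new IndexOutOfBoundsException();
        }
        return list[idx];
    }
}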
Student Learning – Exam 2
Part 3: Linked Node Transformation
Part 5: Writing a LinkedList Method
[Table: points, Section 001 and 002 means and SDs, and p-values for Exam 2 Parts 3 and 5]
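Again as context (the actual exam items are not in the slides), a hedged Java sketch of the node relinking that a “linked node transformation” exercises; all names here are hypothetical.

// Illustrative linked-node fragment (hypothetical, not the exam question).
public class SimpleLinkedList<E> {
    private Node front;  // first node, or null when the list is empty
    private int size;

    private class Node {
        E data;
        Node next;
        Node(E data, Node next) {
            this.data = data;
            this.next = next;
        }
    }

    /** Adds element to the front: a single node relink. */
    public void addFront(E element) {
        front = new Node(element, front);
        size++;
    }

    /** Removes and returns the element at idx by relinking around its node. */
    public E remove(int idx) {
        if (idx < 0 || idx >= size) {
            throw new IndexOutOfBoundsException();
        }
        E removed;
        if (idx == 0) {
            removed = front.data;
            front = front.next;  // old front node becomes garbage
        } else {
            Node current = front;
            for (int i = 0; i < idx - 1; i++) {
                current = current.next;  // stop at the node before idx
            }
            removed = current.next.data;
            current.next = current.next.next;  // unlink the node at idx
        }
        size--;
        return removed;
    }
}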
Student Learning – Exam 3
Comprehensive 3-hour final exam
–Stack Using an ArrayList
–Queue Using a LinkedList
[Table: points, Section 001 and 002 means and SDs, and p-values for E3 Array, E3 Linked, and E3 Score]
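These two final-exam structures map onto short adapter classes. Below is a minimal Java sketch of a stack backed by java.util.ArrayList and a queue backed by java.util.LinkedList; the class and method names are assumptions for illustration, not the exam's. Pushing and popping at the ArrayList's back (rather than index 0) avoids an O(n) shift on every operation.

import java.util.ArrayList;
import java.util.LinkedList;
import java.util.NoSuchElementException;

// Hypothetical sketches, not the exam solutions.
class ArrayListStack<E> {
    private final ArrayList<E> list = new ArrayList<>();

    /** Push at the back, where ArrayList adds in O(1) amortized time. */
    public void push(E e) {
        list.add(e);
    }

    /** Pop from the back; no elements shift. */
    public E pop() {
        if (list.isEmpty()) {
            throw new NoSuchElementException();
        }
        return list.remove(list.size() - 1);
    }
}

class LinkedListQueue<E> {
    private final LinkedList<E> list = new LinkedList<>();

    /** Enqueue at the back: O(1) on a doubly linked list. */
    public void enqueue(E e) {
        list.addLast(e);
    }

    /** Dequeue from the front: also O(1). */
    public E dequeue() {
        if (list.isEmpty()) {
            throw new NoSuchElementException();
        }
        return list.removeFirst();
    }
}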
Student Engagement
–Observations for ArrayList and LinkedList class meetings
–Observers were graduate students and a colleague participating in a Teaching and Learning seminar
–Counts of students off topic during the lecture and exercise portions of the class
–Some inconsistent use of the observation protocol
Student Engagement
[Table: for nine observed class meetings (1 Lab, 2 Lecture, 3 Lab, 4 Lecture, 5 Lecture, 6 Lab, 7 Lecture, 8 Lab, 9 Lab), counts of students off topic during the lecture and exercise portions and questions of the teaching staff, with averages by class type]
Threats to Validity
External Validity
–Two sections of the same course, taught by the same instructor, in the same semester, at the same time of day
–Replication needed in other contexts to generalize further
–Could provide additional data points in future meta-analyses
Threats to Validity
Internal Validity
–Selection bias: students selected their own sections
  Initial surveys showed the groups were similar
–Confounding factors
  Materials shared between groups
  Effect size: only 6 in-class labs
–Differential attrition bias
  Considered “soft drops” in the study
–Experimenter bias
  Participants were not revealed until after the semester was over
Threats to Validity
Construct Validity
–Exams as measures of learning
  Exams 1 and 2 were similar, but not identical, between sections
  Exam 3 was common
  Do the exams really measure student learning?
–Survey
  Wording may be confusing for prior tool experience
  Efficacy questions were not a validated instrument
–Observation protocol as measure of engagement
  Inconsistent use by observers
Discussion
Did in-class labs increase student learning?
–No, at least not as measured by exam questions
–Both control and intervention were active learning; maybe a simple active learning intervention is enough
–Comparisons with earlier semesters may show more
Did in-class labs increase student engagement?
–Yes and no
–The atmosphere in the classroom was fantastic
–But many questions were about technology, not concepts
Completion: 72% of students earned a C or higher
–Not reaching the higher levels of completion we expect from the active learning literature
Future Work
Additional work on Fall 2014 data
–Compare results on the final exam with previous courses
–Incorporate analysis of other measures of learning: projects, exercises, etc.
Starting in Fall 2015
–Additional in-class labs → lab-based course
–Measure types of questions asked during in-class labs
–Use labs as a way to encourage best practices (frequent commits to version control, TDD); see the sketch below
Replication Materials: labs/csc216_labs.html
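As one illustration of the TDD practice mentioned above, here is a minimal JUnit 5 test of the kind a student might write before implementing a list method. It reuses the hypothetical SimpleArrayList sketched after the Exam 1 slide; the test and class names are illustrative assumptions, not course materials.

import static org.junit.jupiter.api.Assertions.assertEquals;
import org.junit.jupiter.api.Test;

public class SimpleArrayListTest {

    /** Written before add() is implemented: the test pins down expected behavior. */
    @Test
    public void testAddShiftsLaterElements() {
        SimpleArrayList<String> list = new SimpleArrayList<>();
        list.add(0, "apple");
        list.add(1, "cherry");
        list.add(1, "banana");  // insert in the middle
        assertEquals("apple", list.get(0));
        assertEquals("banana", list.get(1));
        assertEquals("cherry", list.get(2));
    }
}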
Thank You!
Questions? Comments? Concerns? Suggestions?