1
Monitoring Student Activity in Collaborative Software Development
Daniel Dietsch, Andreas Podelski, Jaechang Nam, Pantelis M. Papadopoulos, Martin Schäf
Presented by Jesse Coultas, Computer Science, University of Illinois at Chicago
2
Paper Overview The goal is to develop a learning environment that provides real-time feedback on a student's progress. Monitor undergraduate computer science students working in groups on a software development project for a class. Identify points of low engagement and insufficient collaboration patterns so the teaching team is able to provide assistance. The study involved students from two different terms of the course.
3
Research Questions What can be accurately and efficiently monitored?
Which metrics relate to the degree of students' individual engagement and to the collaboration patterns that occur inside the groups? How can these metrics trigger instructional interventions from the instructors or an intelligent tutoring system?
4
Conceptualization How engaged is a group member?
What is a group member’s level of contribution? What methods of collaboration are used by a group member?
5
Method: Group Forming Students completed a questionnaire used to form the groups. The questionnaire measured the following categories: personal data, programming experience, software development experience, domain knowledge, practical experience, motivation, and questionnaire quality. Groups were formed to minimize inter-group variance. 20 groups in total, with 4-5 students per group.
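As a concrete illustration of the group-forming step, here is a minimal Python sketch of one possible balancing heuristic. The paper does not specify its actual algorithm; the greedy "lowest running total" rule, the function name, and the scores below are all hypothetical.

```python
# Hypothetical sketch of balanced group formation: assign students so that
# aggregate questionnaire scores are similar across groups (low inter-group
# variance). NOT the paper's algorithm; a simple greedy heuristic.

def form_groups(scores, n_groups):
    """scores: {student: overall questionnaire score}. Returns list of groups."""
    groups = [[] for _ in range(n_groups)]
    totals = [0.0] * n_groups
    # Take students from strongest to weakest, always adding the next one
    # to the group with the lowest running total.
    for student, score in sorted(scores.items(), key=lambda kv: -kv[1]):
        i = min(range(n_groups), key=lambda g: totals[g])
        groups[i].append(student)
        totals[i] += score
    return groups

scores = {"s1": 9, "s2": 8, "s3": 7, "s4": 6, "s5": 5, "s6": 4, "s7": 3, "s8": 2}
print(form_groups(scores, 2))
```

With the scores above, both groups end up with the same total (22), which is the inter-group balance the slide describes.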
6
Method: Communication
Group Mailing List Includes the students in the group, the assigned TA, and the instructors. Weekly Modified Scrum Includes the students in the group and the assigned TA. Students define weekly sprint goals and review previously set goals. The TA provides input on tasks and answers technical and organizational questions.
One of the important things in Scrum is that the team meets every day, even for just a few minutes, to discuss their priorities for the day, what they are blocked on, and what and who else might be affected. This is in addition to the weekly/bi-weekly longer sprint meetings. It didn't appear the team did that. I'd be curious to know the effects of such differences. - Harish
7
Method: Communication (continued)
IRC Chat Students anonymously chat with TAs and instructors. Supervised Programming in the Computer Lab 4-hour sessions for groups to work on the project; TA and instructors are on hand to assist with technical questions.
8
Method: Tools Version Control (SVN) – Log data analyzed
Bug Tracking (Trac) – Log data analyzed Nightly Build Automation – Provides nightly feedback to the group Microsoft Visual Studio 2010 – IDE for development JetBrains ReSharper – Plugin to improve code quality
9
Operationalization: Qualitative
Students submit weekly written reports. Review of student e-mails to the group mailing list. TA analysis through interactions. Red, Yellow, or Green status for a group, based on the TA's assessment of whether the group will complete the tasks discussed for that week.
They discuss some concepts without elaborating on them, like what the criteria were for Red, Yellow, and Green (a rubric). I gather that they indicate the number of reports the TAs had to write, but later they present this metric as an indication of a group's ability to finish the project? - Aditi
10
Operationalization: Quantitative
Track the number of e-mails sent to the group mailing list. Track usage of the bug tracker (Trac). Track the number of commits (code and binary). Track file ownership.
I don't know if the number of commits is a fair metric. I would much rather look at statistics like patch sizes, unit tests, and code coverage. Thoughts? - Harish
I was confused by the metrics they use, especially the number of commits. I feel the quality, or the number of lines modified as a fraction of the lines in the entire modified file, would have been a better measure. I could make 10 commits just beautifying the code without contributing anything useful in terms of functionality, or one student could be in charge of making the small changes that came up in the TA discussions (like renaming variables or fixing syntax errors), contributing to larger commits, which might not correctly reflect the coding process. - Aditi
Also, when counting ownership the authors do not consider who the owners are; they only consider the number of people who modified the file. Would that affect the analysis in any way? A file could have 4 owners with only one of them contributing most of its lines. - Aditi
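The commit and ownership metrics above can be derived from version-control log data. The Python sketch below is illustrative only, not the paper's actual tooling; the entry format (author plus list of changed paths) and all names are assumptions.

```python
# Illustrative sketch: compute commits per student and owners per file
# from parsed version-control log entries. Each entry is (author, [paths]).
from collections import Counter, defaultdict

def activity_metrics(log_entries):
    commits = Counter()              # author -> number of commits
    owners = defaultdict(set)        # file -> set of authors who touched it
    for author, paths in log_entries:
        commits[author] += 1
        for path in paths:
            owners[path].add(author)
    # Files with shared ownership: touched by more than one student.
    shared = {f for f, authors in owners.items() if len(authors) > 1}
    return commits, owners, shared

log = [
    ("alice", ["src/main.cs", "src/util.cs"]),
    ("bob",   ["src/util.cs"]),
    ("alice", ["docs/spec.txt"]),
]
commits, owners, shared = activity_metrics(log)
print(commits["alice"], sorted(shared))  # → 2 ['src/util.cs']
```

The `shared` set is the same "files with shared ownership" quantity that the collaboration-pattern slide later correlates with group outcomes.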
11
Student Engagement Roles and Responsibilities
Lead Coder Committed 50% or more of the project code; occurred in 13 groups. Lead Designer Committed 80% or more of the project binaries; occurred in 12 groups. As the project progressed, only 4 groups still had the above roles. Group Leader A student holding both the lead coder and lead designer roles; occurred in (at least) 3 groups. Free Riders Typically less than 10% contribution; occurred in 15 groups, 19 in total.
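The role thresholds listed above can be expressed as a small classifier. This Python sketch uses the percentages from the slide (50% of code for lead coder, 80% of binaries for lead designer, under 10% for free riders); the exact counting rules in the paper may differ, and all names and numbers below are illustrative.

```python
# Hedged sketch: assign engagement roles from per-student commit counts,
# using the thresholds described on the slide. Counting rules are assumed.

def roles(code_commits, binary_commits):
    """Maps each student to a list of role labels."""
    total_code = sum(code_commits.values())
    total_bin = sum(binary_commits.values())
    out = {}
    for s in set(code_commits) | set(binary_commits):
        code_share = code_commits.get(s, 0) / total_code if total_code else 0.0
        bin_share = binary_commits.get(s, 0) / total_bin if total_bin else 0.0
        tags = []
        if code_share >= 0.5:                      # 50%+ of project code
            tags.append("lead coder")
        if bin_share >= 0.8:                       # 80%+ of project binaries
            tags.append("lead designer")
        if code_share < 0.1 and bin_share < 0.1:   # under ~10% contribution
            tags.append("free rider")
        out[s] = tags
    return out

r = roles({"a": 60, "b": 35, "c": 5}, {"a": 2, "b": 18, "c": 0})
print(r["a"], r["b"], r["c"])  # → ['lead coder'] ['lead designer'] ['free rider']
```

A "group leader" in the slide's sense would simply be a student tagged with both "lead coder" and "lead designer".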
12
Student Engagement Amount of Effort
Number of tickets and commits from log data. Self-reported number of hours.
13
Student Engagement Activity Correlation
Activities are mostly positively correlated. Hours appear to be correlated with activities, as expected.
14
Collaboration Patterns
Group State Files with shared ownership. The last state of each group was strongly correlated with the group's final grade in the course. A high number indicates overlapping work and possible miscommunication. An accelerating rate suggests troubled collaboration. Users with more commits tended to share ownership of more files.
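The reported correlation between a group's final shared-ownership count and its course grade can be checked with a plain Pearson coefficient. A minimal sketch, assuming hypothetical per-group numbers (the real data is in the paper):

```python
# Pearson correlation between per-group shared-ownership counts and grades.
# The data below is made up for illustration; only the formula is standard.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

shared_files = [3, 8, 15, 20]   # hypothetical final shared-ownership counts
grades = [9.0, 7.5, 6.0, 4.0]   # hypothetical final group grades
print(round(pearson(shared_files, grades), 2))  # → -0.99
```

In this made-up example the correlation is strongly negative, consistent with the slide's reading that a high shared-ownership count signals overlapping work and miscommunication; the paper's actual coefficients should be taken from the paper itself.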
15
Discussion
The authors started out with three main research questions, but I felt they did not touch upon the third research question, on how intervention could be triggered. - Aditi
I wonder if the data would be less meaningful if the students were required to put more effort into the reporting. - Andrew
Results - A. Student Profile and Performance: "students' performance was satisfying (M = 7.50, SD = 2.52)"? I assume the previous statistics were from the -2 to 2 Likert-scale questionnaire; what was the scale for this one? 0-10? - Arthur
The paper could be clearer about what was used for the statistical analysis. - Arthur