Presenter: Han, Yi-Ti Adviser: Chen, Ming-Puu Date: Jan 19, 2009. Sitthiworachart, J. & Joy, M. (2008). Computer support of effective peer assessment in an undergraduate programming class. Journal of Computer Assisted Learning, 24, 217–231.

Computer support of effective peer assessment in an undergraduate programming class. Sitthiworachart, J. & Joy, M. (2008). Journal of Computer Assisted Learning, 24, 217–231.

Some students see learning as a matter of memorizing and comprehending knowledge only to cope with course requirements; these are surface learning strategies (Entwistle 2001). Others see learning as a way to satisfy their own need to develop new skills by relating previous knowledge to new experiences; these are deep learning strategies (Entwistle 2001). However, most programming tools support only surface learning, for example by helping with program construction, compilation, testing and debugging (Deek & McHugh 1998). Peer assessment involves students in both the learning and the assessment process. It is a tool for learning: students can learn through marking, by making judgements and providing feedback on other students’ work (Brown et al. 1997; Davies 2000). Introduction

We are interested in finding out whether peer assessment, a powerful technique for fostering deep learning, is also an accurate assessment method in a computer programming course. The investigation was performed on 213 first-year undergraduate students enrolled on a UNIX programming module in the authors’ Computer Science department. During the process, the students marked and provided feedback on three consecutive assignments; each assignment was marked by an anonymized group of three students, using a web-based peer assessment system and an anonymous communication tool. The assignments were also independently double-marked by two module tutors, to provide an expert reference against which the marks awarded through the peer assessment process could be compared. Introduction

Methodology The peer assessment process consists of three separate stages: (1) students complete the assignment in their own time; (2) students mark the quality of the programming; (3) students mark the quality of the initial marking.
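The allocation step behind stage (2) — each assignment marked by an anonymized group of three students — can be sketched in Python. The paper does not describe its web-based system at code level, so the function name `allocate_markers` and the ring-based scheme below are assumptions, not the authors' implementation; the sketch only illustrates how a system could guarantee that no student marks their own work while spreading the marking load evenly.

```python
import random

def allocate_markers(student_ids, group_size=3, seed=0):
    """Anonymously assign each submission to a group of peer markers.

    Hypothetical sketch (not the study's actual system): shuffle the
    students into a ring, then let each submission be marked by the next
    `group_size` students around the ring. No student can mark their own
    work, and every student marks exactly `group_size` submissions.
    """
    rng = random.Random(seed)  # fixed seed keeps the allocation reproducible
    ring = list(student_ids)
    rng.shuffle(ring)
    n = len(ring)
    allocation = {}
    for i, author in enumerate(ring):
        # The group_size students following the author in the shuffled
        # ring become the (anonymous) markers of the author's submission.
        allocation[author] = [ring[(i + k) % n] for k in range(1, group_size + 1)]
    return allocation
```

With 213 students, as in the study, each student would mark exactly three peers' assignments; the system would expose only anonymous IDs to markers, in keeping with the anonymized groups described above.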

In the third stage, students marked the quality of the initial marking by responding to the 10 questions that form the marking criteria, and throughout the process they discussed the work anonymously using the Anonymous Communication Device (ACD). Methodology

In addition to the analysis of both peers’ and tutors’ marking, questionnaires and interviews were used to ascertain how satisfied students were with their marks, together with their opinions on the marking and the feedback they received. In this experiment, the average of the peers’ marks was higher than the average of the tutors’ marks by 17%, 7% and 8% in assignments 1, 2 and 3, respectively. The difference between the average marks decreased from assignment 1 to 3 because students gained marking experience from the first assignment: they came to know what markers were looking for, and they learned more about how to mark properly. Results
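The comparison behind these figures amounts to simple averaging. A minimal sketch, assuming each submission's final peer mark is the mean of its three peer marks and the tutor mark is the double-marking average; the function names and the sample data are illustrative, not taken from the paper, and the sketch treats the reported percentages as absolute mark differences:

```python
def final_peer_mark(peer_marks):
    """Final mark for one submission: the mean of its peer marks."""
    return sum(peer_marks) / len(peer_marks)

def average_gap(peer_marks_per_student, tutor_marks_per_student):
    """Class-level gap (in marks) between the peer and tutor averages."""
    peer_avg = sum(peer_marks_per_student) / len(peer_marks_per_student)
    tutor_avg = sum(tutor_marks_per_student) / len(tutor_marks_per_student)
    return peer_avg - tutor_avg

# Illustrative data only: three peer marks for each of two submissions.
peer_final = [final_peer_mark(marks) for marks in [[70, 80, 90], [60, 70, 80]]]
tutor_final = [72, 64]
gap = average_gap(peer_final, tutor_final)  # positive: peers marked higher
```

A positive gap, as in all three assignments of the study, indicates peers marking more generously than tutors on average.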

There was the strongest positive relationship between the peers’ and the tutors’ marks in assignment 1 (r = 0.85, n = 166, P < ). However, the correlation coefficient in assignment 3 (r = 0.62, n = 166, P < ) was lower than the correlation coefficients in assignments 1 and 2. This may be because assignment 3 was the most difficult assignment, and students have many assignments to finish at the end of term; they may therefore not devote as much time to the marking process as they did earlier in the term. The differences between the peers’ and the tutors’ marks may result from different marking perspectives: students tend to give full marks to a program that they think is good, whereas tutors tend to give around 70% for the same program. It is interesting that students tend to be comfortable with high marks, whereas tutors do not. Results
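The reported r values are Pearson product-moment correlations between each submission's peer mark and its tutor mark. A self-contained sketch of that computation, using the standard textbook formula rather than any code from the study:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two paired mark lists."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Covariance term and the two standard-deviation terms (unnormalized;
    # the shared 1/n factors cancel in the ratio).
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)
```

An r of 0.85, as in assignment 1, means peers ranked submissions very similarly to tutors even where their absolute marks differed, which is consistent with the generosity gap described above.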

Marking criteria

These results are supported by a comment on the marking criteria from a tutor, who found the subjective questions difficult to mark consistently compared with the objective questions. Marking criteria

Comment issues It can be concluded that peers identify similar comment issues to tutors (because both use the same marking criteria), and that peers provide more explanations and suggestions than tutors, who give concise feedback without further elaboration. However, the two groups focus on different issues: for example, tutors focus on program correctness (especially whether the program meets the specification and whether there are programming mistakes), while peers focus on program readability and style.

Questionnaire and interview analysis Are the comments from peers useful? Most students found the comments from peers, with their suggestions for improving programming ability, to be useful, and said the comments helped them when doing the next assignment. Are students satisfied with marks from peer assessment? Students awarded two sets of marks in peer assessment (i.e. quality of the program and quality of the marking). Results from the questionnaires indicate that 74% of students (116 out of 156) were satisfied with their marks from peer assessment.

Questionnaire and interview analysis Do students feel comfortable when assigning marks? Results from the online questionnaire indicate that 122 students felt comfortable when assigning marks, but 44 students did not. The major problem seems to be that students believe they are not qualified to be markers, because they have only just started to learn how to program and do not have enough knowledge to mark their friends’ work.

Most students were actively involved in the learning process, were satisfied with the marks given by their peers, and accepted that the comments from peers were useful. Students became familiar with the assessment process and with what was required to be achieved, and as they practiced their marking, their confidence increased. Conclusions

These results suggest that the web-based peer assessment system can be used to promote deep learning, and to develop students’ professional skills by requiring them to make evaluative judgments and provide specific feedback on other students’ work.