Constructivist Learning with Participatory Examinations

Dezhi Wu, Michael Bieber, S. Roxanne Hiltz
Information Systems Department, College of Computing Sciences, New Jersey Institute of Technology

Hyo-Joo Han
Information Systems Department, College of Information Technology, Georgia Southern University

Outline
– Motivation
– Participatory Exam approach
– A bit of theory
– Experimental results
– Interesting issues

Motivation
– To increase learning of course content
– Learning through active engagement:
  – involving students as active participants
  – in the full exam life-cycle
  – through peer evaluation
– Minimize overhead for instructors

PE Process
– Each student creates 2 exam problems
– The instructor edits the problems if necessary
– Each student solves 2 problems
– Students evaluate (grade) the solutions to the problems they authored, writing detailed justifications
– Other students evaluate each solution a second time (see the assignment sketch below)
– The instructor gives a final grade
– Optional: students can dispute their solution's grade by evaluating it themselves and writing detailed justifications; the instructor resolves the dispute
– All entries are posted on-line

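To make the mechanics concrete, here is a minimal sketch of the assignment step. It is not the authors' software: the rotation scheme and the sample roster are illustrative assumptions, chosen only so that no student solves or second-grades their own problem.

```python
# A minimal sketch of the assignment step in the Participatory Exam process.
# Assumptions (not from the slides): a simple rotation scheme and the sample
# roster below. Each student authors 2 problems and solves 2 problems written
# by others; each solution gets a level-1 grade (from the problem's author)
# and a level-2 grade (from a third student).

def assign_participatory_exam(students):
    """Return (author, problem_no, solver, level2_grader) tuples.

    A rotation guarantees that no student solves their own problem and
    that the level-2 grader is neither the author nor the solver.
    """
    n = len(students)
    assert n >= 5, "the rotation below needs at least 5 students"
    assignments = []
    for i, author in enumerate(students):
        for problem_no, offset in enumerate((1, 2), start=1):
            solver = students[(i + offset) % n]       # never the author
            level2 = students[(i + offset + 2) % n]   # never the author or the solver
            assignments.append((author, problem_no, solver, level2))
    return assignments

if __name__ == "__main__":
    roster = ["Ana", "Ben", "Chen", "Dana", "Eli"]   # hypothetical class
    for author, problem_no, solver, level2 in assign_participatory_exam(roster):
        print(f"{author}'s problem #{problem_no}: solved by {solver}, "
              f"level-1 grade by {author}, level-2 grade by {level2}")
```
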
Screen shot: the WebBoard system

Exam Process Control (process-flow diagram)

Instructor control process:
– Course design; set up the on-line environment
– Assign IDs
– Edit questions
– Assign who answers which questions
– Assign level-2 graders
– Resolve disputes
– Determine final grades

Student learning process:
– Make up problems
– Solve problems
– Level-1 and level-2 graders grade solutions
– Dispute the final grade
– Read other problems, other solutions, grade justifications, and disputes

Process flow: learning comes from doing the PE activities, with additional learning from reading everything peers write.

Exam Process Control (diagram repeated)
The same process flow as the previous slide, with one additional student step: confirm ID and understand the process (a consolidated stage list appears below).

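The two swimlanes above can also be restated compactly as data. The sketch below is a hypothetical representation, not the authors' implementation; the stage names follow the diagram, while the exact ordering and the data structure are assumptions.

```python
# A sketch of the exam-process control flow as (stage, actor) pairs.
# Stage names follow the diagram; the ordering and this representation
# are illustrative assumptions.

from collections import namedtuple

Stage = namedtuple("Stage", ["name", "actor"])

PE_WORKFLOW = [
    Stage("Course design; set up on-line environment", "instructor"),
    Stage("Assign IDs", "instructor"),
    Stage("Confirm ID, understand process", "student"),
    Stage("Make up problems", "student"),
    Stage("Edit problems", "instructor"),
    Stage("Assign who solves problems; assign level-2 graders", "instructor"),
    Stage("Solve problems", "student"),
    Stage("Level-1 and level-2 graders grade solutions", "student"),
    Stage("Dispute final grade (optional)", "student"),
    Stage("Resolve disputes", "instructor"),
    Stage("Determine final grades", "instructor"),
    Stage("Read peers' problems, solutions, grade justifications, disputes", "student"),
]

def stages_for(actor):
    """Return the stages one role performs, in process order."""
    return [s.name for s in PE_WORKFLOW if s.actor == actor]

if __name__ == "__main__":
    print("Instructor control process:", stages_for("instructor"))
    print("Student learning process:  ", stages_for("student"))
```
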
Evaluation (Grading)
Evaluation includes:
– A written critique or "justification" (positive or negative)
– Optional: separate sub-criteria to critique, for example these four (weights total 100%):
  – Solution result is correct and complete (40%)
  – Solution was well explained (30%)
  – Solution demonstrated class materials well (10%)
  – Solution cited appropriate references (20%)
– A grade
The evaluation may be disputed (optional); a student must re-evaluate their own solution when disputing. (A worked example of the weighted sub-criteria follows below.)

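Here is a minimal worked example of combining the four sub-criteria into a single grade. Only the weights come from the slide; the 0-100 scale and the sub-scores are made up for illustration.

```python
# Combining the four example sub-criteria into a single grade.
# Weights are from the slide and total 100%; the 0-100 sub-scores below
# are made-up illustrations.

WEIGHTS = {
    "correct_and_complete":   0.40,
    "well_explained":         0.30,
    "demonstrates_materials": 0.10,
    "appropriate_references": 0.20,
}

def weighted_grade(sub_scores):
    """Weighted sum of per-criterion scores (each on a 0-100 scale)."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9
    return sum(WEIGHTS[name] * sub_scores[name] for name in WEIGHTS)

example = {
    "correct_and_complete":   85,
    "well_explained":         90,
    "demonstrates_materials": 100,
    "appropriate_references": 60,
}
print(round(weighted_grade(example), 2))   # 0.4*85 + 0.3*90 + 0.1*100 + 0.2*60 = 83.0
```
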
The instructor should provide…
– Detailed instructions and a timetable
– Solution: what is expected
– Critiquing and grading guidelines

Constructivism (Learning Theory)
– The central idea is that human learning is constructed: learners build new knowledge upon the foundation of previous learning {learning throughout the exam process}
– Two classic categorizations:
  – Cognitive Constructivism (Piaget's theory)
  – Social Constructivism (Vygotsky's theory)

Cognitive Constructivism (Piaget 1924)
– Knowledge is constructed and made meaningful through the individual's interactions with, and analyses of, the environment --> knowledge is constructed in the mind of the individual
– Knowledge construction is totally student-centered

Learning
– Learning is a constructivist, often social activity occurring through knowledge building (Vygotsky, 1978)
– Knowledge-building activities include contributing to, authoring within, discussing, sharing, exploring, and deploying a collective knowledge base (O'Neill & Gomez 1994; Perkins 1993)

Learning
– People learn as they navigate to solve problems (Koschmann et al. 1996) and design representations of their understanding (Suthers 1999)
– Learning requires cognitive flexibility (Spiro et al. 1991), and results from interaction with people having different experiences and perspectives (Goldman-Segall et al. 1998)

Expert-like Deep Learning
– Categorizing knowledge and constructing relationships between concepts are likely to promote expert-like thinking about a domain (Bransford 2000)
– To design appropriate problems for their peers, students must organize and synthesize their ideas and learn to recognize the important concepts in the domain
– This results in deep learning (Entwistle 2000):
  – seeing relationships and patterns among pieces of information
  – recognizing the logic behind the organization of material
  – achieving a sense of understanding

Where Is Knowledge Constructed in PE?
– In all PE stages: constructing problems, solutions, grade justifications, and dispute justifications
– When reading everything their peers write
  – Students are also motivated to learn more when peers will read their work (McConnell, 1999)

Assessment & Learning
– Main goals of tests:
  – to measure student achievement
  – to motivate and direct student learning
– The process of taking a test and discussing its grading should be a richly rewarding learning experience (Ebel and Frisbie 1986)
– Assessment should be a fundamental part of the learning process (Shepard 2000)

Course Information
– NJIT CIS 677: Information System Principles
– Graduate-level introductory survey core course (Masters/Ph.D.)
– Aim: study how IS/IT can be used effectively
– Both on-campus and distance-learning sections (software: WebBoard)
– Traditional exam: three hours, in class, 3-4 essay questions, 6 pages of notes
– Used PE 5 times between Fall 1999 and Summer 2002
– We compared control groups without PE and treatment groups with PE
– Also used PE with shorter essay questions in CIS 365, an undergraduate course on file structures, in Fall 2002, with similar survey results

Enjoyability

Questions | SA | A | N | D | SD | Mean | S.D. | #
I enjoyed the flexibility in organizing my resources | 26.2% | 48.9% | 16.7% | 3.6% | 4.6% | 3.88 | 1.00 | 221
I was motivated to do my best work | 23.5% | 42.9% | 28.2% | 3.4% | 2.1% | 3.82 | 0.92 | 238
I enjoyed the examination process | 17.2% | 42.3% | 22.6% | 10.5% | 7.4% | 3.51 | 1.13 | 239

SA - strongly agree (5 points); A - agree (4); N - neutral (3); D - disagree (2); SD - strongly disagree (1); the mean is out of 5 points; S.D. - standard deviation.
Cronbach's Alpha = 0.68

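As a sanity check on how the reported means relate to the response distributions (assuming a simple weighted average, which the slides do not state explicitly), weighting each Likert point by its response proportion reproduces the first item's mean:

```python
# Reproducing the reported mean for "I enjoyed the flexibility in organizing
# my resources" from its response percentages. SA=5 ... SD=1, as defined in
# the table key; assuming the mean is a simple weighted average.

points      = [5, 4, 3, 2, 1]                      # SA, A, N, D, SD
proportions = [0.262, 0.489, 0.167, 0.036, 0.046]  # percentages as fractions

mean = sum(p * w for p, w in zip(points, proportions))
print(f"{mean:.3f}")   # ~3.885, consistent with the reported mean of 3.88
```
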
Perceived Learning

Questions | SA | A | N | D | SD | Mean | S.D. | #
I learned from making up questions | 17.9% | 42.5% | 21.3% | 13.8% | 4.5% | 3.55 | 1.08 | 240
I learned from grading other students' answers | 17.7% | 48.1% | 19.4% | 9.3% | 5.5% | 3.63 | 1.06 | 237
I learned from reading other people's answers | 15.8% | 45.0% | 22.1% | 11.3% | 5.8% | 3.54 | 1.07 | 240
I demonstrated what I learned in class | 13.6% | 50.2% | 22.6% | 10.9% | 2.7% | 3.61 | 0.95 | 221
My ability to integrate facts and develop generalizations improved | 21.8% | 49.2% | 25.6% | 2.1% | 1.3% | 3.88 | 0.83 | 238
I learned to value other points of view | 17.6% | 51.9% | 27.6% | 1.3% | 1.6% | 3.82 | 0.81 | 239
I mastered the course materials | 7.4% | 51.6% | 31.4% | 6.9% | 2.7% | 3.54 | 0.84 | 188

Cronbach's Alpha = 0.88

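The reliability figure quoted under each table is Cronbach's alpha. The sketch below shows the standard formula applied to raw item responses; the response matrix is made up, since the slides report only summary statistics.

```python
# Generic Cronbach's alpha computation:
#   alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)
# The 5-point responses below are hypothetical; they are not the study's data.

import numpy as np

def cronbach_alpha(responses):
    """responses: 2-D array, rows = respondents, columns = scale items."""
    x = np.asarray(responses, dtype=float)
    k = x.shape[1]
    item_var = x.var(axis=0, ddof=1).sum()     # sum of per-item variances
    total_var = x.sum(axis=1).var(ddof=1)      # variance of each respondent's total
    return (k / (k - 1)) * (1 - item_var / total_var)

sample = [  # six hypothetical respondents, three Likert items
    [4, 5, 4],
    [3, 4, 4],
    [5, 5, 5],
    [2, 3, 3],
    [4, 4, 3],
    [3, 3, 4],
]
print(round(cronbach_alpha(sample), 2))
```
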
Recommendation: Do Again!

Question | SA | A | N | D | SD | Mean | S.D. | #
Would you recommend in the future that this exam process be used? | 20.7% | 40.1% | 24.5% | 8.9% | 5.8% | 3.60 | 1.10 | 237

Similar results for CIS 365, an undergraduate file structures course using short essay questions (Fall 2002).

Trade-offs
– Trade-offs for students (traditional vs. PE):
  – Participation: solutions only vs. the entire exam life-cycle
  – Timing: concentrated vs. drawn-out (2.5 weeks)
  – Access to information: limited vs. the Internet
– Trade-offs for professors:
  – Fewer solutions to evaluate, but each is different
  – Timing: concentrated vs. drawn-out process
  – Much more administration

What Students Liked Best
– Active involvement in the exam process
– Flexibility
– Reduction in tension

Issue: Perceived Fairness
– Q: Should students evaluate/grade peers?
  – A: But they must evaluate others in the workplace…
– Q: It's the instructor's job to evaluate and grade.
  – A: PE is a (constructivist) learning technique.
– Q: Students have no training in evaluation.
  – A: Evaluation is a skill that must be learnt (and taught).
– Q: Many evaluators = inconsistent quality.
  – A: There is a safeguard in the PE process: disputing!

Extending the PE Approach
– Which activities?
  – So far: exams
  – What about: quizzes, homework, larger projects, in-class projects?
– Which problem types?
  – So far: short and long essay questions
  – What about: multiple choice, short answer, computer programs, semester projects?

Extending the PE Approach: Degree of Evaluation
– Currently: students only evaluate solutions
– What about also evaluating:
  – the quality of problems (how good was the problem?)
  – the quality of evaluations/grades (how good was the grading?)
– All of these could be disputed

Full Collaboration
– Groups for problems, solutions, evaluation, and dispute arbitration
– Requires group process support:
  – Group roles: leader, scheduler, etc.
  – Process: work on each activity together or separately, with internal review
  – Grading of individual group members
  – Process tools: brainstorming, voting, etc.

Support Software
We plan to develop support software to:
– Guide students (what to do next)
– Provide GSS tools for collaboration
– Manage administration for the instructor
– Minimize overhead for students
– Minimize overhead for instructors

PE: Contributions
– A systematic technique to increase learning
  – Constructivist approach, actively engaging students in the entire problem life-cycle
  – Minimizes overhead for students and instructors
– Experimental evaluation
– Supporting software
– Looking for collaborators to try this out with us!

Thank you! Questions, please?