Constructivist Learning with Participatory Examinations

Dezhi Wu, Michael Bieber, S. Roxanne Hiltz
Information Systems Department, College of Computing Sciences
New Jersey Institute of Technology

Hyo-Joo Han
Information Systems Department, College of Information Technology
Georgia Southern University
Outline
- Motivation
- Participatory Exam approach
- A bit of theory
- Experimental results
- Interesting issues
Motivation
- To increase learning of course content
- Learning through active engagement
  – involving students as active participants
  – in the full exam life-cycle
  – through peer evaluation
- Minimize overhead for instructors
PE Process
- Each student creates 2 exam problems
- Instructor edits the problems if necessary
- Each student solves 2 problems
- Students evaluate (grade) the solutions to the problems they authored, writing detailed justifications
- Other students evaluate each solution a second time
- Instructor gives a final grade
- Optional: students can dispute their solution's grade by evaluating it themselves and writing detailed justifications; the instructor resolves the dispute
- All entries are posted on-line
Screen shot: the WebBoard system
Exam Process Control (process flow diagram)
Instructor control process:
- Course design; set up on-line environment
- Assign IDs; edit problems; assign who solves which problems; assign level-2 graders
- Resolve disputes; determine final grades
Student learning process:
- Confirm ID and understand the process
- Make up problems; solve problems
- Level-1 and level-2 graders grade solutions; dispute final grade
- Read other problems, other solutions, grade justifications, and disputes
Process flow: learning comes from doing the PE activities, plus additional learning from reading everything peers write
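To make the instructor's assignment step concrete (deciding who solves each authored problem and who serves as level-2 grader), here is a minimal sketch in Python. It is not the authors' actual tooling: the function names, roster, and rotation scheme are assumptions; the only constraints taken from the slides are that each student authors and solves two problems, that the author acts as the level-1 grader, and that a different student acts as the level-2 grader.

import random

def next_distinct(students, start_idx, exclude):
    """Return the first student at or after start_idx (cyclically) not in exclude."""
    n = len(students)
    for step in range(n):
        candidate = students[(start_idx + step) % n]
        if candidate not in exclude:
            return candidate
    raise ValueError("class is too small for distinct author/solver/grader roles")

def assign_participatory_exam(students, problems_per_student=2, seed=0):
    """Assign each authored problem a solver and a level-2 grader.

    The author is the level-1 grader of the solution, so the solver and the
    level-2 grader must both differ from the author (and from each other).
    """
    assert len(students) >= max(3, problems_per_student + 1)
    roster = students[:]                     # shuffled copy stands in for anonymized IDs
    random.Random(seed).shuffle(roster)
    assignments = []
    for i, author in enumerate(roster):
        for k in range(problems_per_student):
            # Fixed rotation: each student ends up solving exactly
            # problems_per_student problems authored by others.
            solver = next_distinct(roster, i + 1 + k, {author})
            grader2 = next_distinct(roster, i + 2 + k, {author, solver})
            assignments.append({
                "problem": f"{author}-P{k + 1}",
                "author_and_level1_grader": author,
                "solver": solver,
                "level2_grader": grader2,
            })
    return assignments

# Example: a class of five students, two problems each (ten assignments in all).
for row in assign_participatory_exam(["s1", "s2", "s3", "s4", "s5"]):
    print(row)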
Evaluation (Grading)
Evaluation includes:
– Written critique or "justification" (positive or negative)
– Optional: separate sub-criteria to critique; an example of four sub-criteria (totaling 100%):
  - Solution result is correct and complete (40%)
  - Solution was well explained (30%)
  - Solution demonstrated class materials well (10%)
  - Solution cited appropriate references (20%)
– Grade
Evaluation may be disputed (optional)
– Student must re-evaluate own solution when disputing
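To show how the optional sub-criteria combine, here is a small sketch of a weighted grade. The criterion names and weights come from the example on the slide above; the 0-100 per-criterion scoring scale and the function itself are illustrative assumptions, not part of the PE specification.

SUB_CRITERIA = {
    "correct_and_complete": 0.40,
    "well_explained": 0.30,
    "demonstrates_class_materials": 0.10,
    "appropriate_references": 0.20,
}

def weighted_grade(scores):
    """Combine per-criterion scores (0-100 each) into one grade out of 100."""
    assert abs(sum(SUB_CRITERIA.values()) - 1.0) < 1e-9   # weights total 100%
    missing = set(SUB_CRITERIA) - set(scores)
    if missing:
        raise ValueError(f"missing sub-criterion scores: {missing}")
    return sum(weight * scores[name] for name, weight in SUB_CRITERIA.items())

# Example evaluation: strong solution, weakly referenced.
print(weighted_grade({
    "correct_and_complete": 90,
    "well_explained": 85,
    "demonstrates_class_materials": 80,
    "appropriate_references": 50,
}))   # 0.4*90 + 0.3*85 + 0.1*80 + 0.2*50 = 79.5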
Instructor should provide…
- Detailed instructions and timetable
- Solutions: what is expected
- Critiquing and grading guidelines
Constructivism (Learning Theory)
- The central idea is that human learning is constructed: learners build new knowledge upon the foundation of previous learning {learning throughout the exam process}
- Two classic categorizations:
  – Cognitive Constructivism (Piaget's theory)
  – Social Constructivism (Vygotsky's theory)
Cognitive Constructivism (Piaget 1924)
- Knowledge is constructed and made meaningful through an individual's interactions with and analyses of the environment
  --> knowledge is constructed in the mind of the individual
- Knowledge construction is totally student-centered
Learning
- Learning is a constructivist, often social, activity occurring through knowledge building (Vygotsky 1978)
- Knowledge-building activities include contributing to, authoring within, discussing, sharing, exploring, and deploying a collective knowledge base (O'Neill & Gomez 1994; Perkins 1993)
Learning
- People learn as they navigate to solve problems (Koschmann et al. 1996) and design representations of their understanding (Suthers 1999)
- Learning requires cognitive flexibility (Spiro et al. 1991) and results from interaction with people having different experiences and perspectives (Goldman-Segall et al. 1998)
Expert-like Deep Learning
- Categorizing knowledge and constructing relationships between concepts are likely to promote expert-like thinking about a domain (Bransford 2000)
- To design appropriate problems for their peers, students must organize and synthesize their ideas and learn to recognize the important concepts in the domain
- This results in deep learning (Entwistle 2000):
  – seeing relationships and patterns among pieces of information
  – recognizing the logic behind the organization of material
  – achieving a sense of understanding
Where is Knowledge Constructed in PE?
- In all PE stages: constructing problems, solutions, grade justifications, and dispute justifications
- When reading everything their peers write
  – Students are also motivated to learn more when peers will read their work (McConnell 1999)
Assessment & Learning
- Main goals of tests:
  – To measure student achievement
  – To motivate and direct student learning
- The process of taking a test and discussing its grading should be a richly rewarding learning experience (Ebel and Frisbie 1986)
- Assessment should be a fundamental part of the learning process (Shepard 2000)
Course Information
- NJIT CIS 677: Information System Principles
- Graduate-level introductory survey core course (Masters/Ph.D.)
- Aim: study how IS/IT can be used effectively
- Both on-campus and distance-learning sections; software: WebBoard
- Traditional exam: three hours, in class, 3-4 essay questions, 6 pages of notes
- Used PE 5 times between Fall 1999 and Summer 2002
- We compared control groups without PE and treatment groups with PE
- We also used PE with shorter essay questions in CIS 365, an undergraduate course on file structures, in Fall 2002, with similar survey results
Enjoyability (Cronbach's Alpha = 0.68)

Question                                                SA      A       N       D       SD
I enjoyed the flexibility in organizing my resources    26.2%   48.9%   16.7%    3.6%    4.6%
I was motivated to do my best work                      23.5%   42.9%   28.2%    3.4%    2.1%
I enjoyed the examination process                       17.2%   42.3%   22.6%   10.5%    7.4%

SA - strongly agree (5 points); A - agree (4); N - neutral (3); D - disagree (2); SD - strongly disagree (1); the mean is out of 5 points; S.D. - standard deviation
Perceived Learning (Cronbach's Alpha = 0.88)

Question                                                              SA      A       N       D       SD
I learned from making up questions                                    17.9%   42.5%   21.3%   13.8%    4.5%
I learned from grading other students' answers                        17.7%   48.1%   19.4%    9.3%    5.5%
I learned from reading other people's answers                         15.8%   45.0%   22.1%   11.3%    5.8%
I demonstrated what I learned in class                                13.6%   50.2%   22.6%   10.9%    2.7%
My ability to integrate facts and develop generalizations improved    21.8%   49.2%   25.6%    2.1%    1.3%
I learned to value other points of view                               17.6%   51.9%   27.6%    1.3%    1.6%
I mastered the course materials                                        7.4%   51.6%   31.4%    6.9%    2.7%
Recommendation: Do Again!

Question                                                              SA      A       N       D      SD
Would you recommend in the future that this exam process be used?    20.7%   40.1%   24.5%   8.9%   5.8%

Similar results for CIS 365: undergraduate file structures course using short essay questions (Fall 2002)
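The result slides above report SA/A/N/D/SD percentage breakdowns, a mean on the 5-point scale, and Cronbach's alpha for each question set. The sketch below shows how such summary statistics are typically computed; the respondent-level matrix is made-up illustrative data, not the study's data, and only the 5-point coding (SA = 5 … SD = 1) comes from the slides.

from statistics import pvariance

def likert_mean(percentages, points=(5, 4, 3, 2, 1)):
    """Mean on the 5-point scale from the SA/A/N/D/SD percentage breakdown."""
    total = sum(percentages)
    return sum(p * pt for p, pt in zip(percentages, points)) / total

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a list of items, each a list of respondent scores."""
    k = len(item_scores)
    respondent_totals = [sum(col) for col in zip(*item_scores)]
    item_var_sum = sum(pvariance(item) for item in item_scores)
    total_var = pvariance(respondent_totals)
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Mean for the first enjoyability item, from its percentage breakdown.
print(likert_mean([26.2, 48.9, 16.7, 3.6, 4.6]))      # about 3.9 out of 5

# Alpha for three items answered by five hypothetical respondents.
print(cronbach_alpha([[5, 4, 4, 3, 5],
                      [4, 4, 5, 3, 4],
                      [5, 3, 4, 2, 5]]))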
Trade-offs
- Trade-offs for students (traditional vs. PE):
  – Participation: solutions only vs. entire exam life-cycle
  – Timing: concentrated vs. drawn-out (2.5 weeks)
  – Access to information: limited vs. the Internet
- Trade-offs for professors:
  – Fewer solutions to evaluate, but each is different
  – Timing: concentrated vs. drawn-out process
  – Much more administration
What students liked best
- Active involvement in the exam process
- Flexibility
- Reduction in tension
Issue: Perceived Fairness
- Q: Should students evaluate/grade peers?
  – A: But they must evaluate others in the workplace…
- Q: It's the instructor's job to evaluate and grade
  – A: PE is a (constructivist) learning technique
- Q: Students have no training in evaluation
  – A: Evaluation is a skill that must be learned (and taught)
- Q: Many evaluators = inconsistent quality
  – A: Safeguard in the PE process: disputing!
Extending the PE Approach
- Which activities?
  – So far: exams
  – What about: quizzes, homework, larger projects, in-class projects
- Which problem types?
  – So far: short and long essay questions
  – What about: multiple choice, short answer, computer programs, semester projects
Extending the PE Approach: Degree of Evaluation
- Currently: students only evaluate solutions
- What about evaluating:
  – quality of problems (how good was the problem?)
  – quality of evaluations/grades (how good was the grading?)
- All could be disputed
Full Collaboration
- Groups for:
  – Problems, solutions, evaluation, dispute arbitration
- Requires group process support:
  – Group roles: leader, scheduler, etc.
  – Process: work on each activity together or separately, internal review
  – Grading of individual group members
  – Process tools: brainstorming, voting, etc.
Support Software
- We plan to develop support software to:
  – Guide students (what to do next)
  – Provide GSS tools for collaboration
  – Manage administration for the instructor
  – Minimize overhead for students
  – Minimize overhead for instructors
PE: Contributions
- Systematic technique to increase learning
  – Constructivist approach, actively engaging students in the entire problem life-cycle
  – Minimizes overhead for students and instructors
- Experimental evaluation
- Supporting software
- Looking for collaborators to try this out with us!

Thank you! Questions, please?