Constraint-based tutoring


Constraint-based tutoring
CPI 494, Feb 17, 2009
Kurt VanLehn, ASU

Outline
- SQL-Tutor (previous class)
- ER-Tutor & NORMIT (today)
- Evaluation & Dissemination (today)

Framework for comparing step-based tutoring systems

Feature                           | SQL-Tutor                                  | Cognitive Tutor
----------------------------------|--------------------------------------------|--------------------------------------------------
Step granularity & user interface | Goal boxes; large & fixed granularity      | Goal boxes; granularity varies
Interpreting a student's step     | Ideal solution + constraints; string match | Rules that generate the solution; exact match, intervals, regular expressions, etc.
Suggesting good steps             | Next blank step                            | Next step in order
Feedback and hints on steps       | Few bugs; selectable hint from sequence    | Few bugs; hint sequence
Task selection                    | Some student choice                        | Mastery learning
Assessment                        | Clauses? Constraints?                      | Knowledge tracing
Evaluations                       |                                            |
Dissemination                     |                                            |
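To make the "Interpreting a student's step" row concrete: in constraint-based modeling, each constraint pairs a relevance condition (does this constraint apply to the student's solution?) with a satisfaction condition (if it applies, is it met?), and a step is flagged exactly when a relevant constraint is unsatisfied. The sketch below is a hypothetical miniature in the spirit of SQL-Tutor's clause-by-clause matching; the `Constraint` class, the dictionary representation of a query, and the feedback text are assumptions for illustration, not SQL-Tutor's actual code.

```python
# Minimal sketch of constraint-based step checking (hypothetical representation).
from dataclasses import dataclass
from typing import Callable

@dataclass
class Constraint:
    name: str
    relevance: Callable[[dict, dict], bool]     # does this constraint apply?
    satisfaction: Callable[[dict, dict], bool]  # if it applies, is it met?
    feedback: str

def check_solution(student: dict, ideal: dict,
                   constraints: list[Constraint]) -> list[str]:
    """Return feedback for every constraint that is relevant but violated."""
    return [c.feedback
            for c in constraints
            if c.relevance(student, ideal) and not c.satisfaction(student, ideal)]

# One constraint in the spirit of SQL-Tutor's clause-by-clause string matching:
# if the ideal solution needs a WHERE clause, the student's must match it.
where_clause = Constraint(
    name="where-clause-correct",
    relevance=lambda s, i: i.get("WHERE") is not None,
    satisfaction=lambda s, i: s.get("WHERE") == i.get("WHERE"),
    feedback="Check your WHERE clause against the conditions in the problem.",
)

student = {"SELECT": "title", "FROM": "movie", "WHERE": None}
ideal   = {"SELECT": "title", "FROM": "movie", "WHERE": "year > 2000"}
print(check_solution(student, ideal, [where_clause]))
# -> ['Check your WHERE clause against the conditions in the problem.']
```

The design point this illustrates: the tutor needs no procedure for generating solutions, only declarative conditions that any correct solution must satisfy, which is the main contrast with the rule-based model tracing in the Cognitive Tutor column.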

Let's add two other constraint-based tutors

The same framework table as above, now with two additional, still-empty columns for NORMIT and ER-Tutor, to be filled in over the next slides.

Steps?
- Looks like one goal box per page
- Sequence of pages is fixed: "…determine candidate keys, the closure of a set of attributes, prime attributes, simplify functional dependencies, determine normal forms, and, if necessary, decompose the table. The sequence is fixed: the student will only see a Web page corresponding to the current task."

"Self-explanation" of steps
- The first time a step is done, the student must select an explanation from a menu
- On subsequent occurrences, an explanation is required only if the step is incorrect
- If the explanation is incorrect, the student is asked to select the correct definition of a concept
- The menu options are stored with the constraints
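A minimal sketch of the gating policy just described, assuming hypothetical function names; NORMIT's real menus and bookkeeping are richer than this:

```python
# Hypothetical miniature of NORMIT's self-explanation policy.
seen: set[str] = set()

def explanation_required(step_type: str, step_correct: bool) -> bool:
    """First occurrence of a step type always requires an explanation;
    on later occurrences, only an incorrect step does."""
    first_time = step_type not in seen
    seen.add(step_type)
    return first_time or not step_correct

def respond_to_explanation(explanation_correct: bool) -> str:
    """An incorrect explanation triggers a follow-up question:
    select the correct definition of the underlying concept."""
    return ("accept explanation" if explanation_correct
            else "ask student to select the correct definition of the concept")

assert explanation_required("determine-candidate-keys", step_correct=True)    # first time
assert not explanation_required("determine-candidate-keys", step_correct=True)
assert explanation_required("determine-candidate-keys", step_correct=False)   # wrong step
```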

NORMIT

Feature                           | NORMIT                                            | ER-Tutor | SQL-Tutor                                          | Cognitive Tutor
----------------------------------|---------------------------------------------------|----------|----------------------------------------------------|--------------------------------------------------
Step granularity & user interface | Goal boxes + explanations                         |          | Goal boxes; large & fixed granularity              | Goal boxes; granularity varies
Interpreting a student's step     | Generates the solution + constraints; exact match |          | Ideal solution + constraints; string match         | Rules that generate the solution; exact match, intervals, regular expressions, etc.
Suggesting good steps             | Next step in strict sequence                      |          | Next blank step                                    | Next step in order
Feedback and hints on steps       | On demand                                         |          | On demand; few bugs; selectable hint from sequence | Immediate; few bugs; hint sequence
Task selection                    | ?                                                 |          | Some student choice                                | Mastery learning
Assessment                        |                                                   |          | Clauses? Constraints?                              | Knowledge tracing
Evaluations                       |                                                   |          |                                                    |
Dissemination                     |                                                   |          |                                                    |

ER-Tutor
- Whenever a new object is created, the student highlights the corresponding text in the problem statement
- Steps are matched against an ideal solution

ER-Tutor

Feature                           | NORMIT                                            | ER-Tutor                                                            | SQL-Tutor                                          | Cognitive Tutor
----------------------------------|---------------------------------------------------|---------------------------------------------------------------------|----------------------------------------------------|--------------------------------------------------
Step granularity & user interface | Goal boxes + explanations                         | Canvas, but nouns act as goals                                      | Goal boxes; large & fixed granularity              | Goal boxes; granularity varies
Interpreting a student's step     | Generates the solution + constraints; exact match | Ideal solution + constraints; exact match on text, node type, links | Ideal solution + constraints; string match         | Rules that generate the solution; exact match, intervals, regular expressions, etc.
Suggesting good steps             | Next step in strict sequence                      | ?                                                                   | Next blank step                                    | Next step in order
Feedback and hints on steps       | On demand                                         | On demand                                                           | On demand; few bugs; selectable hint from sequence | Immediate; few bugs; hint sequence
Task selection                    | ?                                                 | ?                                                                   | Some student choice                                | Mastery learning
Assessment                        |                                                   |                                                                     | Clauses? Constraints?                              | Knowledge tracing
Evaluations                       |                                                   |                                                                     |                                                    |
Dissemination                     |                                                   |                                                                     |                                                    |

Outline
- SQL-Tutor (previous class)
- ER-Tutor & NORMIT (today)
- Evaluation & Dissemination (today)

Evaluation vs. assessment
- "Evaluation" is used for determining the effectiveness of instruction
- "Assessment" is used for determining the competence of a student

Classic evaluation design

Experimental instruction                 | Baseline/control/comparison instruction
-----------------------------------------|-----------------------------------------
Pre-training (e.g., read short textbook) | Pre-training (ditto)
Pre-test                                 | Pre-test
Training (e.g., use SQL-Tutor)           | Training (e.g., use paper & pencil)
Post-test                                | Post-test

Three types
- Control the number of problems solved: time to complete training varies; score on post-tests varies
- Control training time: problems completed varies; score on post-tests varies
- Control post-test score by repeating training until mastery: time to mastery varies; problems solved varies

Classic stats
- If pre-test scores are not significantly different, then a t-test on post-test scores can be used
- Better: gain_score = post-test_score − pre-test_score
- T-test on gain_score: most say "don't use"
- ANCOVA on post-test, with pre-test as covariate
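For concreteness, here is a sketch of all three analyses in Python with SciPy and statsmodels; the group sizes, simulated scores, and effect are illustrative only, not data from any of the evaluations below.

```python
# Sketch of the three classic analyses on simulated pre/post scores.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

rng = np.random.default_rng(0)
n = 30
pre_e = rng.normal(50, 10, n)            # experimental group pre-test
post_e = pre_e + rng.normal(15, 8, n)    # simulated larger gain
pre_c = rng.normal(50, 10, n)            # control group pre-test
post_c = pre_c + rng.normal(8, 8, n)     # simulated smaller gain

# 1. t-test on post-test scores alone: defensible only after checking
#    that the pre-test scores do not differ significantly.
print(stats.ttest_ind(pre_e, pre_c))     # pre-test equivalence check
print(stats.ttest_ind(post_e, post_c))

# 2. t-test on gain scores (simple, but often advised against).
print(stats.ttest_ind(post_e - pre_e, post_c - pre_c))

# 3. ANCOVA: regress post-test on condition with pre-test as covariate.
df = pd.DataFrame({
    "post": np.concatenate([post_e, post_c]),
    "pre": np.concatenate([pre_e, pre_c]),
    "cond": ["expt"] * n + ["control"] * n,
})
print(smf.ols("post ~ pre + C(cond)", data=df).fit().summary())
```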

Effect size
- Cohen's d:
  d = (mean(gain_score(expt)) − mean(gain_score(control))) / standard_deviation(gain_score(control))
- d = 1.0 is considered excellent; about one letter grade
- d = 0.5 is good
- d = 0.2 is questionable, unless from a classroom/field study
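A direct transcription of the slide's formula in Python; note it scales by the control group's standard deviation, a variant some texts call Glass's Δ rather than Cohen's d proper, and the parentheses around the mean difference matter.

```python
# Effect size on gain scores, as defined above.
import numpy as np

def effect_size(gain_expt: np.ndarray, gain_control: np.ndarray) -> float:
    """Mean gain difference, scaled by the control group's sample SD."""
    return (gain_expt.mean() - gain_control.mean()) / gain_control.std(ddof=1)

# With the simulated data from the previous sketch:
#   effect_size(post_e - pre_e, post_c - pre_c)
```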

Evaluation of SQL-Tutor
- Participation voluntary: 20 of 49 students in the course volunteered 2 hours for training (self-selection bias)
- No pre-test and no dedicated post-test; the course exam served as the post-test, counting only the questions relevant to SQL-Tutor
- T-test on means: p = 0.006
- Effect size: d = 0.66

Classroom evaluation of the Lisp Cognitive Tutor
- Tutor vs. standard Lisp environment
- Random assignment
- Whole minicourse
- Final exam; no pre-test
- T-test on means: highly significant
- Effect size: d = 1.0

Lab evaluation of the Lisp Tutor
- Same exercises, same text; tutor vs. Lisp alone
- Control students did not successfully complete all exercises

Evaluation of the Geometry Cognitive Tutor
- Tutor vs. baseline
- Whole classes; same teacher
- Whole minicourse
- Final exam; prior year's grade used as a covariate
- Contribution of the tutor to the regression was reliable
- Effect size: d = "more than 1.0"

Second evaluation of the Geometry Tutor
- 4 conditions:
  - Tutor + project teacher (best)
  - Tutor + non-project teacher
  - Paper + project teacher
  - Paper + non-project teacher
- The first condition was better than all the others, which tied
- Effect size: d = 1.0

Evaluation of the first Algebra Tutor
- Tutor vs. baseline
- Whole classes
- No significant differences