Mike Timms and Cathleen Kennedy University of California, Berkeley


FOSS: Using PADI to Develop an Online Self-Assessment System that Supports Student Science Learning
Mike Timms and Cathleen Kennedy, University of California, Berkeley
AERA, April 2005

Overview of presentation

We will describe an interactive self-assessment system that gives students feedback in the form of hints. It was developed to provide guided practice on FOSS force and motion problems in which students apply the equation for speed (velocity). We will cover:

- The tutoring foundations of the system: the philosophy of effective online tutoring that we employed
- A description of the system: how it looks and how it works
- How PADI supports the system, both in its design and in its implementation

© Mike Timms, 2005

Tutoring foundations of the system

Good tutors:
- Use hints. Tutors often offer a hint as the opening of a tutorial exchange.
- Time and structure their feedback. They give students just enough feedback, at the right time, to let them make a correction without giving too much away. When there are simultaneous errors, they prioritize the more significant ones, and they give different feedback at different times.
- Use a conversational style, saying "we" or "I": for example, "Why don't we go back to the previous step of the problem?" or "I would start at the top" (Johnson, 2003).

Selecting hints: the type of hint that is most beneficial depends on the learner's cognitive ability.
- Concrete hints suit less able students, who need help with steps: help with the steps along the way to a specific goal is effective in supporting performance throughout the task at hand.
- Abstract hints suit more able students: the abstractness of the hints should be matched to the learner's level of cognitive development. Function-oriented hints, which explain how a particular function or principle works, are more metacognitive and more effective in producing learning that transfers beyond the immediate task.
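As a minimal sketch of this ability-matched hint selection, the snippet below maps a numeric ability estimate to a concrete or abstract hint. The threshold, error category, and hint texts are all invented for illustration; they are not taken from the FOSS system.

```python
# Hypothetical sketch: choosing a hint style from an ability estimate.
# The threshold, error name, and hint texts are illustrative only.

def select_hint(ability_estimate: float, error: str) -> str:
    """Return a concrete hint for lower-ability students and an
    abstract, function-oriented hint for higher-ability students."""
    hints = {
        "wrong_equation": {
            "concrete": "Look at the problem again: you are given distance "
                        "and time. Pick the equation that solves for speed.",
            "abstract": "Which quantity is unknown? Choose the form of the "
                        "speed equation that isolates that quantity.",
        },
    }
    style = "concrete" if ability_estimate < 0.0 else "abstract"
    return hints[error][style]

print(select_hint(-0.8, "wrong_equation"))  # concrete, step-by-step hint
print(select_hint(1.2, "wrong_equation"))   # abstract, principle-level hint
```

In a real system the threshold would come from the scoring model rather than a hard-coded zero, and each error category would carry hints at several levels of abstractness.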

Conceptual diagram of the system

[Diagram: the FOSS Self-Assessment System, showing the Activity Selection Process, Presentation Process, Response Processing, and Summary Scoring Process, connected to a Task Specification Repository, the computer-based presentation to the Learner, and the Administrator.]

The assessment cycle is produced by the interaction of four processes:

1. Activity selection process: responsible for selecting and sequencing items from the self-assessment system. Here, items are selected using item shells and "realistic values" that are drawn randomly for each new problem. Students move through 10 levels of items that relate to progressions in the curriculum unit.

2. Presentation process: presents the tasks (items) to the student via computer. It takes details about the task from the task library and presents the problem statement, realistic values, and graphics that make up the task. At certain points, as the learner performs the task, the Presentation Process captures responses (Work Products) from the learner; these are periodically delivered to Response Processing for evaluation. The presentation of the task is governed by the Task Model, which describes the kinds of material that must be presented as well as the kinds of Work Products that are expected to be produced.

3. Response processing: performs the first step in the scoring process. It identifies the essential features of the response that provide evidence about the participant's current knowledge, skills, and abilities; these are recorded as a series of Observations that are passed to the next process. Based on prior evidence of performance, the system provides task-level feedback in the form of hints tailored to the learner's ability level.

4. Summary scoring process: performs the second, or summary, stage of scoring. It updates our beliefs about the learner's knowledge, skills, and abilities based on this evidence.

Through a series of "quick check" items at each of the 10 levels, the student's ability level is monitored, and both the student and the teacher can see a summary feedback report.

Adapted from Mislevy, Almond & Steinberg, 2001
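The four-process cycle above can be sketched in miniature. Everything in this sketch (the function names, value pools, and the quarter-point ability update) is invented to stand in for the real Activity Selection, Presentation, Response Processing, and Summary Scoring processes:

```python
import random

# Hypothetical sketch of the four-process assessment cycle; all names and
# numbers are illustrative, not from the actual PADI/FOSS implementation.

def instantiate_shell(level):
    """Activity selection: fill an item shell with random plausible values."""
    distance = random.choice([100, 150, 200])   # meters
    time = random.choice([10, 20, 25])          # seconds
    return {"level": level, "distance": distance, "time": time,
            "answer": distance / time}

def score_response(task, response):
    """Response processing: extract an observation from the work product."""
    return {"correct": abs(response - task["answer"]) < 1e-9}

def update_ability(ability, observation, step=0.25):
    """Summary scoring: nudge the ability estimate on the new evidence."""
    return ability + step if observation["correct"] else ability - step

def run_cycle(respond, levels=10, ability=0.0):
    """Presentation loop: deliver tasks, capture responses, update ability."""
    for level in range(1, levels + 1):
        task = instantiate_shell(level)
        observation = score_response(task, respond(task))
        ability = update_ability(ability, observation)
    return ability

# A responder who always computes speed = distance / time answers every
# level correctly and gains a quarter point per level.
print(run_cycle(lambda t: t["distance"] / t["time"]))  # 2.5
```

The real system separates these roles across components (and uses a proper measurement model rather than a fixed step), but the flow of selection, presentation, observation, and belief update is the same.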

Presentation interface

Problems were developed according to a common shell. Parts of the shell remain constant across types of problem, but the values of the variables used in the equation are drawn randomly from a database of plausible values.

Problem statement: this is the item shell into which plausible values are plugged to dynamically generate a problem. The student selects the equation to use from a drop-down menu of variations of the speed equation; only the selected equation is shown here.

Workspace: when the student hits the "select equation" button, the equation boxes appear in the workspace. The student fills in the values and calculates the answer.
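A toy version of this shell-plus-plausible-values generation might look like the following; the shell text and value pools are made up for the example, not taken from the FOSS item bank:

```python
import random

# Illustrative item shell: the template stays constant, the values vary.
SHELL = ("A {vehicle} travels {distance} meters in {time} seconds. "
         "What is its speed in meters per second?")

# Stand-in for the database of plausible values.
PLAUSIBLE_VALUES = {
    "vehicle": ["go-cart", "bicycle", "scooter"],
    "distance": [60, 90, 120],
    "time": [5, 6, 10],
}

def generate_problem():
    """Draw random plausible values and plug them into the shell."""
    values = {key: random.choice(pool)
              for key, pool in PLAUSIBLE_VALUES.items()}
    statement = SHELL.format(**values)
    answer = values["distance"] / values["time"]
    return statement, answer

statement, answer = generate_problem()
print(statement)
print(f"expected answer: {answer} m/s")
```

Because the shell and the value pools are stored separately, each student (and each retry) gets a fresh but structurally identical problem.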

Presentation interface: feedback

When the student hits the "check my work" button, they receive feedback on the problem. The system systematically searches for errors and gives feedback tailored to the student's ability level. This example shows concrete feedback for a lower-ability student: hints appear as text, combined with highlighting of the problem areas (in red) and of information that needs to be used in the solution (in green).
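A systematic error search like the one described can be sketched as a prioritized scan: check the equation choice first, then the filled-in values, then the arithmetic, and report the first (most significant) error found. The error categories and problem fields below are hypothetical:

```python
# Hypothetical sketch of the error search run by "check my work".
# Error names and the priority order are illustrative only.

def check_work(problem, selected_equation, filled_values, student_answer):
    """Scan for errors in priority order; return the first one found."""
    if selected_equation != "speed = distance / time":
        return "wrong_equation"
    if filled_values.get("distance") != problem["distance"]:
        return "wrong_distance_value"
    if filled_values.get("time") != problem["time"]:
        return "wrong_time_value"
    expected = problem["distance"] / problem["time"]
    if abs(student_answer - expected) > 1e-6:
        return "calculation_error"
    return None  # no errors found

problem = {"distance": 120, "time": 10}
result = check_work(problem, "speed = distance / time",
                    {"distance": 120, "time": 10}, 12.0)
print(result)  # None
```

Returning a single error code keeps the feedback focused: when several things are wrong at once, the student is hinted toward the most significant error first, matching the tutoring principle on the earlier slide.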

How PADI supports the system

[Diagram: Design Phase — the PADI Design System and PADI Database; Implementation Phase — the FOSS Self-Assessment System with its copy of the FOSS task specs, realistic values, and hints, its student database, and the PADI Scoring Engine.]

This shows how the PADI system supported both the design of the interactive assessment system and its implementation.

Design phase: the PADI Design System was used to develop the task specifications, or shells, which are stored in the PADI database.

Implementation phase: a copy of the task specifications is stored in the self-assessment application's local database, and tasks are filled in dynamically at delivery time using the plausible values and hints as needed. The self-assessment system maintains a database of student information (the student model). Student scores are sent to the PADI Scoring Engine after each practice level, when the student completes a set of three quick-check items. The scoring engine supplies ability estimates that the system uses to select appropriate hints for each student. This process repeats as students advance through the ten levels of problem solving.
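The slides do not describe the PADI Scoring Engine's internals, but as a rough illustration of how an ability estimate can be derived from a handful of scored quick-check items, the sketch below fits a Rasch-model ability by gradient ascent. The item difficulties, the fitting method, and all numbers are invented for this sketch; they are not the engine's actual algorithm:

```python
import math

# Illustrative only: estimating a Rasch-model ability from three scored
# quick-check items. Not the PADI Scoring Engine's actual method.

def rasch_ability(responses, difficulties, iterations=50, lr=0.5):
    """Estimate ability theta by gradient ascent on the Rasch
    log-likelihood, where P(correct) = 1 / (1 + exp(-(theta - b)))."""
    theta = 0.0
    for _ in range(iterations):
        gradient = sum(
            x - 1.0 / (1.0 + math.exp(-(theta - b)))
            for x, b in zip(responses, difficulties)
        )
        theta += lr * gradient
    return theta

# Two of three quick-check items correct, with increasing difficulty.
theta = rasch_ability([1, 1, 0], [-1.0, 0.0, 1.0])
print(round(theta, 2))  # about 0.8
```

An estimate like this, returned after each set of quick-check items, is what the hint-selection step consumes when deciding between concrete and abstract hints.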

FOSS: Using PADI to Develop an Online Self-Assessment System that Supports Student Science Learning
Mike Timms (mtimms@berkeley.edu)
Cathleen Kennedy (cakennedy@berkeley.edu)
AERA, April 2005