
1/27 CRESST/UCLA DIAGNOSTIC/PRESCRIPTIVE USES OF COMPUTER-BASED ASSESSMENT OF PROBLEM SOLVING
San-hui Sabrina Chuang
CRESST Conference 2007
UCLA Graduate School of Education & Information Studies
National Center for Research on Evaluation, Standards, and Student Testing

2/27 CRESST/UCLA OVERVIEW
- TECHNOLOGY FOR ASSESSMENT & CRESST MODEL
- ONGOING COLLABORATIVE PROBLEM SOLVING RESEARCH
- WHAT'S NEXT

TECHNOLOGY FOR ASSESSMENT & CRESST MODEL

4/27 CRESST/UCLA PRESUMED ADVANTAGES
- Provide consistent, high-quality assessment available on a large scale at remote sites
- Individualized testing
- Time/cost savings:
  - reduced testing time
  - quicker result reporting and system updates
  - rapid updating of testing materials
  - reduced reliance on highly skilled personnel

5/27 CRESST/UCLA POSSIBLE PROBLEM AREAS
- Cost
- Equity
- Fidelity
- Program maintenance
- Teacher attitudes

6/27 CRESST/UCLA PURPOSES OF TESTING AND ASSESSMENT (Individual/Team-Oriented)
- Individual/team certification
- Admissions and selection
- Placement
- Individual progress and student learning
- Diagnosis/prescription

7/27 CRESST/UCLA CRESST MODEL OF LEARNING: Learning at the center, linked to Content Understanding, Communication, Collaboration, Problem Solving, and Self-Regulation

8/27 CRESST/UCLA PROBLEM-SOLVING DEFINITION Problem solving is cognitive processing directed at achieving a goal when no solution method is obvious to the problem solver (Mayer & Wittrock, 1996)

9/27 CRESST/UCLA PROBLEM SOLVING
- Content Understanding
- Domain-Dependent Problem-Solving Strategies
- Self-Regulation
  - Metacognition: Planning, Self-Monitoring
  - Motivation: Effort, Self-Efficacy

10/27 CRESST/UCLA COLLABORATIVE PROBLEM SOLVING
- Content Understanding: environmental science domain knowledge
- Problem-Solving Strategies: (1) browsing, (2) searching, (3) Boolean operator use, (4) feedback accessing
- Self-Regulation: (1) planning, (2) self-checking, (3) effort, (4) self-efficacy
- Group Teamwork Processes: (1) adaptability, (2) coordination, (3) decision making, (4) interpersonal, (5) leadership, (6) communication

ONGOING COLLABORATIVE PROBLEM SOLVING RESEARCH

12/27 CRESST/UCLA CRESST’S KNOWLEDGE MAPPER

13/27 CRESST/UCLA COMPUTER-BASED ASSESSMENT
- Diagnosis: match the student's map against an expert map in real time
- Prescription: nature of the feedback tied to the diagnosis
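As a concrete illustration of the diagnosis step, here is a minimal sketch assuming knowledge maps are stored as sets of (concept, relation, concept) propositions. The function and scoring rule are invented for illustration; this is not CRESST's actual Knowledge Mapper algorithm.

```python
# Hypothetical sketch: score a student's knowledge map against an expert map.
# Maps are assumed to be sets of (concept, relation, concept) propositions.
def diagnose(student_map: set, expert_map: set) -> dict:
    """For each concept in the expert map, return the fraction of expert
    propositions involving that concept that the student map matches."""
    scores = {}
    concepts = {c for (a, _rel, b) in expert_map for c in (a, b)}
    for concept in concepts:
        expert_props = {p for p in expert_map if concept in (p[0], p[2])}
        matched = expert_props & student_map
        scores[concept] = len(matched) / len(expert_props)
    return scores

# Example: a partial student map vs. a two-proposition expert map.
expert = {("sunlight", "drives", "photosynthesis"),
          ("photosynthesis", "produces", "oxygen")}
student = {("sunlight", "drives", "photosynthesis")}
print(diagnose(student, expert))  # photosynthesis scores 0.5; oxygen 0.0
```

Prescription then follows from the diagnosis: the per-concept scores determine which feedback message each student receives.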

14/27 CRESST/UCLA THREE CHARACTERISTICS OF FEEDBACK
- Complexity of feedback: what information is contained in the feedback messages
- Timing of feedback: when the feedback is given to students
- Representation of feedback: the form in which the feedback is presented (text vs. graphics vs. audio)
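One way to see these three dimensions as a design space is as a small configuration record. The field names and value sets below are illustrative assumptions drawn from the studies described on later slides, not an actual coding scheme from the deck.

```python
from dataclasses import dataclass
from typing import Literal

# Illustrative only: a feedback condition characterized along the three
# dimensions named above (complexity, timing, representation).
@dataclass(frozen=True)
class FeedbackCondition:
    complexity: Literal["knowledge_of_response", "adapted", "task_specific"]
    timing: Literal["immediate", "user_defined", "after_action_review"]
    representation: Literal["text", "graphics", "audio"]

# e.g., one condition resembling the Chuang & O'Neil study described below:
condition = FeedbackCondition("task_specific", "user_defined", "text")
```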

15/27 CRESST/UCLA ONGOING COLLABORATIVE PROBLEM SOLVING RESEARCH
- Improve the nature of the task
- Improve communication messages
- Improve scoring efficiency
- Provide more extensive, complex feedback:
  - feedback on collaboration/teamwork
  - feedback on problem solving: content understanding, domain-specific problem-solving strategies

16/27 CRESST/UCLA Schacter et al.: Using a simulated Internet Web space, information-seeking processes significantly increased content understanding (map scores).

17/27 CRESST/UCLA Hsieh & O'Neil
1. Improved the task to a "real group task"
2. Provided two different types of feedback

Knowledge of Response: "Your map has been scored against an expert's map in environmental science. The feedback tells you how much you need to improve each concept in your map (i.e., A lot, Some, A little). Use this feedback to help you search to improve your map."

- A lot: Atmosphere, Bacteria, Decomposition, Sunlight
- Some: Climate, Carbon dioxide, Photosynthesis, Waste
- A little: Evaporation, Greenhouse gasses, Oxygen, Water cycle
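To make the binning concrete, here is a minimal sketch of how per-concept map scores could be sorted into the three categories. The 1/3 and 2/3 cutoffs are invented for illustration; the slides do not state the actual thresholds used.

```python
# Hypothetical thresholds -- the deck does not state the actual cutoffs.
def improvement_category(score: float) -> str:
    """Map a concept's match score (0..1) to a Knowledge of Response bin."""
    if score < 0.33:
        return "A lot"      # concept needs a lot of improvement
    elif score < 0.67:
        return "Some"
    else:
        return "A little"
```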

18/27 CRESST/UCLA Adapted Knowledge of Response (the above Knowledge of Response plus the following)
- Improvement: "You have improved the 'food chain' concept from needing 'A lot of improvement' to the 'Some improvement' category."
- Strategy: "It is most useful to search for information for the 'A lot' and 'Some' categories rather than the 'A little' category. For example, search for information on 'atmosphere' or 'climate' first, rather than 'evaporation.'"
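A hedged sketch of how the two adapted message types (improvement notices and strategy hints) could be generated from successive diagnoses. The message templates paraphrase the slide; the generation logic is an assumption.

```python
# Illustrative sketch of Adapted Knowledge of Response messages.
ORDER = {"A lot": 0, "Some": 1, "A little": 2}  # lower = weaker concept

def adapted_feedback(previous: dict, current: dict) -> list:
    """Compare per-concept categories across two scoring rounds and emit
    improvement and strategy messages (templates are paraphrased)."""
    messages = []
    for concept, cat in current.items():
        if concept in previous and ORDER[cat] > ORDER[previous[concept]]:
            messages.append(
                f'Improvement: you moved "{concept}" from the '
                f'"{previous[concept]}" to the "{cat}" category.')
    weakest = [c for c, cat in current.items() if cat in ("A lot", "Some")]
    if weakest:
        messages.append(
            f'Strategy: search for information on {", ".join(weakest[:2])} '
            f'first, rather than concepts needing only a little improvement.')
    return messages
```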

19/27 CRESST/UCLA Hsieh & O'Neil: Results
- Knowledge of Response Feedback vs. Adapted Knowledge of Response Feedback (Knowledge of Response plus an explanation of why each concept is correct or incorrect)
- Adapted Knowledge of Response Feedback was significantly better than Knowledge of Response Feedback
- Effect size: moderate

20/27 CRESST/UCLA Chuang & O'Neil
- User-defined feedback timing
- Adapted Knowledge of Response Feedback vs. Task-Specific Knowledge of Response Feedback (adapted feedback plus task-specific strategies for searching with Boolean operators)
- Task-Specific Knowledge of Response Feedback was significantly better than Adapted Knowledge of Response Feedback
- Effect size: moderate
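For flavor, the task-specific strategy messages concerned Boolean search. A purely illustrative helper for composing such a query, with invented names:

```python
# Illustrative: build a Boolean search query from the weakest concepts.
def boolean_query(concepts: list, operator: str = "AND") -> str:
    return f" {operator} ".join(f'"{c}"' for c in concepts)

print(boolean_query(["atmosphere", "climate"]))  # "atmosphere" AND "climate"
```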

21/27 CRESST/UCLA Chen et al. Feedback via After-Action Review

22/27 CRESST/UCLA Chen et al.: Results
- The AAR groups' overall number of communication messages was higher.
- The AAR groups changed their searching strategies and performed significantly more Boolean searches after receiving the feedback.
- However, the After-Action Review had no significant effect on improving the teams' content understanding.
- Chen suggested investigating the relationship between this type of feedback and cognitive load.

23/27 CRESST/UCLA Chuang & O'Neil
- Audio vs. text feedback (Mayer, 2001; Kalyuga, Chandler, & Sweller, 2004)
- After-Action Review feedback

WHAT’S NEXT

25/27 CRESST/UCLA GENERAL LESSONS LEARNED
- Computer-based assessment is feasible and effective
- Need sub-models of process
- The roles of task type and feedback may be critical for the assessment of collaborative problem solving

26/27 CRESST/UCLA WHAT'S NEXT
IN LAB SETTINGS
- Tailor reports to individual differences (high vs. low prior knowledge)
IN FIELD SETTINGS
- Scale up results to an applied Navy training environment
- The goal of the Navy training system is to reduce time in training while maintaining proficiency

27/27 CRESST/UCLA DYNAMIC TESTING
- A reconceptualization of our line of research (Vygotsky, Feuerstein)
- Dynamic testing can be viewed as the quantification of one's learning potential
  - Potential vs. actual: learning new things rather than demonstrating knowledge already acquired
- Differences between dynamic and static testing:
  - Emphasis on quantifying psychological process
  - Role of feedback: a static test has no explicit feedback; a dynamic test provides feedback after each item
  - Interaction is individualized
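In pseudocode terms, the contrast is a loop structure: a static test scores responses with no intervening feedback, while a dynamic test interleaves feedback with items and quantifies learning potential as the gain across the session. The sketch below is an assumption about how such a loop might look, not an implementation of any particular dynamic test; all names are invented.

```python
from statistics import mean

def dynamic_test(items, answer, give_feedback, score) -> float:
    """Administer items with feedback after each one (the dynamic-testing
    signature), then return an illustrative learning-potential estimate:
    mean score on the late half minus mean score on the early half.
    Assumes at least two items."""
    scores = []
    for item in items:
        response = answer(item)           # learner attempts the item
        scores.append(score(item, response))
        give_feedback(item, response)     # individualized feedback before the next item
    half = len(scores) // 2
    return mean(scores[half:]) - mean(scores[:half])
```

A static test would be the same loop without the give_feedback call, and its summary statistic would be a level of performance rather than a gain.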
