Presentation transcript:

UCLA Graduate School of Education & Information Studies
National Center for Research on Evaluation, Standards, and Student Testing
V3, 4/9/07
SIG: Technology as an Agent of Change in Teaching and Learning
Assessment and Technology: Identifying Current Trends and Issues
Using Technology for Measures of Complex Knowledge and Skill
William L. Bewley, UCLA CRESST
American Educational Research Association Annual Meeting
Chicago, IL - April 10, 2007

1 Overview
- Assessing the impact of technology on teaching and learning is problematic
- Means and Haertel (2003, p. 42): "We can't evaluate the impact of technology per se; instead, we must define more specific practices incorporating particular technology tools or supports and evaluate the effects of these."
- To do this, we need appropriate learning measures, particularly for complex cognitive skills using complex constructed responses
- Technology can help us do that
- But (stating the obvious), we still need to know what we're doing

2 Technology Is Not the Solution
- Technology is not the solution to better assessment, any more than it is the solution to better learning and teaching
- The solution is good assessment design, supported by appropriate uses of technology

3 The Solution
Good assessment design:
- Why are we assessing?
- What will be assessed to support this purpose?
- What behaviors will provide evidence?
- What task(s) will elicit the behaviors?
- How is task performance scored?
- How are results interpreted?
Supported by appropriate technology:
- How are the task, performance scoring, and interpretation implemented and delivered?
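One way to make these design questions concrete is to treat them as fields of a machine-readable assessment specification. The sketch below is a minimal, hypothetical illustration; the field names and example values are assumptions, not part of the presentation.

```python
from dataclasses import dataclass

@dataclass
class AssessmentSpec:
    """Hypothetical container mirroring the design questions above."""
    purpose: str            # why are we assessing?
    construct: str          # what will be assessed to support this purpose?
    evidence: list[str]     # what behaviors will provide evidence?
    tasks: list[str]        # what task(s) will elicit the behaviors?
    scoring: str            # how is task performance scored?
    interpretation: str     # how are results interpreted?
    delivery: str           # how are task, scoring, and interpretation delivered?

# Illustrative values only, loosely based on the marksmanship examples later in the deck.
spec = AssessmentSpec(
    purpose="diagnose knowledge and skill gaps",
    construct="rifle marksmanship fundamentals",
    evidence=["sight picture at trigger break", "quality of trigger squeeze"],
    tasks=["simulation-based known distance firing"],
    scoring="automated scoring of clickstream and sensor data",
    interpretation="Bayes net diagnosis against a domain ontology",
    delivery="instrumented simulation with real-time feedback",
)
```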

4 Approaches We're Using
- Simulation-based assessment
  - Assessing the product
  - Assessing the process: analyzing the clickstream
- Sensor-based assessment
- Interpretation
(But not the only approaches)

5 Assessing the Product: Evaluation of Shooting Positions

6 Assessing the Product: Link Architecture Planning

7 Assessing the Product: Combat Marksmanship Coaching
As a coach, observe shooters on the firing line.

8 Assessing the Product: Combat Marksmanship Coaching
Fault check: describe what's wrong, or fix it by dragging and dropping to
- fill in missing step(s)
- correct the execution of a step
- correct the sequence of steps
On-screen prompt: "What kind of error did you observe? Sequence / Skipped Step / Wrong Execution >> Choose all that apply <<"
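The three error types in the prompt (sequence, skipped step, wrong execution) can be detected automatically by comparing the observed steps against a reference procedure. The sketch below is only an illustration of that idea; the function, step names, and scoring rules are assumptions, not the logic actually used in the coaching task.

```python
def classify_errors(reference, observed):
    """Compare observed steps, given as (name, executed_correctly) pairs,
    to a reference step sequence and flag the three error types."""
    errors = set()
    observed_names = [name for name, _ in observed]

    # Skipped step: a reference step never appears in the observation.
    if any(step not in observed_names for step in reference):
        errors.add("skipped step")

    # Sequence error: the observed steps appear in a different relative order.
    common = [s for s in observed_names if s in reference]
    if common != sorted(common, key=reference.index):
        errors.add("sequence")

    # Wrong execution: a step was attempted but performed incorrectly.
    if any(not ok for _, ok in observed):
        errors.add("wrong execution")

    return errors

reference = ["check safety", "assume position", "acquire sight picture", "squeeze trigger"]
observed = [("check safety", True), ("acquire sight picture", True), ("assume position", False)]
print(classify_errors(reference, observed))
# e.g. {'skipped step', 'sequence', 'wrong execution'} (set order may vary)
```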

9 Assessing the Process
Assessing the product misses the processes used, and these potential benefits:
- Verifying expected problem-solving behaviors
- Explaining performance differences between subgroups
- Supporting task validation
- Enabling individualized instruction

10 Assessing the Process
Or, if process data are collected, the usual methods can be expensive, may be unreliable, and are not real time:
- Think-alouds
- Video
- Observation
But...

11 Assessing the Process
Simulation-based assessments can do more than deliver the task. They can:
- Unobtrusively measure processes by analyzing the clickstream
- Do it in real time
- Interpret the results in real time to enable on-the-fly diagnosis of knowledge and skill gaps to individualize instruction
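A minimal sketch of what unobtrusive clickstream capture with on-the-fly interpretation might look like. The event names, the single diagnostic rule, and the class itself are assumptions for illustration; the actual CRESST scoring models are not shown in the slides.

```python
import time
from collections import deque

class ClickstreamMonitor:
    """Accumulates timestamped UI events and applies a simple real-time rule."""

    def __init__(self, expected_first_action="check_threat_information"):
        self.events = deque()
        self.expected_first_action = expected_first_action  # assumed diagnostic rule

    def log(self, action, **details):
        """Record one UI event and immediately re-run the diagnosis."""
        self.events.append({"t": time.time(), "action": action, **details})
        return self.diagnose()

    def diagnose(self):
        """Toy on-the-fly diagnosis: did the learner gather information
        before placing assets? Real systems would use far richer models."""
        actions = [e["action"] for e in self.events]
        if "place_asset" in actions and self.expected_first_action not in actions:
            return "possible gap: acting before checking threat information"
        return None

monitor = ClickstreamMonitor()
monitor.log("open_scenario")
print(monitor.log("place_asset", asset="battery", x=3, y=7))
# -> "possible gap: acting before checking threat information"
```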

12 Assessing the Process: Diagnosis (Known Distance Coaching)

13 Assessing the Process: The Decision Analysis Tool
- Create options
- Adjust parameters

14 Assessing the Process: Air Defense Planning
- Click to check threat information
- Click to view threat/defense geometry
- Place assets (location and role assignment)
- Adjust defense geometry

15 Sensor-Based Assessment
Sensors and the data they provide:
- Pressure sensor on trigger (trigger squeeze)
- Eye tracker (eye position data)
- Motion sensor (muzzle movement)
- Laser strike sensor

16 Sensor-Based Assessment
Neurophysiological measures of cognitive overload and anxiety, plus motor response measures

Metric | Source | Usage
Cognitive overload (neurophysiological) | EEG | Used as an indicator of how well the shooter is processing information and accommodating task demands. Prior work in this area has demonstrated the utility and feasibility of EEG as a measure of cognitive overload.
Anxiety (neurophysiological) | Heartbeat | Used to measure the degree of stress experienced by the shooter.
Trigger break (motor response) | Switch | Used to establish a synchronization point for all measures.
Trigger squeeze (motor response) | Pressure sensor | Used to examine the quality of the trigger squeeze (slow or rapid). Trigger control is considered a fundamental skill in marksmanship.
Muzzle wobble (motor response) | Accelerometer | Used to measure the degree of movement in the muzzle of the weapon.
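Since the trigger-break switch serves as the synchronization point for all measures, one simple treatment is to re-timestamp every sensor stream so that time zero is the trigger break. The sketch below assumes hypothetical stream names and sample values; it is not the actual data pipeline.

```python
def align_to_trigger_break(streams, trigger_break_time):
    """Re-timestamp each sensor stream so t = 0 is the trigger break.

    streams: dict mapping sensor name -> list of (timestamp, value) samples.
    Returns the same structure with timestamps relative to the break.
    """
    return {
        sensor: [(t - trigger_break_time, value) for t, value in samples]
        for sensor, samples in streams.items()
    }

# Hypothetical raw samples (absolute seconds) from three of the sensors in the table.
raw = {
    "eeg": [(10.00, 0.31), (10.05, 0.35), (10.10, 0.52)],
    "trigger_pressure": [(10.02, 0.1), (10.08, 0.9)],
    "muzzle_accelerometer": [(10.00, 0.02), (10.10, 0.07)],
}
aligned = align_to_trigger_break(raw, trigger_break_time=10.08)
print(aligned["trigger_pressure"])  # approximately [(-0.06, 0.1), (0.0, 0.9)]
```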

17 Interpretation
- Modeling using ontologies
- Diagnosis using Bayes nets
- Artificial neural nets (current research)
- Hidden Markov models (current research)
- Prescription based on diagnosis and domain ontologies

18 Interpretation
Domain knowledge: rifle marksmanship

19 Interpretation
A domain ontology: rifle marksmanship
Concepts and relations between concepts, stored in a database
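One minimal way to store "concepts and relations between concepts" is as subject-relation-object triples. The sketch below uses a handful of illustrative marksmanship concepts and relation names that are assumptions, not taken from the actual CRESST ontology.

```python
# Hypothetical triples: (concept, relation, concept)
ontology = [
    ("sight picture", "part_of", "aiming"),
    ("sight alignment", "part_of", "aiming"),
    ("trigger control", "prerequisite_of", "shot group precision"),
    ("breathing control", "influences", "muzzle wobble"),
]

def related(concept, relation=None):
    """Return triples involving the given concept, optionally filtered by relation type."""
    return [(s, r, o) for s, r, o in ontology
            if (s == concept or o == concept) and (relation is None or r == relation)]

print(related("aiming", relation="part_of"))
# [('sight picture', 'part_of', 'aiming'), ('sight alignment', 'part_of', 'aiming')]
```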

20 Diagnosis / Remediation System (architecture diagram)
Components: Model of Knowledge Dependencies and Performance Dependencies; Recommender; Domain Ontology
Measures from the instrumented indoor weapon system: knowledge, fine motor processes, performance (shot group accuracy and precision), anxiety, background
Outputs: knowledge and skill gaps; remediation content
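A skeletal sketch of how the diagram's components might fit together: measures feed a diagnostic model, and a recommender maps diagnosed gaps to content drawn from the domain ontology. The gap names, thresholds, and content mappings below are invented for illustration.

```python
# Hypothetical mapping from diagnosed gaps to remediation content,
# standing in for lookups against the domain ontology.
REMEDIATION = {
    "trigger control": ["video: slow steady squeeze", "drill: dry-fire practice"],
    "sight picture": ["diagram: correct sight picture", "concept: front sight focus"],
}

def diagnose(measures):
    """Toy diagnosis: flag a skill gap when its (0-1) measure falls below a threshold."""
    return [skill for skill, score in measures.items() if score < 0.6]

def recommend(gaps):
    """Recommender: look up remediation content for each diagnosed gap."""
    return {gap: REMEDIATION.get(gap, ["refer to coach"]) for gap in gaps}

measures = {"trigger control": 0.4, "sight picture": 0.8, "breathing control": 0.55}
print(recommend(diagnose(measures)))
# {'trigger control': ['video: ...', 'drill: ...'], 'breathing control': ['refer to coach']}
```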

21 Bayes Net Fragment: Sight Picture at Trigger Break
Evidence sources shown in the fragment: target, trigger sensor, self-ratings against ideal position, eye tracker, motion sensor
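A self-contained sketch of the kind of inference such a fragment performs: a hidden node (good sight picture at trigger break) is updated from noisy evidence. The structure follows the slide's idea, but the probabilities, the two-evidence simplification, and the conditional-independence assumption are illustrative; the actual CRESST models use full Bayes nets over many nodes.

```python
def posterior_good_sight_picture(eye_on_target, muzzle_steady,
                                 prior=0.5,
                                 p_eye_given=(0.9, 0.3),      # P(eye on target | good), P(... | poor)
                                 p_steady_given=(0.85, 0.35)): # P(muzzle steady | good), P(... | poor)
    """Posterior P(good sight picture | eye tracker, motion sensor evidence),
    assuming the two evidence sources are conditionally independent given the skill."""
    def likelihood(good):
        p_eye = p_eye_given[0] if good else p_eye_given[1]
        p_steady = p_steady_given[0] if good else p_steady_given[1]
        le = p_eye if eye_on_target else 1 - p_eye
        ls = p_steady if muzzle_steady else 1 - p_steady
        return le * ls

    num = prior * likelihood(True)
    den = num + (1 - prior) * likelihood(False)
    return num / den

print(round(posterior_good_sight_picture(eye_on_target=True, muzzle_steady=False), 3))
# 0.409: the missing steady-muzzle evidence pulls belief below the 0.5 prior
```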

22 Example Remediation
- Feedback (knowledge of results)
- Definitions, concepts, procedures
- Pictures, videos, and audio

23 Some Issues for Discussion
- Are these appropriate uses of technology for assessment?
- Can you model student knowledge in a domain? Interaction of complexity and examinee experience
- Are clickstream data the outcome of meaningful cognitive events?
- Alternative methods for analyzing and interpreting complex data
- Construct-irrelevant variance, e.g., the technology
- The cost
- Equivalence of tasks
- Level of fidelity

24 ©2007 Regents of the University of California