CRESST ONR/NETC Meetings, 17-18 July 2003, v1. 17 July 2003. ONR Advanced Distributed Learning. Bill Bewley, Allen Munro, Greg Chung, Josh Walker, Girlie Delacruz.


Presentation transcript:

Title slide: The USMC Marksmanship Application
ONR Advanced Distributed Learning, 17 July 2003
Bill Bewley, Greg Chung, Girlie Delacruz (UCLA/CRESST); Allen Munro, Josh Walker (USC/BTL)
© 2003 Regents of the University of California

Slide 1: The KMT Project
The Problem: assessment models and tools are needed to help Navy, Marine, and contractor personnel evaluate, design, and use distributed learning.
Project Goals: develop and test models and tools on real applications.
- Content knowledge: USMC marksmanship (02-03)
- Problem solving: USN EDO (Engineering Duty Officer) training and one other domain (03)

Slide 2: The First Application: HUEY
HUEY: "UNQ to Expert"
- In 2002, about 45% of Marines shot lower than Expert
- About 2% of Marines were unqualified
- About half needed two tries to qualify
The goal: move all Marines to Expert classification.
Classification levels, lowest to highest: Unqualified, Marksman, Sharpshooter, Expert.

Slide 3: The KMT Plan
Assess and remediate potentially unqualified Marines before they reach the firing line, on-line, using USMC and ONR training approaches.
Research Questions:
- What are the critical types of knowledge that affect shooting performance?
- To what extent can cognitively based measures predict USMC rifle shooting performance?

Slide 4: The Payoff
- Save time
- Save money
- Increase shooting scores

Slide 5: What's Wrong With This Picture?

Slide 6: And What Would Cause This Shot Pattern?

Slide 7: Who Cares?
The answers are important if you want to be a good marksman.

Slide 8: Who Cares?
And marksmanship is not easy: a shooter must routinely hit a 19-inch circular area at 500 yards from the prone position.

Slide 9: Who Cares?
A 1/16-inch muzzle deflection will cause a miss of over 2 feet at 500 yards.
500 yards is 1.5 times the distance between the top rows of opposite end zones of the LA Coliseum.
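The slide's figure follows from simple small-angle geometry: a lateral error at the muzzle grows linearly with range. A minimal sketch of that scaling is below; the pivot distance (how far behind the muzzle the rifle effectively pivots) is an assumption chosen for illustration, not a value from the slides.

```python
# Small-angle geometry behind the muzzle-deflection claim: lateral miss at the
# target = (muzzle deflection / pivot distance) * range. The 45-inch pivot
# distance is an illustrative assumption, not a figure from the presentation.

def miss_at_range(muzzle_deflection_in, pivot_distance_in, range_yards):
    """Lateral miss (in inches) at the target for a given muzzle deflection."""
    range_inches = range_yards * 36  # 1 yard = 36 inches
    angle = muzzle_deflection_in / pivot_distance_in  # small-angle approx. (radians)
    return angle * range_inches

# 1/16-inch deflection, assuming the rifle pivots ~45 inches behind the muzzle
miss = miss_at_range(1 / 16, 45.0, 500)
print(f"{miss:.0f} inches ({miss / 12:.1f} feet)")  # about 25 inches, over 2 feet
```

The exact miss depends on the assumed pivot distance, but the linear scaling with range holds regardless: double the range, double the miss.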

Slide 10: What We Did
Field research:
- Knowledge acquisition plus staff expertise
- Develop and pilot test draft assessments
Delivery infrastructure:
- BTL's iRides authoring system
- BTL's Battlesight Zero and Data Book simulations integrated with the CRESST Knowledge Mapper

Slide 11: Variables: The Big Picture
Factors affecting rifle marksmanship performance:
- Perceptual-Motor: steadiness, prior shooting experience*, device-fire performance
- Cognitive: training effects*, aptitude*, knowledge of shooting*
- Affective: confidence, anxiety*, attitudes*
- Equipment: ballistics, rifle characteristics
- Environment: weather, distance
* = attempted to measure in current studies

Slide 12: Marksmanship Inventory Knowledge Assessment
- Evaluates prior knowledge and knowledge transfer of fundamentals instruction
- Paper or online

Slide 13: Marksmanship Knowledge Mapper
Trainees diagram key marksmanship concepts and relationships:
- Fundamentals
- Shot-to-shot explanation
- Data book procedure
Maps are scored against a "doctrine" map produced by Quantico WTB staff.
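The slides do not specify the scoring algorithm. A common approach for scoring knowledge maps of this kind is to count the trainee's propositions (concept-relation-concept triples) that also appear in the expert map; the sketch below uses that approach with invented example triples, not actual WTB doctrine content.

```python
# Hedged sketch of one common knowledge-map scoring scheme: count the trainee's
# propositions (concept, relation, concept triples) that match the expert
# "doctrine" map. All triples below are illustrative examples.

def score_map(trainee_map, doctrine_map):
    """Return (matches, score): matched propositions and the fraction of
    doctrine propositions the trainee reproduced."""
    matches = set(trainee_map) & set(doctrine_map)
    return len(matches), len(matches) / len(doctrine_map)

doctrine = {
    ("sight alignment", "is part of", "sight picture"),
    ("trigger control", "affects", "shot placement"),
    ("breath control", "affects", "steadiness"),
}
trainee = {
    ("sight alignment", "is part of", "sight picture"),
    ("trigger control", "affects", "shot placement"),
    ("stance", "affects", "steadiness"),  # not in the doctrine map: no credit
}
n, frac = score_map(trainee, doctrine)
print(n, round(frac, 2))  # 2 matched propositions, score 0.67
```

Real scoring schemes often add partial credit for near-miss relations or weight propositions by importance; this sketch shows only the exact-match core.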

Slide 14: Mapper: Fundamentals

Slide 15: Mapper: Shot-to-Shot

Slide 16: Mapper: Data Book Procedure

Slide 17: Shot Group Depiction

Slide 18: Evaluation of Shooting Positions
Assess and correct fundamental problems with the shooter's body position and the resulting impact on performance.

Slide 19: Evaluation of Shooting Positions

Slide 20: Affective Measures
- Trait worry about the qualification trial
- Trait anxiety about the qualification trial
- State worry (pre- and post-qualification) about the qualification trial
- State anxiety (pre- and post-qualification) about the qualification trial

Slide 21: Sample Description
- SLR, Stone Bay (Dec 2002): test assessments of marksmanship knowledge and evaluate prediction of qualification score.
- SLR, Quantico (Mar 2003): replicate the study on a second sample.
- ELR (2LT), Quantico (May 2003): replicate the study on second lieutenants undergoing entry-level training.
SLR = Sustainment-Level Rifle Marksmanship; ELR = Entry-Level Rifle Marksmanship

Slide 22: Sample Description (Stone Bay vs. Quantico)
- Phase I training: Stone Bay, mixed (unit-trained or classroom training); Quantico, uniform (all Marines get classroom training)
- Coaches: Stone Bay, frequent rotation; Quantico, professional coaches
- Combat status: Stone Bay, about 30% from infantry units; Quantico, nearly 100% from support units

Slide 23: Prediction of Qualification Score (Regression Model)
Multiple R (R²): SLR-SB .56 (.31); SLR-Q .52 (.27); ELR-2LT .80 (.63)
Model variables:
- SLR-SB: (P-M) most recent score, whether a coach; (COG) time since last Phase I training, prior knowledge, shot-group knowledge, position identification, knowledge map, perceived value of knowledge to performance; (AFF) planning/worry
- SLR-Q: (P-M) most recent score, frequency of shooting outside USMC duties, number of years of shooting experience prior to USMC; (COG) prior knowledge; (AFF) trait worry, trait firing line experience
- ELR-2LT: (COG, post-instruction) prior knowledge, proper position identification (firing hand), ASVAB GCT, perceived level of marksmanship knowledge; (AFF) state worry/anxiety during qualification

Slide 24: Prediction of Qualification Score (Perceptual-Motor vs. Cognitive/Affective)
R (R²) by model:
- Overall (shooting experience, cognitive, affective): SLR-SB .56 (.31); SLR-Q .52 (.27); ELR-2LT .80 (.63)
- Shooting experience only: SLR-SB .43 (.18); SLR-Q .37 (.13); ELR-2LT --
- Cognitive and affective only: SLR-SB .41 (.17); SLR-Q .50 (.25); ELR-2LT --

Slide 25: Prediction of Qualification Score (Knowledge)
Correlations with qualification score (SLR-SB / SLR-Q / ELR-2LT):
- Perceived level of marksmanship knowledge: -- / .26** / .42**
- Prior knowledge: .29** / .16 / .46***
- Knowledge mapping: (p=.07) / -.07 / --
- Shot group depiction: .27** / -- / --
- Evaluation of shooting positions: .20** / (p=.07) / --
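The R (R²) values on these slides come from multiple regression of qualification score on the predictor sets. As a minimal sketch of that kind of analysis, the code below fits an ordinary least squares model with two predictors and reports R²; the data are synthetic, invented purely for illustration, and the resulting value is not one reported in the study.

```python
# Minimal sketch of the analysis behind the regression tables: regress a
# qualification score on several predictors and compute R^2. Data are
# synthetic and illustrative only; this is not the study's data or result.

def ols_r2(y, X):
    """R^2 of OLS regression of y on the columns of X (intercept included),
    solved via normal equations with Gaussian elimination."""
    n = len(y)
    Z = [[1.0] + list(row) for row in X]  # prepend intercept column
    k = len(Z[0])
    # Normal equations A b = c, where A = Z'Z and c = Z'y
    A = [[sum(Z[i][p] * Z[i][q] for i in range(n)) for q in range(k)] for p in range(k)]
    c = [sum(Z[i][p] * y[i] for i in range(n)) for p in range(k)]
    # Gaussian elimination with partial pivoting
    for p in range(k):
        piv = max(range(p, k), key=lambda r: abs(A[r][p]))
        A[p], A[piv] = A[piv], A[p]
        c[p], c[piv] = c[piv], c[p]
        for r in range(p + 1, k):
            f = A[r][p] / A[p][p]
            for q in range(p, k):
                A[r][q] -= f * A[p][q]
            c[r] -= f * c[p]
    # Back substitution for the coefficient vector b
    b = [0.0] * k
    for p in range(k - 1, -1, -1):
        b[p] = (c[p] - sum(A[p][q] * b[q] for q in range(p + 1, k))) / A[p][p]
    yhat = [sum(b[q] * Z[i][q] for q in range(k)) for i in range(n)]
    ybar = sum(y) / n
    ss_res = sum((y[i] - yhat[i]) ** 2 for i in range(n))
    ss_tot = sum((yi - ybar) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

# Synthetic example: score predicted from prior knowledge and state worry
prior = [3, 5, 2, 6, 4, 7, 1, 5]
worry = [4, 2, 5, 1, 3, 1, 6, 2]
score = [190, 210, 185, 220, 200, 225, 180, 212]
r2 = ols_r2(score, list(zip(prior, worry)))
print(f"multiple R = {r2 ** 0.5:.2f}, R^2 = {r2:.2f}")
```

Multiple R is simply the square root of R², which is how the paired values in the tables (e.g. .56 and .31) relate to each other.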

Slide 26: Working Hypotheses
(SLR-SB / SLR-Q / ELR-2LT)
- Overall level of marksmanship experience: L / M / L
- Overall quality of classroom and live-fire training experience: M / H / H
- Skill level placement (learner, practice, automatic): learner/practice / -- / learner

Slide 27: Working Hypotheses
Three stages of skill acquisition: learning, practice, automatic.
- Cognitive measures should be most sensitive for Marines in the beginning to middle of the learning stage, and less sensitive for those past the mid-learning point.
- Psychomotor variables should be most sensitive for Marines past the initial learning stage.

Slide 28: Working Hypotheses (Learner vs. Practice)
- Sensitivity of knowledge measures: L > P
- Correlations among knowledge measures: L > P
- Relationship between measure and shooting score:
  - Perceptual-motor: L < P
  - Aptitude: L > P
  - Knowledge: L > P
  - Affective: L > P
