Leading (and Assessing) a Learning Intervention IMPACT Lunch and Learn Session August 6, 2014 Facilitated By Ozgur Ekmekci, EdD Interim Chair, Department of Clinical Research and Leadership, The George Washington University


Leading (and Assessing) a Learning Intervention IMPACT Lunch and Learn Session August 6, 2014 Facilitated By Ozgur Ekmekci, EdD Interim Chair, Department of Clinical Research and Leadership School of Medicine and Health Sciences The George Washington University 1

Overview: Leading a Learning Intervention
In today's complex social environment, anyone who has been assigned the task of leading a learning initiative has to approach the process systematically in order to succeed. The crucial success factor is often creating alignment among the components that constitute a well-articulated learning intervention proposal. This session is offered for those who would like to better understand the four key questions that allow educational leaders to effectively envision, design, implement, and assess a meaningful learning intervention, whether that intervention takes place in the classroom or in the boardroom.

Learning Objectives
- Discuss the challenges associated with implementing a learning intervention
- Analyze the major components of an effective learning intervention proposal
- Analyze the four key questions that drive a learning intervention
- Discuss how assessment fits into the larger context of a learning intervention proposal
- Review six major assessment methods used in the health sciences
2

Alignment of Major Components 3

KEY SUCCESS FACTOR: ALIGNMENT

- Learning Community Context: an overview of the learners' location, background, roles, and interests
- The Problem Statement: the current state of the learners, in measurable outcomes, and why this is a problem
- The Statement of Significance: why it is important that the intervention be implemented, and who suffers in what way if it is not
- The Proposed Solution: the envisioned learning objectives, including a statement of tangible, measurable objectives that tie back to the problem statement (i.e., the current state)
- Implementation Time Line: a graphical depiction of the implementation plan, with major milestones along the way
- Evaluation Method(s): how to measure progress against the learning objectives, and how it will be known when the learners finally reach the envisioned future state
- Conclusion: a summary of the key messages with which to leave the audience

Learning Intervention Timeline 4

PHASE I: Define Measurable Learning Objectives
- Individual level: change in behavior, change in perception, change in performance
- Organizational level: change in outcomes
- Societal level: change in environment
Key question: What do I want to change, for whom, by how much, and by when? In other words, what will success look like?

PHASE II: Plan Learning Intervention
- Identify learners
- Select content
- Develop materials
- Design delivery method
- Construct learning environment
- Develop delivery schedule
Key question: How will I change what, for whom, by how much, and by when? In other words, how will I achieve success?

PHASE III: Design Learning Evaluation
- Identify the key variables to be measured at the appropriate level of analysis (i.e., individual, organizational, societal)
- Decide on the most appropriate method(s) for conducting formative and summative assessments that measure the key variables
- Determine the frequency and timing of formative and summative assessments
- Plan the delivery format, medium, frequency, and timing of feedback to learners
- Select quantitative statistical methods that will help determine the effectiveness of the intervention
Key question: How will I know what I am changing, for whom, by how much, and by when? In other words, how will I monitor progress against the set objectives that define success?

PHASE IV: Implement Learning Intervention and Conduct Learning Evaluation
- Deliver content
- Assess learning
- Provide feedback to learners
- Analyze data using statistical methods
- Determine the effectiveness of the learning intervention
- Declare success!
Key question: Has what I wanted to change, for those whom I wanted to change it, actually changed by the amount I had wanted it to change?

[Flowchart: Learning Intervention Timeline. Nodes include START; Deliver Content; Conduct CAT; Conduct Formative Assessment; Provide Learner Feedback; Review and Revise Content, Method, or Schedule; Delivery Complete?; Conduct Summative Assessment; Determine Effectiveness of Intervention; STOP.]

Ekmekci, O. (2013). Being there: Establishing instructor presence in an online learning environment. Higher Education Studies, 3(1).
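Phase IV's "analyze data using statistical methods" step can be sketched as a simple pre/post comparison. The scores below are invented for illustration, and the hand-rolled paired t-test is only one of many methods a leader might have selected in Phase III:

```python
import math
import statistics

# Hypothetical pre- and post-intervention scores for six learners
# (invented data, for illustration only).
pre = [62, 70, 55, 68, 74, 60]
post = [75, 78, 66, 72, 80, 71]

# Paired differences: how much each learner changed.
diffs = [b - a for a, b in zip(pre, post)]

# Paired t-statistic: mean difference divided by its standard error.
mean_d = statistics.mean(diffs)
sd_d = statistics.stdev(diffs)
t = mean_d / (sd_d / math.sqrt(len(diffs)))

print(f"mean gain = {mean_d:.2f}, t = {t:.2f} (df = {len(diffs) - 1})")
```

With df = 5, a |t| above roughly 2.57 would be significant at the 0.05 level (two-tailed); in practice one would reach for a statistics package (e.g., scipy.stats.ttest_rel) rather than computing this by hand.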

Reflecting on Four Key Questions 5

PHASE I: Define Measurable Learning Objectives
What do I want to change, for whom, by how much, and by when? In other words, what will success look like?
By asking this question, the leader will be able to contemplate and quantify the envisioned type of change, its scope (in terms of breadth, depth, and timeline), and the level at which change is to take place (i.e., individual, organizational, or societal), which will help the leader monitor progress against objectives.

PHASE II: Plan Learning Initiative
How will I change what, for whom, by how much, and by when? In other words, how will I achieve success?
By asking this question, the leader will more easily identify participants, select content, develop materials, design the delivery method, construct the learning environment, and develop the delivery schedule.

PHASE III: Design Learning Initiative Evaluation
How will I know what I am changing, for whom, by how much, and by when? In other words, how will I monitor progress against the set objectives that define success?
By asking this question, the leader will be able to identify the key variables to be measured; decide on methods and frequency for formative and summative assessment; plan the delivery format, medium, frequency, and timing of feedback to participants; and select statistical methods that will help determine the effectiveness of the initiative.

PHASE IV: Implement Learning Initiative and Conduct Evaluation
Has what I wanted to change, for those whom I wanted to change it, actually changed by the amount I had wanted it to change?
By asking this question, the leader will be able to effectively deliver content, assess progress, provide feedback to participants, and analyze data using statistical methods to ultimately determine how successful the initiative has been.

Ekmekci, O. (2013). Being there: Establishing instructor presence in an online learning environment. Higher Education Studies, 3(1).
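Phase I's question (what to change, for whom, by how much, and by when) can be made concrete as a small data structure. The field names and example values below are hypothetical, one possible way to encode a measurable objective:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class LearningObjective:
    """One measurable objective: what changes, for whom, by how much, by when."""
    what: str        # the variable to change
    for_whom: str    # the learner population
    baseline: float  # current state (ties back to the problem statement)
    target: float    # envisioned future state
    by_when: date    # deadline for the change
    level: str       # "individual", "organizational", or "societal"

    def met(self, measured: float) -> bool:
        """Success means the measured value reaches the target."""
        return measured >= self.target

# Hypothetical objective, for illustration only.
obj = LearningObjective(
    what="mean score on the unit post-test",
    for_whom="first-year clinical research students",
    baseline=62.0,
    target=75.0,
    by_when=date(2014, 12, 15),
    level="individual",
)
print(obj.met(78.0))  # a summative measurement of 78 reaches the target of 75
```

Writing objectives this way forces the "by how much" and "by when" to be explicit, which is exactly what Phase III's evaluation design and Phase IV's success check depend on.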

Assessment as a Process 6

Assessment is the ongoing process that allows one to:
- establish clear, measurable expected outcomes of learning;
- ensure learners have sufficient opportunities to achieve those outcomes;
- systematically gather, analyze, and interpret evidence to determine how well learning matches the expected outcomes; and
- use the resulting information to understand and improve learning.

Downing, S.M. & Yudkowsky, R. (Eds.). (2009). Assessment in health professions education. New York: Routledge. p. 4.

Threats to Validity 7

Construct Underrepresentation (CU)
Not measuring the right things: under-sampling of the achievement domain, such as tests that are too short to support legitimate inferences to the domain, trivial questions written at low levels of the cognitive domain, or a mismatch of the sample to the domain.

Construct-Irrelevant Variance (CIV)
Not measuring (the right) things in the right manner: erroneous inflation or deflation of test scores due to certain types of uncontrolled or systematic measurement error, such as poorly crafted test questions, rater bias, or environmental conditions.

Downing, S.M. & Yudkowsky, R. (Eds.). (2009). Assessment in health professions education. New York: Routledge.
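Construct underrepresentation can be screened for mechanically by comparing a test's item counts against its content blueprint. The topics, blueprint shares, and tolerance below are all hypothetical, a minimal sketch of the idea:

```python
# Hypothetical blueprint: the share of items each domain topic should receive.
blueprint = {"pharmacology": 0.40, "physiology": 0.35, "ethics": 0.25}

# Topic tags of the items actually written for the test (invented counts).
items = ["pharmacology"] * 14 + ["physiology"] * 4 + ["ethics"] * 2

def undersampled(blueprint, items, tolerance=0.10):
    """Return topics whose actual share of items falls more than `tolerance`
    below the blueprint share: a construct-underrepresentation flag."""
    n = len(items)
    flags = []
    for topic, share in blueprint.items():
        actual = items.count(topic) / n
        if actual < share - tolerance:
            flags.append(topic)
    return flags

print(undersampled(blueprint, items))
```

Here physiology and ethics are flagged: the achievement domain is dominated by pharmacology items, so inferences to the whole domain would be suspect even if every individual item were well written.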

Levels of Assessment and Matching Methods 8

The levels below run from cognition (KNOWS) up to behavior (DOES):

DOES: requires methods that provide an assessment of routine "real" performance by the learner.
SHOWS HOW: requires methods that allow the learner to demonstrate the integration of knowledge and skills that will translate into successful routine performance.
KNOWS HOW: requires methods that provide an assessment of how well the learner uses knowledge in the acquisition, analysis, and interpretation of data and in the development of an approach or plan.
KNOWS: requires methods that provide an assessment of how well the learner has acquired the fundamental knowledge on which to build higher levels of knowledge and behavior.

Matching methods: Written Test, Oral Examination, Observational Assessment, Performance Test, Simulation, Portfolio.

Adapted from: Downing, S.M. & Yudkowsky, R. (Eds.). (2009). Assessment in health professions education. New York: Routledge.

Wrap Up 9 Questions?