Assessment and Instructional-Element Analysis in Evidence-based Physics Instruction
David E. Meltzer, Arizona State University
Supported in part by NSF.

Presentation transcript:

Assessment and Instructional-Element Analysis in Evidence-based Physics Instruction
David E. Meltzer, Arizona State University
Supported in part by NSF DUE #

How Are Research-Based Physics Instructional Methods Assessed?
Resource Letter ALIP-1 (Meltzer and Thornton, 2012): compendium of ≈30 research-validated instructional methods/materials in physics
Each method/material examined to determine which instruments and techniques were used to provide evidence of instructional effectiveness

Diagnostic “Instruments”
“Standardized” surveys: ≈20-40 items, usually multiple-choice, qualitative (non-algebraic, non-numerical); Example: FCI (a scoring sketch follows below)
Researcher-constructed free-response questions: qualitative emphasis; fewer than 8 items; may or may not require student explanations; Example: University of Washington assessment items
Instructor-constructed course assessments: quizzes, exams, homework, grades; emphasis on quantitative and algebraic problem-solving
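
One common way data from such standardized surveys are summarized in the literature (not discussed explicitly on this slide) is the class-average normalized gain across pre- and post-instruction administrations. The minimal Python sketch below shows that bookkeeping under the assumption that only class-average percentage scores are available; the numbers are hypothetical and not drawn from this presentation.

```python
# Minimal sketch (illustrative only): class-average normalized gain,
# g = (post - pre) / (100 - pre), for pre/post administrations of a
# multiple-choice survey such as the FCI. Scores are hypothetical.

def normalized_gain(pre_percent: float, post_percent: float) -> float:
    """Normalized gain computed from class-average percentage scores."""
    if pre_percent >= 100.0:
        raise ValueError("Pre-test average is already at ceiling.")
    return (post_percent - pre_percent) / (100.0 - pre_percent)

if __name__ == "__main__":
    pre, post = 45.0, 70.0  # hypothetical class averages (percent correct)
    print(f"Normalized gain: {normalized_gain(pre, post):.2f}")  # -> 0.45
```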

Comparison Groups
Local: compare to similar courses at the home institution that use standard instruction (an effect-size comparison is sketched below)
External: compare to similar courses/student populations at other institutions
National baseline: compare to previously published data reflecting performance in representative equivalent courses at multiple institutions
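
As a concrete illustration of the "local" comparison, the hedged Python sketch below compares post-test scores from a hypothetical research-based section with those from a hypothetical traditionally taught section using a pooled-standard-deviation effect size (Cohen's d). Both the data and the choice of statistic are assumptions made for illustration, not results from the presentation.

```python
# Hedged sketch (hypothetical data): effect size between a research-based
# section and a local comparison section on a common post-test.
from statistics import mean, stdev

def cohens_d(group_a: list, group_b: list) -> float:
    """Cohen's d computed with a pooled standard deviation."""
    n_a, n_b = len(group_a), len(group_b)
    pooled_var = ((n_a - 1) * stdev(group_a) ** 2 +
                  (n_b - 1) * stdev(group_b) ** 2) / (n_a + n_b - 2)
    return (mean(group_a) - mean(group_b)) / pooled_var ** 0.5

# Hypothetical post-test percentages for two sections of the same course.
reformed = [72, 68, 81, 77, 69, 74, 80, 66]
traditional = [61, 58, 70, 64, 55, 67, 63, 60]
print(f"Cohen's d: {cohens_d(reformed, traditional):.2f}")
```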

Survey Instruments
Frequently used:
– Force Concept Inventory (FCI)
– Force and Motion Conceptual Evaluation (FMCE)
Occasionally used:
– Conceptual Survey in Electricity and Magnetism (CSEM)
– Brief Electricity and Magnetism Assessment (BEMA)
Rarely used:
– Electric Circuits Conceptual Evaluation (ECCE)
– Mechanics Baseline Test (MBT)
– Colorado Upper-Division Electrostatics Diagnostic Quiz [non-MC] (CUE)
– Quantum Mechanics Visualization Instrument (QMVI)
– Quantum Mechanics Assessment Tool [non-MC] (QMAT)
– Quantum Mechanics Conceptual Survey (QMCS)

Examples of Researcher-Constructed Free-Response Items
University of Washington assessment items (published in AJP)
University of Minnesota Context-Rich Problems (examples in AJP, others available in on-line compendium)

Other Outcomes Assessed
Attitudes and beliefs (e.g., student ideas about how best to learn physics)
Facility with physics practices (e.g., ability to design and execute an experiment): rarely assessed
– Example: Etkina and Van Heuvelen (2007)

Issues of Concern
Most assessments are done via multiple-choice survey instruments:
– (relatively) easy to implement
– limited insight into student thinking: responses are imprecise and lack explanations
– limited coverage of the instructional intervention (narrow scope of topics)
Most non-survey assessments are unpublished
Most comparison groups are limited in number and generalizability
Most components of each collection of materials go unassessed
– Exception: University of Washington, where the majority of the UW Tutorials undergo an extensive cycle of iterative assessment and validation

Assessment of Instructional “Elements”
What is the relative impact of various instructional elements, such as:
– use/non-use of physical objects
– size and composition of student groups
– frequency and method of feedback provided
– homework
– TA preparation
– classroom discussion format
Rarely assessed, due to logistical challenges (one possible regression-style analysis is sketched below)
– Notable exceptions: Koenig, Endorf, and Braun (2007); Morote and Pritchard (2009)
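
One way the relative impact of such elements could in principle be probed, sketched below purely as an assumption-laden illustration (not a method taken from the cited studies), is to regress a class-level outcome on indicator variables for each element across many sections. The design matrix and outcomes are invented.

```python
# Hedged sketch (invented data): regressing a class-level outcome on
# indicators for two instructional "elements" across several sections.
import numpy as np

# Columns: intercept, small groups used (0/1), weekly feedback (0/1).
X = np.array([
    [1, 0, 0],
    [1, 1, 0],
    [1, 0, 1],
    [1, 1, 1],
    [1, 1, 0],
    [1, 0, 1],
], dtype=float)
# Hypothetical class-average normalized gains for the six sections.
y = np.array([0.22, 0.35, 0.30, 0.48, 0.33, 0.28])

coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(dict(zip(["baseline", "small_groups", "weekly_feedback"], coef.round(3))))
```

With only six invented sections this is far from a credible analysis; the point is just the bookkeeping of separating elements, which helps explain why, as the slide notes, such studies are rarely attempted.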

Summary Questions
What assessment methods and materials yield optimum information regarding the effectiveness of student learning?
What assessment methods have the greatest potential to persuade skeptical instructors of the value of novel instructional methods?
What are reasonable tradeoffs between constraints on time and resources vs. the scope, validity, and reliability of assessments?