Carl Wieman, Assoc. Director for Science, White House Office of Science and Technology Policy — Measuring Impact in STEM Ed: Are they thinking like experts?

Presentation transcript:

Carl Wieman, Assoc. Director for Science, White House Office of Science and Technology Policy — Measuring Impact in STEM Ed: Are they thinking like experts?

The White House perspective. "Maintaining our leadership in research and technology is crucial to America's success. But if we want to win the future – if we want innovation to produce jobs in America and not overseas – then we also have to win the race to educate our kids." – B. Obama
Major policy questions: What is effective teaching, particularly in STEM? Can it be developed? How? How can we achieve better learning? (evidence!)

Switching hats to science education researcher. What is the broad goal of your project? → how to measure it. What is the learning that matters to you? (30 s) A bunch of facts & solution techniques? May be useful, but students use a tiny fraction of what they learn in school, and in their careers need vastly more than they learn in school. We want them to understand _____! [DNA, relativity, pH…] What does "understand" mean? How do we measure whether it is achieved? Think about and use ____ like a scientist/engineer.

“Think like a scientist/engineer.” I. What does that mean? Expert thinking (cog. psych.) II. Development of expert thinking III. More details on expert thinking IV. Measuring -- developing tools

Cognitive psychology, brain research, and science classroom studies: major advances in the past 1-2 decades give a consistent picture. Achieving learning → principles of learning help design experiments and make sense of results. Understand both what and why.

Expert competence = factual knowledge? Or…? Expert competence research* (historians, scientists, chess players, doctors, ...):
– factual knowledge
– mental organizational framework (patterns, relationships, scientific concepts) → retrieval and application
– ability to monitor one's own thinking and learning ("Do I understand this? How can I check?")
These are new ways of thinking -- they require MANY hours of intense practice to develop.
*Cambridge Handbook of Expertise and Expert Performance

Brief digression on research on the development of expertise. Developing expertise means significantly changing the brain, not just adding bits of knowledge: building proteins, growing neurons, enhancing neuron connections, ...

Essential element of developing expertise*: "deliberate practice" (A. Ericsson) -- a task at a challenging but achievable level that requires explicit expert-like thinking, intensely engaged reflection, and guidance on the result; repeat & repeat & repeat, ... 10,000 hours later -- very high-level expertise. A different brain, developed with "exercise."
*accurate, readable summary in "Talent Is Overrated" by G. Colvin
CEW interpretation -- "formative assessment", "constructivism", and "self-regulated learning" are all contained in the "deliberate practice" framework.

“Think like a scientist/engineer.” I. What does that mean? Expert thinking (cog. psych.) II. Development of expert thinking III. More details on expert thinking IV. Measuring -- developing tools

How experts solve a problem -- "cognitive task analysis" (and how it differs from non-experts). See "How Scientists Think in the Real World: Implications for Science Education", K. Dunbar, Journal of Applied Developmental Psychology 21(1): 49–
– concepts and mental models (analogies); testing these and recognizing when they apply or not
– distinguishing relevant & irrelevant information
– established criteria for checking the suitability of a solution method or final answer ("sense-making and self-checking")
What are these features in your discipline? (1 min)

Expert thinking involves lots of complex pattern recognition: which features and relationships are important? Which are not? ("surface features" vs. "underlying structure") One often hears: "Novice problem solvers just do pattern matching; experts use more sophisticated concept-based strategies." CEW's unproven claim (not official WH position): it is all pattern matching -- experts just look for and recognize different patterns.

Non-cognitive elements of thinking like a scientist: perceptions/attitudes/beliefs (important, but can be changed more quickly; an essential precursor to "deliberate practice").

Perceptions about science (& how it is learned and used)*
Novice -- Content: isolated pieces of information to be memorized, handed down by an authority, unrelated to the world. Problem solving: simple matching to memorized recipes.
Expert -- Content: coherent structure of concepts that describes nature, established by experiment. Problem solving: systematic concept-based strategies, widely applicable.
Views are consistent across scientists within a discipline (physics, chem, bio).
*adapted from D. Hammer

Student Perceptions/Beliefs. [Charts: distribution of CLASS overall scores (novice ↔ expert), measured at the start of the 1st term of college physics, for all students (N=2800), intended physics majors (N=180), and actual physics majors (N=52); and separately for actual majors who were vs. were not originally intended physics majors.] Kathy Perkins, M. Gratny

Student Beliefs. [Chart: CLASS overall score distributions (novice ↔ expert), measured at the start of the 1st term of college physics, for actual majors who were vs. were not originally intended physics majors.]

Course grade in Phys I or Phys II (beliefs are a more important factor than grades). [Chart: distribution of grades (A, B, C, D, F, W) in the 1st term of college physics; average grade: all students 2.7/4, intended majors 2.7/4, actual majors 3.0/4.]

Creating tests to measure expert thinking as distinct from non-expert thinking (technical details).
A. Cognitive. Things to look for: What mental models? How do they make decisions? What resources are called upon (or not)?
Must understand student thinking! No substitute for interviews. Cognitive interview -- think-aloud solution to a task. Look for consistent features that appear. Code the interviews and have independent coding to make it objective. (BEWARE CONFIRMATION BIAS!)
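The "independent coding" step can be checked quantitatively with an inter-rater agreement statistic. A minimal sketch (not from the talk) of Cohen's kappa for two coders labeling the same think-aloud segments; the category names and labels are hypothetical.

    from collections import Counter

    def cohens_kappa(codes_a, codes_b):
        """Agreement between two independent coders, corrected for chance agreement."""
        assert len(codes_a) == len(codes_b)
        n = len(codes_a)
        observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
        # Chance agreement: product of each coder's marginal frequency per category.
        freq_a, freq_b = Counter(codes_a), Counter(codes_b)
        categories = set(freq_a) | set(freq_b)
        expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
        return (observed - expected) / (1 - expected)

    # Hypothetical codes assigned to 8 interview segments by two coders.
    coder1 = ["model", "plug-and-chug", "model", "check", "model", "check", "plug-and-chug", "model"]
    coder2 = ["model", "plug-and-chug", "check", "check", "model", "check", "plug-and-chug", "model"]
    print(round(cohens_kappa(coder1, coder2), 2))  # ~0.81, i.e. good agreement beyond chance

A low kappa is one objective warning sign that the coding scheme (or confirmation bias) needs another look.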

Creating tests to measure expert thinking as distinct from non-expert thinking. Example -- testing use of an expert mental model:
"troubleshooting": Your laser suddenly put out only half as much light as it had been before. What change may have produced this result?
"redesign": What are all the ways you could double the power coming out of your laser?
You would like to … (e.g. build a bridge across this river). What information do you need to solve this problem?

Steps in test development:
1. Interview faculty.
2. Interview students -- understand student thinking.
3. Open-ended survey questions to probe.
4. Create multiple-choice test -- answer choices reflect actual student thinking.
5. Validation interviews on the test -- experts and sample population.
6. Administer to classes -- run statistical tests on results.
Often iterate and/or skip steps, refine. "Reasonable" data is much better than no data!
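For step 6, a minimal sketch of the kind of classical item statistics often run on multiple-choice results -- item difficulty and point-biserial discrimination against the rest of the score. The data here are invented for illustration; the talk does not specify which statistics were used.

    import numpy as np

    def item_statistics(responses):
        """responses: students x items array of 0/1 scores."""
        responses = np.asarray(responses, dtype=float)
        totals = responses.sum(axis=1)
        stats = []
        for i in range(responses.shape[1]):
            item = responses[:, i]
            difficulty = item.mean()                        # fraction answering correctly
            rest = totals - item                            # total score excluding this item
            discrimination = np.corrcoef(item, rest)[0, 1]  # point-biserial vs. rest score
            stats.append((difficulty, discrimination))
        return stats

    # Hypothetical results: 6 students x 3 items.
    data = [[1, 0, 1],
            [1, 1, 1],
            [0, 0, 1],
            [1, 0, 0],
            [1, 1, 1],
            [0, 0, 0]]
    for i, (p, r) in enumerate(item_statistics(data), start=1):
        print(f"item {i}: difficulty={p:.2f}, discrimination={r:.2f}")

Items that nearly everyone gets right (or wrong), or that correlate poorly with the rest of the test, are candidates for revision in the next iteration.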

Measuring perceptions: same basic approach. Interview students, capture perceptions in their own words. Survey for level of agreement: ~40 statements, strongly agree to strongly disagree. Examples:
"Understanding physics basically means being able to recall something you've read or been shown."
"I do not expect physics equations to help my understanding of the ideas; they are just for doing calculations."
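A minimal sketch of how such a survey is commonly scored: collapse the 5-point scale to agree/neutral/disagree and report the percentage of statements on which the student matches the expert response. The statement keys and expert answers below are placeholders; the actual CLASS scoring procedure has additional detail.

    # Likert responses coded 1 (strongly disagree) ... 5 (strongly agree).
    EXPERT_RESPONSE = {                   # hypothetical scoring key
        "recall_only": "disagree",        # experts disagree that physics is just recall
        "equations_no_meaning": "disagree",
        "concepts_connect": "agree",
    }

    def collapse(score):
        return "agree" if score >= 4 else "disagree" if score <= 2 else "neutral"

    def percent_expert_like(student_scores):
        graded = [collapse(student_scores[s]) == e for s, e in EXPERT_RESPONSE.items()]
        return 100 * sum(graded) / len(graded)

    print(round(percent_expert_like(
        {"recall_only": 2, "equations_no_meaning": 4, "concepts_connect": 5}), 1))
    # -> 66.7: expert-like on 2 of the 3 statements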

Conclusion: "Thinking like a scientist" is an important educational goal. It requires careful analysis to make explicit and to distinguish from the thinking of non-experts. There is a straightforward process to create tests that measure it -- more sensitive and meaningful than typical exams. See: Development and validation of instruments to measure learning of expert-like thinking, W. Adams and C. Wieman, Int. J. Sci. Ed. (in press). Covers the last part of the talk and the technical details.

Tips for developing assessment tools:
1. Interview the largest possible range of people -- patterns and expert-novice differences are more obvious.
2. Student classes in a large university don't vary year-to-year -- a good way to get test-retest reliability and find out whether you can measure changes.
3. Best questions: a) measure an important aspect of student thinking and learning; b) measure an aspect that instructors care about & are shocked at the poor result.
4. Hard and not so useful to measure expert-like thinking on everything. Sample as a proxy.

Key elements of a good concept inventory:
– created by physicists; key concepts where student failure is shocking (not probed by standard exams)
– easy to administer pre & post → learning from this course
– a set of hard-to-learn topics (not everything), a proxy for broader learning (mastery & application of concepts)
– suitable for use with a wide range of institutions and students

How to administer?
Attitude surveys -- online, 1st and last week of class; small bonus mark for completion ( %).
Concept inventories -- Pre: in class, 1st week; paper, scantron; students do not keep the test. Post: in class, last week ("guide to in-class review and study for the final exam"); no effect on course mark; occasional question on the final. 90+ %.

Summary: data to drive educational improvement. Requirements:
– measure value added (pre-post)
– easy to use (more important than perfection)
– test "expert thinking" of obvious value to the instructor
– validated (measures what is claimed)
– need many such instruments to use across the curriculum (collaborate)
Instruments & research papers: class.colorado.edu, CWSEI.ubc.ca

Measuring conceptual mastery. Force Concept Inventory -- basic concepts of force and motion, 1st-semester university physics, simple real-world applications. Ask at the start and end of the semester -- what % was learned? (100's of courses) [Chart: fraction of unknown basic concepts learned, average learned per course, 16 traditional lecture courses vs. improved methods.] On average students learn <30% of the concepts they did not already know. Lecturer quality, class size, institution, ... doesn't matter! Similar data for conceptual learning in other courses. R. Hake, "…A six-thousand-student survey…", AJP 66 ('98).
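The "fraction of unknown basic concepts learned" is normally computed as Hake's normalized gain; a minimal sketch, with illustrative (not actual) pre/post class averages in percent:

    def normalized_gain(pre_percent, post_percent):
        """Hake's <g>: fraction of initially missed concepts learned during the course."""
        return (post_percent - pre_percent) / (100 - pre_percent)

    # e.g. a class averaging 45% on the FCI pre-test and 60% on the post-test
    print(normalized_gain(45, 60))  # ~0.27, consistent with the <30% typical of traditional lectures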

In nearly all intro classes, the average belief score shifts to be 5-10% less like a scientist. Explicit connection with real life → ~0% change. Also emphasizing process (modeling) → +10%!! (new result)

What every teacher should know: components of effective teaching/learning apply to all levels, all settings.
1. Motivation (lots of research)
2. Connect with prior thinking
3. Apply what is known about memory: a. short-term limitations (relevant to you); b. achieving long-term retention -- retrieval and application, repeated & spaced in time
*4. Explicit authentic practice of expert thinking. Extended & strenuous.
(basic cognitive & emotional psychology, diversity)

Measuring student (dis)engagement (Erin Lane). Watch a random sample group (10-15 students). Check against a list of disengagement behaviors every 2 min. [Chart: disengagement vs. time (minutes); example data from an earth science course.]
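A minimal sketch of turning such an observation protocol into a time series: every 2 minutes, count how many of the sampled students show any listed disengagement behavior and report the fraction. The sample size and counts below are invented for illustration.

    # Each entry: (minute into class, number of sampled students showing a disengagement behavior).
    SAMPLE_SIZE = 12  # students in the randomly chosen observation group
    observations = [(0, 1), (2, 1), (4, 3), (6, 5), (8, 7), (10, 4), (12, 2)]

    for minute, disengaged in observations:
        fraction = disengaged / SAMPLE_SIZE
        print(f"t={minute:>2} min: {fraction:.0%} disengaged")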

Design principles for classroom instruction (DP = deliberate practice):
1. Move simple information transfer out of class. Save class time for active thinking and feedback.
2. "Cognitive task analysis" -- how does an expert think about these problems?
3. Fill class time with problems and questions that call for explicit expert thinking, address novice difficulties, are challenging but doable, and are motivating.
4. Frequent specific feedback to guide thinking.