General Level Structure

Similar presentations
Test Development.

Science Breakout New Teacher Meeting 6, Year 2 March 31, 2011.
RSBM Business School Research in the real world: the user's dilemma Dr Gill Green.
Crosscutting Concepts and Disciplinary Core Ideas February 24, 2012 Heidi Schweingruber Deputy Director, Board on Science Education, NRC/NAS.
Teaching Experiments and a Carbon Cycle Learning Progression 2009 AERA Presentation Written by: Lindsey Mohan and Andy Anderson (Michigan State University)
Classroom Assessments Checklists, Rating Scales, and Rubrics
Karen Draney (University of California, Berkeley) Lindsey Mohan (Michigan State University) Philip Piety (University of Michigan) Jinnie Choi (University.
Learning Progressions: A Discussion Ravit Golan Duncan Rutgers University.
Learning Progressions: Some Thoughts About What we do With and About Them Jim Pellegrino University of Illinois at Chicago.
Science This introductory science course is a prerequisite to other science courses offered at Harrison Trimble. Text: Nelson, Science 10 Prerequisite:
Learning Progressions Immersion Activity Power point presented to teachers during professional development to help teachers learn about learning progressions.
Crosscutting Concepts Next Generation Science Standards.
This research is supported in part by three grants from the National Science Foundation: Developing a research-based learning progression for the role.
Using empirical feedback to develop a learning progression in science Karen Draney University of California, Berkeley.
Assessing K-12 Students’ Learning Progression of Carbon Cycling Using Different Types of Items 2010 NARST Presentation Written by: Jing Chen and Charles.
A comparison study on American and Chinese secondary students’ learning progression for carbon cycling in socio-ecological systems 2009 AERA Presentation.
ENVIRONMENTAL LITERACY PROJECT This research is supported in part by grants from the National Science Foundation: Developing a Research-based Learning.
Analyzing students’ learning performances in terms of practices for developing accounts Hui Jin, Jiwon Kim and Charles W. Anderson.
LEARNING PROGRESSIONS TOWARD ENVIRONMENTAL LITERACY Charles W. Anderson, Lindsey Mohan, Hui Jin, Jing Chen, Phil Piety, Hsin-Yuan Chen Karen Draney, Jinnie.
This research is supported in part by grants from the National Science Foundation: Developing a Research-based Learning Progression for the Role of Carbon.
A K-12 LEARNING PROGRESSION TO SUPPORT ENVIRONMENTAL LITERACY MICHIGAN STATE UNIVERSITY Environmental Literacy Research Group.
Chapter 14: Affective Assessment
The Effects of Teaching Materials and Teachers’ Approaches on Student Learning about Carbon- transforming Processes Li Zhan, Dante Cisterna, Jennifer Doherty,
Connections between students’ explanations and interpretations of arguments from evidence Allison L. Freed 1, Jenny M. Dauer 1,2, Jennifer H. Doherty 1,
Sociology 12. Outcomes: analyze a variety of appropriate sociological research methods; describe common sociological research methods.
A K-12 LEARNING PROGRESSION TO SUPPORT UNDERSTANDING OF WATER IN THE ENVIRONMENT Beth Covitt & Kristin Gunckel Geological Society of America, North-Central.
Center for Curriculum Materials in Science AAAS, Michigan State University, Northwestern University, University of Michigan Organizers Andy Anderson, Michigan.
Understanding the AEDI results.
Review of Assessment Toolkit
Molecules and Molecular compounds
Chapter 2 Objectives Describe the purpose of the scientific method.
How are physical and chemical properties different?
VALIDITY by Barli Tambunan/
Principles of Quantitative Research
Long Term Ecological Research Math Science Partnership
By: Beni Setiawan, Wahyu Budi Sabtiawan
Unified Modeling Language
MODERNIZING ECOLOGY CONTENT IN THE REQUIRED K-12 SCIENCE CURRICULUM:
Teaching and Educational Psychology
Hui Jin, Li Zhan, Charles W. Anderson Michigan State University
Improvement 101 Learning Series
American and Chinese Secondary Students’ Written Accounts of Carbon Cycling in Socio-ecological Systems Jing Chen1, Charles W. Anderson1, & Xinghua Jin2.
© 2012 The McGraw-Hill Companies, Inc.
(Michigan State University)
Teaching Experiments and a Carbon Cycle Learning Progression
Jennifer Doherty, Karen Draney and Andy Anderson
Validation of a Multi-Year Carbon Cycle Learning Progression
Sociology Outcomes Assessment
THE NATURE OF SCIENCE.
Learning Progressions in Environmental Science Literacy
An Introduction to Chemistry Chapter 1
Writing Criterion Referenced Assessment Criteria and Standards
HS Physical Science Spring 2017
Key Ideas How do scientists explore the world?
PERCENTILE What is percentile? Percentile (or centile): the value of a variable below which a certain percent of observations fall.
Developing Quality Assessments
2009 AERA Annual Meeting, San Diego
Partnership Moderation Report,
Animals Unit Activity 2.4: Questions about Animals
Activity 2.4: Questions about Plants
Presentation transcript:

Developing progress variables and learning progressions for the Carbon Cycle: The BEAR Assessment System
Karen Draney, Mark Wilson, Jinnie Choi, and Yongsang Lee, UC Berkeley

Progress variables
A progress variable represents a cognitive theory of learning that is consistent with a developmental perspective. It is grounded in the principle that assessments should be designed with a developmental view of student learning: the underlying purpose of assessment is to determine how students are progressing from less expert to more expert in the domain of interest, rather than to measure competence only after learning activities have been completed. With a progress variable, we seek to describe a continuum of qualitatively different levels of knowledge, from a relatively naïve level to a more expert one.

Carbon Cycle progress variables
Structure of Systems
Tracing Matter
Tracing Energy
Citizenship
Change over Time

General Level Structure
To define the levels of the progress variables, we started with a description of the types of reasoning necessary to function at a high level of environmental literacy. In addition, we examined both the previous literature and written and interview-based accounts from elementary, middle school, and high school students to define further levels. The lowest levels of reasoning, generally used by middle to upper elementary school students, we refer to as the lower anchor; it is marked by a lack of awareness of various “invisible” mechanisms (microscopic and atomic-molecular structure, large-scale structure, gases, etc.), by reliance on the senses rather than data, and by narrative rather than model-based reasoning. As the upper anchor, we selected the highest levels of performance seen in high school students after completing the relevant science units.

Outcome space
The “outcome space” refers to the mapping of all possible responses to an assessment item onto the levels of a progress variable. The outcome space must be exhaustive (every student response must be mappable) and finite. A careful study of this mapping has engaged our group for many meetings. To ensure the reliability of our scoring methods and the stability of our variable structure, we have had teams at both Michigan State University and Berkeley score the same sets of student responses to many of our assessment items, with all differences resolved through discussion. In addition, we have attempted to select from our data an “exemplary” student response for each item at every scoring level of the variable(s) with which it is associated, and to annotate why it is exemplary.
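As a minimal sketch of how such double-scoring might be checked quantitatively, the snippet below computes a chance-corrected agreement statistic (Cohen's kappa); the choice of statistic and the ten level scores are invented for illustration and are not the project's data or reported analysis.

    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        """Chance-corrected agreement between two raters scoring the same responses."""
        n = len(rater_a)
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        freq_a, freq_b = Counter(rater_a), Counter(rater_b)
        # Agreement expected if each team assigned its own mix of levels at random.
        expected = sum((freq_a[lvl] / n) * (freq_b[lvl] / n)
                       for lvl in set(freq_a) | set(freq_b))
        return (observed - expected) / (1 - expected)

    # Hypothetical scores (levels 1-4) from each team for the same ten responses.
    msu      = [2, 3, 2, 4, 1, 2, 3, 3, 2, 4]
    berkeley = [2, 3, 2, 3, 1, 2, 3, 2, 2, 4]
    print(f"kappa = {cohens_kappa(msu, berkeley):.2f}")  # kappa = 0.71

In practice, disagreements such as the burning-match case below were resolved through discussion rather than by a statistic alone.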
Current Work
Tracing Matter (TM) item scoring issue: BURNING A MATCH
Q: What happens to the wood of a match as the match burns? Why does the match lose weight as it burns?

Students’ actual responses:
“It turns to ash. The heat makes it lose weight.”
“It loses weight because if the fire goes on it, It burns it and turns into ashes.”
“Because wood burns into ash, takes out the water and sap.”
“It becomes ashes. Ashes don’t weigh anything.”

UCB: Level 2. These could be simple descriptions of visible events. Students might simply mention ashes without recognizing a chemical reaction, so we should not assume that these responses reflect recognition of a chemical reaction. Ashes are macroscopic products, however, and students who identify ashes should be distinguished from students who don’t.

MSU: More than Level 3. “Burning into ash” is a chemical reaction, so we can assume that students who give these responses identify a chemical process.

Future Work

Item design
Each item must represent some number of levels of one or more progress variables. Designing new items, and selecting existing items, to represent the variables has occupied much of our time; it is in this process that the progress variables take form.

Measurement model
The use of a formal measurement model to analyze our data allows us to examine our expectations for the assessments empirically. We can verify that items we expect to be particularly easy or difficult in fact are, and we can find and examine item and person performances that are unusual or unexpected.

Graphical representation of item level difficulties and person performance levels for Structure of Systems: each X represents a person’s performance on the collection of items related to Structure of Systems, and the numbers represent scoring levels (from the general level structure above) on individual items (e.g., 5.2 indicates a Level 2 performance on item 5). Less difficult tasks and less proficient persons appear toward the bottom of the map.

We expected that students would use similar levels of reasoning across a wide variety of assessment tasks, leading to a “banding” effect in which the same scoring levels occur together across most or all of the items. In our preliminary analysis of the Structure of Systems data, we have indeed observed this banding.
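To illustrate the kind of item-person (“Wright”) map just described, the sketch below prints a text version; the person proficiencies, item-step difficulties, and one-logit bands are invented values for display only, not output of our measurement model.

    # Invented logit-scale values: person proficiencies, and item-step
    # difficulties labeled "item.level" (e.g., "5.2" = level 2 of item 5).
    persons = [-1.6, -1.1, -0.9, -0.4, -0.3, 0.1, 0.2, 0.5, 0.9, 1.4]
    steps = {"5.2": -1.2, "7.2": -1.0, "5.3": 0.1, "7.3": 0.3,
             "5.4": 1.3, "7.4": 1.5}

    # One row per one-logit band, hardest at the top: each X is a person in
    # the band; the labels are item steps whose difficulty falls in the band.
    for hi in range(2, -3, -1):
        lo = hi - 1
        xs = "X" * sum(lo <= p < hi for p in persons)
        labels = " ".join(s for s, d in sorted(steps.items()) if lo <= d < hi)
        print(f"{hi:>3} | {xs:<5}| {labels}")

On such a map, the banding described above appears as level 2 steps clustering low, level 3 steps in the middle, and level 4 steps high, with persons spread across the bands.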