Connecting Cognition and Assessment: Concepts & Directions Derived from NRC Research Reports
Jim Pellegrino, University of Illinois at Chicago
The Problem
States face several critical decisions about how their assessment systems should be designed and implemented to respond effectively to the multiple provisions of No Child Left Behind. These include defining the appropriate targets for assessment relative to content standards, and determining how those targets might be apportioned across the elements of a comprehensive assessment system that provides information to support the enhancement of learning and instruction as well as accountability. Reconciling these multiple goals and needs is a major conceptual and operational design challenge.
Overview
o NCLB Requirements and Critical Issues
o Advice for Policymakers -- Instructionally Supportive State Assessment Practices
o Conceptual Basis for Working Through Critical Issues of Assessment Design and Use
o What’s Needed for Further Progress
o Where and How to Learn More
ESEA/NCLB Key Requirements
o Annual assessments of all students in Math and Reading for grades 3-8, and once in grades 9-12, beginning no later than the 2005-2006 academic year
u Math and Reading annual assessments must be aligned with state academic content and achievement standards
o Annual assessment of students in Science no less than once in each of grades 3-5, 6-9, and 10-12, beginning no later than the 2007-2008 academic year
o Adequate Yearly Progress
u 100% of students must meet or exceed a “proficient” level of academic achievement by the 2013-2014 academic year
u Establish intermediate goals for ongoing improvement over the 12-year period
o Reporting in multiple categories for multiple demographic groups
Some Key Issues
o What gets assessed?
u What standards & at what level of granularity?
o How does usable information get derived?
u Who needs what information, in what form, by when, and for what purpose?
o Are systems of assessments necessary, desirable and/or feasible?
u What methods of assessment can be used, and for what purposes?
u How does anyone make sense of data from multiple assessments?
Overview
o NCLB Requirements and Critical Issues
o Advice for Policymakers -- Instructionally Supportive State Assessment Practices
o Conceptual Basis for Working Through Critical Issues of Assessment Design and Use
o What’s Needed for Further Progress
o Where and How to Learn More
Commission Requirements
o Requirement 1. Prioritized Content Standards
o Requirement 2. Unambiguously Described Content Standards
o Requirement 3. Standard-by-Standard Reporting
o Requirement 4. Classroom Assessments of State Content Standards
o Requirement 5. Monitoring Curricular Breadth
o Requirement 6. Appropriate Assessment for All Students
o Requirement 7. Sufficient Test Development Time
o Requirement 8. Pertinent Professional Development
o Requirement 9. Ongoing Evaluation of the System
Issues Needing Further Guidance
o What’s the conceptual basis for making decisions among standards?
o How do we specify the meaning of a standard?
u What is assessable, and how?
o What is the process of translating standards into assessment practices?
u For large-scale tests?
u For classroom assessment purposes?
o How do the system’s pieces come together?
Overview
o NCLB Requirements and Critical Issues
o Advice for Policymakers -- Instructionally Supportive State Assessment Practices
o Conceptual Basis for Working Through Critical Issues of Assessment Design and Use
o What’s Needed for Further Progress
o Where and How to Learn More
The Committee’s Objective To help establish a theoretical foundation for the design and use of new kinds of assessments that will help all students learn and succeed in school. Needed are assessments that make as clear as possible to students, their teachers, and other education stakeholders the nature of their accomplishments and the progress of their learning.
About the Report
o Proposes a vision of educational assessment based on contemporary understandings of how people learn and how to measure such learning.
o Describes an improved approach to assessment design and use, along with promising examples that include applications of technology.
o Provides directions for research, development, policy, and practice for moving the field of assessment forward.
Report Structure & Content
o Part I - Introduction & Background
u issues & opportunities in educational assessment
o Part II - Scientific Foundations of Assessment
u the sciences of cognition & measurement
o Part III - Assessment Design & Use
u a vision of theory-driven design & its application
o Part IV - Implications & Recommendations
u necessary directions for policy, practice & R&D
Recommendations Regarding Assessment Policy & Practice
Policymakers are urged to recognize the limitations of current assessments, and to support the development of new systems of multiple assessments that would improve their ability to make decisions about education programs and the allocation of resources.
u Important decisions should not be based on a single test score
u Systems should measure growth in achievement over time
u Emphasis should shift from assessment of learning toward assessment for learning
u Assessments at classroom and large-scale levels should grow out of a shared knowledge base about learning and knowing
On what conceptual and operational basis can we begin the process of designing and implementing such systems?
Relating Teaching, Learning & Assessment
[Diagram: level of impact, spanning from high-stakes summative tests down to classroom teaching & learning]
Why Focus on Classroom Formative Assessment?
o As instruction is occurring, teachers need information to evaluate whether their teaching strategies are working.
o They also need information about the current understanding of individual students and groups of students so they can identify the most appropriate next steps for instruction.
o Students need feedback to monitor their own learning success and to know how to improve.
o Black & Wiliam (1998) reviewed the impact of formative assessment practices on learning outcomes -- effect sizes ranging from 0.5 to 1.0
What’s Needed for Formative Assessment to Work
o Sadler’s 3 interconnected elements
u A clear view of the learning goals
u Information about the present state of the learner
u Actions to close the gap
o The major challenge is always that of Knowing What Students Know
u Need conceptually rich systems that link curriculum, instruction and assessment
Assessment as a Process of Reasoning from Evidence
o Cognition
u a model of how students represent knowledge & develop competence in the domain
o Observations
u tasks or situations that allow one to observe students’ performance
o Interpretation
u a method for making sense of the data
[Diagram: the assessment triangle of cognition, observation, and interpretation; the three must be coordinated!]
Understanding “Cognition”
o Challenge I: Articulating Multiple Explanations of Thought & Behavior -- what we want and need to understand
u Behavior ranges from micro-processes of rapid perception to macro-processes like problem solving and negotiation
u Time periods over which behavior and learning unfold can vary tremendously
o Challenge II: Multiple Levels of Explanation -- the way we focus the explanation
u Cognitive accounts of individual processes and knowledge representations
u Situated/sociocultural accounts of collective processes and distributed knowledge representations
Cognitive Level Analysis
o The most critical implications for assessment are derived from study of the nature of competence and the development of knowledge in specific curriculum domains.
u Characterizing Performance: Task Analysis
u Characterizing Development: Trajectories of Learning
u Characterizing Knowledge: Forms of Representation
Task Analysis: Arithmetic
o Melissa had 6 pencils. Henry gave her 14 more. How many pencils does Melissa have now?
o What are the foundations of competent performance?
u Knowledge of concepts, like cardinality and sets
u Knowledge of strategies, like counting and joining sets
Trajectories: Arithmetic
o Melissa had 6 pencils. Henry gave her 14 more. How many pencils does Melissa have now?
u Direct Modeling - represent sets with counters, join, count all: 1…20
u Counting - represent sets as numbers
    Count-on from first: 6, 7(1), 8(2), … 20(14)
    Count-on from larger: 14, 15(1), 16(2), … 20(6)
u Derived Facts - 6 + 10 = 16, 16 + 4 = 20
u Recall - 6 + 14 = 20
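The strategies on this trajectory can be sketched as simple procedures. A minimal illustration in Python (the function names and decompositions are ours, not from the report; they only mimic the child's process for the pencil problem):

```python
def count_all(a, b):
    """Direct modeling: represent both sets with counters, then count everything."""
    counters = [1] * a + [1] * b     # physical counters for both sets, joined
    total = 0
    for _ in counters:               # count all: 1, 2, ..., a + b
        total += 1
    return total

def count_on_from_larger(a, b):
    """Counting: start from the larger addend and count up by the smaller one."""
    start, steps = max(a, b), min(a, b)
    total = start                    # e.g., 14, then 15(1), 16(2), ... 20(6)
    for _ in range(steps):
        total += 1
    return total

def derived_fact(a, b):
    """Derived facts: split one addend into tens and ones to reach a known sum."""
    tens, ones = (b // 10) * 10, b % 10
    partial = a + tens               # e.g., 6 + 10 = 16
    return partial + ones            # then 16 + 4 = 20
```

All three return the same answer for 6 + 14; what differs, and what an assessment keyed to a learning trajectory would try to detect, is the process by which the child arrives at it.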
Sociocultural Level Analysis
o The most critical implications for assessment are derived from study of the nature of practice and forms of participation in communities.
u Characterizing Performance: Communal Practices
u Characterizing Development: Trajectories of Participation
u Characterizing Knowledge: Forms of Mediated Activity
A Sociocultural Look at Arithmetic
o Practice: Accounting
u the “new math” of the 15th century
o Development: learning to participate in a guild
u apprenticeship model of development
o Knowledge: mediation of arithmetic operations via algorithms
u the algorithms of the counting houses are those we teach today
Mathematical Practices
o Number Theory
u Conjecture: is multiplication commutative?
o Trajectories of Participation in Argument
u Cases ---> Generalization ---> Proof
    (e.g., 4 x 7 = 7 x 4; in general, ss x ls = ls x ss)
Generalizations About Performance
o Expertise
u performance develops in communities that value certain forms of knowledge and activity, like modeling in science
u knowledge is tuned to specific patterns of activity, like solving certain kinds of problems
u performance increases in scope and precision with multiple, contextualized experiences
u no magic levers: practice, disciplined inquiry
o Implication -- assessments must be designed to capture the complexity of competent performance, ranging from mental processes to participation in forms of practice
Generalizations about Development
o Not all children learn in the same way or follow the same paths to competence.
u Conceptual change is often not a simple, uniform progression, nor is there movement directly from erroneous to optimal solution strategies.
u Intermediate forms of knowledge may not resemble expert forms, so simple building-block relations may not hold.
u Participation often “starts at the edges” and becomes progressively more aligned with core disciplinary practices.
o Implication - Assessments should identify specific strategies and forms of activity with respect to the role they play in developmental trajectories (e.g., count-on-from-first is fine at grade 1, not grade 3).
Generalizations about Knowledge
o Disciplinary Knowledge
u is organized in ensembles that facilitate its use
u is amplified by processes of self-regulation, or “metacognition,” whereby learners spontaneously evaluate their knowledge and its limits
u is developed in communities that foster identity and interest
o Implications for Assessment -- multiple “questions”
u Knowledge issues - specific facts, procedures, schemas
u Reflection issues - articulation, evaluation
u Practice issues - why prove? why model?
Why Cognitive Models of Content Knowledge Are Critical
o Tell us which aspects of knowledge are important and should be assessed
u Give deeper meaning and specificity to standards
o Give us strong clues as to how such knowledge can be assessed
u Suggest what can and should be assessed at points proximal or distal to instruction
o Can lead to assessments that yield more instructionally useful information -- within and across levels and contexts
o Can guide the development of systems of assessments
u Comprehensive, Coherent & Continuous
Systems of Assessments
o Comprehensive - utilize a range of measurement approaches that together produce information that can be combined and reported at different levels
o Coherent - the conceptual base for the student models underlying the design of the multiple assessments should be compatible; there also needs to be alignment between curriculum, instruction and assessment so that everything works toward a common set of learning goals
o Continuous - sequential observations over time must be conceptually linked so that growth can be measured; assessment is viewed as a cumulative process rather than a “drop-in-from-the-sky” event
Assessment Design Principles
o Assessment design should always be based upon a model of student learning and a clear sense of the inferences about student competence that are desired for the particular context of use.
o Starting with the student model: the student model suggests the most important aspects of student achievement that one would want to make inferences about, and provides clues about the types of tasks that will elicit evidence to support those inferences.
Aspects of Student Models
o Domain-specific and empirically based
o Identify cognitive performances that differentiate expert and novice learners
o Lay out one or more typical progressions toward competence, including milestones or landmark performances along the way
o Can be at various levels of detail; grain size depends on assessment purpose
o A valuable information source