Reflective Assessment/Alignment
Daniel T. Hickey
Learning & Performance Support Lab, Department of Educational Psychology, University of Georgia
AAAS/CCMS Knowledge Sharing Institute, June 2003
Two Questions about Assessment

Aligning assessments to standards:
–How does articulation of outcomes impact assessment?
–What is the impact of learning performances?
–Depends on the goal of assessment.

Impact of assessment practice:
–How can information be used to revise curriculum?
–What is the diagnostic power of items?
–How can assessment advance learning?
–Depends on the format of the assessment.
CCMS Design Tensions and Curricular Solutions

Learning inquiry vs. content coverage:
–Focus on coherent “slices” through the systems.
–Tsou’s analogical comparisons.

Self-directed, intrinsically directed inquiry leads to multiple paths:
–Focus on select benchmarks.

Some problem contexts require non-targeted background knowledge:
–Make tradeoffs in context selection.
The New Challenge of “Accountability”

NCLB, WWC, and the likely DOE Technology Plan use standardized multiple-choice tests as a policy lever:
–Bizarre way to demonstrate knowledge.
–Cheap, reliable, & naïve appeal.

Conventional curricula & integrated teaching systems have a profound advantage:
–Fine-tuned to directly present many of the associations needed to identify the most correct response.
–The decontextualized, individualized format of turnkey systems lends itself nicely to random assignment.

Innovators run the risk of being labeled “unproven”.
Addressing Curricular Tensions with Assessment & Alignment

Good assessment promotes coverage and inquiry.

“Learner-oriented” formative feedback:
–Attaches new content to robust inquiry schema.
–Promotes participation in discourse and argumentation.

Good classroom accountability:
–Guides student and teacher curricular enactments.
–Guides refinement of materials and methods.
–Affords a shared knowledge base for subsequent inquiry.
–Provides additional incentives to motivate engagement.

Design-based refinements of activities and classroom assessments ensure needed outcomes.
Functions of Assessment

Almost all assessment serves formative goals:
–The ultimate goal is improving teaching & learning.

More summative functions get all the attention:
–Summative functions undermine formative.

Range of formative functions:
–Guiding curricular selection.
–Guiding curricular revision.
–Guiding curricular enactment.
–Directly advancing student understanding.
Reflective Assessment/Alignment Framework

Refined in studies of technology-based innovations:
–CTGV, GenScope, LBD, VSS, NASA COTF.

Aims to increase evidential (summative) and consequential (formative) validity.

Considers levels of assessment:
–Activity-, Curriculum-, & Standards-Oriented.

Considers approaches to assessment:
–Empiricist, Rationalist, Sociocultural.
Stanford’s Five Levels of Summative Testing

Immediate:
–Artifacts from the enactment of the curriculum.

Close:
–Parallel to the content of a unit.

Proximal:
–Tapping the knowledge and skills of a curriculum, but not necessarily its topics.

Distal:
–Reflecting state and national standards.

Remote:
–Broader outcomes (e.g., NAEP, workforce).
Levels of Assessment

Embedded (Immediate):
–Is the curriculum.
–Informal, collaborative, minimally summative.

Activity-Oriented (Close):
–Somewhat formal, individually completed.
–Modestly summative.

Curriculum-Oriented (Proximal):
–Sensitive to specific curriculum and standards.
–More formal, fairly summative.

Standards-Oriented (Distal):
–Reflects broader standards.
–Extremely formal and highly summative.
Close Assessment: Activity-Oriented “Quizzes”

Similar context, content, and representation:
–Screen captures or paper versions of the physical lab.
–Use multiple-choice only when appropriate.
–Short answer + explanation is a useful format.

Ideal for guiding, tuning, & revising activities:
–Completed at the end of a unit (e.g., weekly).
–Extremely sensitive to enacted activities.
–Should directly guide enactment and support self-assessment.

Ideal for directly furthering student knowledge:
–Need “learner-oriented” feedback rubrics.
–Coach participation in the “assessment conversation”.
–Don’t let summative function undermine formative goals.
Proximal Assessment: Curriculum-Oriented “Exams”

Cherry-picked or custom items and tasks:
–Same content, but a new and formal context.
–Sensitive to curriculum and standards.
–Use PALS, NAEP, released forms, text supplements.

Ideal for assessing & refining curricular coverage:
–Use to guide refinement of activities and quizzes.
–Good for classroom accountability (grades).

Ideal for comparing enactments of the innovation:
–Can’t give to comparison groups.

Useful for furthering student knowledge:
–Requires students to formalize knowledge.
–Need summative and formative rubrics.
Distal Assessment: Standards-Oriented “Tests”

Randomly selected items from a larger pool:
–New context and some new content.
–Reflects broader standards.
–Same methods used for high-stakes tests.

Ideal for summative evaluation and curricular comparisons:
–Test must be protected from compromise.
–Different items on pretest and posttest.
–Provide only score-level feedback.
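The random selection of pool items with disjoint pretest and posttest forms can be sketched in a few lines. This is an illustration of the sampling idea only; the function and item names are hypothetical, not from the project:

```python
import random

def build_forms(item_pool, form_size, seed=None):
    """Draw two non-overlapping random forms (pretest and posttest)
    from a larger item pool, so no posttest item was seen at pretest."""
    rng = random.Random(seed)
    if 2 * form_size > len(item_pool):
        raise ValueError("pool too small for two disjoint forms")
    drawn = rng.sample(item_pool, 2 * form_size)  # sample without replacement
    return drawn[:form_size], drawn[form_size:]

# Hypothetical 40-item standards-aligned pool, two 15-item forms.
pool = [f"item_{i:02d}" for i in range(40)]
pretest, posttest = build_forms(pool, form_size=15, seed=2003)
```

Fixing the seed makes a study's form assignment reproducible while still protecting the pool, since unsampled items never leave the bank.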
Approaches to Assessment

Assessment practices reflect assumptions about knowledge:
–Assumptions are often implicit.

Competing assumptions = competing tests:
–Supports validity of high-stakes tests.
–Explains how innovators compromise validity.

Generally 3–4 sets of assumptions:
–“Behavioral, Cognitive, & Cultural”.
–“Familiar, Important, & Enduring”.
Three Approaches to Assessment

Behavioral/Empiricist focus on lower-level associations:
–Recognition and recall of specific associations.
–Supports multiple-choice standardized tests.

Cognitive/Rationalist focus on higher-level conceptual schema:
–Problem-solving focus on rationalization.
–Central to 1990s assessment reform.

Situative/Sociohistoric focus on participation in knowledge practices:
–Shift from acquisition to participation.
–Shift from individual to event.
–Reconciles tensions between the two acquisitory approaches.
Assessment Levels x Assessment Approaches

Rows: ASSESSMENT LEVEL (primary function, context of use). Columns: ASPECT OF PROFICIENCY (conceptualization of knowledge transfer): A. Behavioral (recognize & recall); B. Cognitive (rationalize); C. Cultural (ritualize).

1. ACTIVITY-ORIENTED QUIZZES (after activity, ideal for directly advancing knowledge):
–1A. MC or SA items created for the activity.
–1B. PA or essay items, same context as the activity.
–1C. Enactment of intended routines, relative to ideals.

2. CURRICULUM-ORIENTED EXAMS (after curriculum, ideal for refining curriculum):
–2A. MC or SA items created or selected to match curricular targets.
–2B. PA or essay items, somewhat different context from curriculum.
–2C. Discourse during somewhat different problems.

3. STANDARDS-ORIENTED TESTS (pre and post, in innovative and comparison curriculum):
–3A. Externally developed MC or SA items, aligned to standards, randomly selected.
–3B. Externally developed PA or essay items, very different context from curriculum.
–3C. Discourse while completing very different problems.
Suggestions for CCMS Alignment

Clarify functions of assessment:
–Don’t confound formative and summative.
–Separate activity-oriented and standards-oriented assessments.
–Maximize consequential validity of formative.
–Maximize evidential validity of summative.

Discriminate between approaches to assessment:
–Greeno, Collins, & Resnick; Hickey & Zuiker.
–Wiggins’ familiar, important, & enduring.

Exploit the incentive value of summative functions.

Use IRT (Facets or WinMira) whenever possible.
Suggestions for CCMS Feedback

Exploit the formative value of assessment:
–Directly advance student understanding.
–Guide teachers’ enactment of new activities.
–Guide students’ enactment of later activities.

Develop good classroom assessments:
–Create learner-oriented rubrics for each assessment (“answer explanations”).
–Include new content.
–Coach the assessment conversation.
General Suggestions

Use design-based methods (Sloane & Gorard, 2003) for iterative refinement:
–Use quizzes to refine activities.
–Use exams to refine quizzes and feedback.
–Use tests to refine curriculum, exams, and exam feedback.
–Build local-level prototheory along the way.

Include a more comprehensive view of assessment in 2061 criteria.
GenScope Project Results