1 Assessing Student Understanding. David Niemi, UCLA Graduate School of Education & Information Studies, National Center for Research on Evaluation, Standards, and Student Testing (CRESST). CRESST Conference, Los Angeles, CA, September 9, 2005

2 The mile wide, inch deep curriculum
• How wide and deep should it be?
• 1 inch wide and 1 mile deep?
• 1/2 mile wide and 2 inches deep?

3 Example of Instruction: e^(πi) + 1 = 0

4 Assessment: e^(πi) + 1 =

5 Answer: e^(πi) + 1 = 0

6 What would you have to know in order for this to be meaningful and useful? e^(πi) + 1 = 0
• What do the symbols mean?
• What is this equation about?
• What can you do with it?
• What’s important about it?
• How does it connect to other topics in mathematics?
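For readers who want the connection spelled out, a brief worked sketch of where the identity comes from, using Euler's formula:

```latex
% Euler's formula: e^{ix} = \cos x + i\sin x for real x.
% Setting x = \pi recovers the identity on the slides.
\[
  e^{\pi i} = \cos\pi + i\sin\pi = -1 + 0\,i
  \qquad\Longrightarrow\qquad
  e^{\pi i} + 1 = 0 .
\]
```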

7 Our approach
Analyze and map the domain based on previous research and knowledge elicitation:
• Identify central principles
• Used to build schemas
• Enable inferences, complex problem solving, and learning
• Provide a foundation for advanced learning
Use models to build assessments at the right level of cognitive demand

8 Research on the Structure of Knowledge (Chi, Glaser, & Rees, 1983)

9 Eliciting the Structure of Knowledge
• Experts (scientists, mathematicians, historians, writers) identify organizing concepts and principles (“big ideas”)
• Determine related ideas and skills: facts, problems, situations, etc.
• Map all ideas and skills
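As a rough illustration of what mapping ideas and skills can look like, a minimal Python sketch; the entries under each big idea are hypothetical examples drawn loosely from the slides, not the actual CRESST maps:

```python
# A knowledge map sketched as a simple graph: each big idea points to related
# ideas and skills identified during knowledge elicitation.
knowledge_map: dict[str, list[str]] = {
    "Functions": [
        "Each input is mapped to one and only one output",
        "Linear functions can be written as f(x) = mx + b",
        "The graph of a linear function is a line",
    ],
    "Expressions, Equations, Inequalities": [
        "Solving linear equations",       # hypothetical related skill
        "Using properties of equality",   # hypothetical related skill
    ],
}

for big_idea, related_items in knowledge_map.items():
    print(big_idea)
    for item in related_items:
        print("  -", item)
```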

10 Structure of Algebra I Knowledge: Central Principles
• 1 Number
• 2 Expressions, Equations, Inequalities
• 3 Functions
• 4 Problem Solving
• 5 Reasoning
• 6 Sets

11 Ideas About Functions
• A function is a mapping between inputs and outputs such that each input is mapped to one and only one output.
• Many events in the physical world can be modeled as functions.
• Many functions can be represented by algebraic equations or graphs.
• The equation of a linear function can be written in the form f(x) = mx + b.
• The graph of a linear function is a line.
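As a concrete illustration of the "each input maps to exactly one output" idea and the form f(x) = mx + b, a minimal Python sketch; the function name and sample values are illustrative, not taken from the assessments:

```python
def make_linear_function(m: float, b: float):
    """Build the linear function f(x) = m*x + b as a mapping from inputs to outputs."""
    def f(x: float) -> float:
        # Each input x is mapped to one and only one output, m*x + b.
        return m * x + b
    return f

f = make_linear_function(m=2.0, b=1.0)   # f(x) = 2x + 1
print(f(0.0), f(3.0))                    # 1.0 7.0: one output for each input
```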

12 Map of Algebra I Knowledge

13 Task Components for Assessing Complex Learning
• Text or other representation of information requiring domain knowledge (concepts, principles, and factual knowledge)
• Response involving a complex performance, e.g., explanation, problem solving, or knowledge map
• Scoring based on expert performance
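A rough sketch of how these three components might be captured as a simple record; the field names and sample values are assumptions for illustration, not part of any CRESST or ADDS specification:

```python
from dataclasses import dataclass, field

@dataclass
class ComplexLearningTask:
    """One record per task, mirroring the three components listed above."""
    information_source: str          # text or other representation requiring domain knowledge
    performance_type: str            # e.g., "explanation", "problem solving", "knowledge map"
    expert_scoring_criteria: list[str] = field(default_factory=list)  # rubric derived from expert performance

task = ComplexLearningTask(
    information_source="Primary-source documents on a historical event",
    performance_type="explanation",
    expert_scoring_criteria=[
        "organizes the response around central principles",
        "supports claims with accurate, relevant facts",
    ],
)
print(task.performance_type)
```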

14 Examples of Models in Action
• Assessments of history and math understanding tested with hundreds of students
• Scaling up with 300,000-400,000 students per year in the nation’s 2nd largest district
• IERI: online tools to build assessments (Assessment Design and Delivery System, or ADDS)
• New CRESST: assessments of mathematics understanding in grades 6-8, including worked examples

15 Study of Fifth Grade Students’ Understanding of Fraction Representations
• 500 students randomly assigned to instruction on:
  Fractions as parts of wholes
  Fractions as rational numbers
• After instruction, students were given assessments covering both types of representations.
• Students performed better with the types of representations and meanings they had been taught.
• Students who understood the representations performed better on measures of complex problem solving.
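To make the two meanings of a fraction concrete, a small Python sketch with illustrative values (not the study's actual items):

```python
from fractions import Fraction

# Part-whole reading: 3 of 4 equal parts of one whole.
shaded_parts, total_parts = 3, 4
part_whole = Fraction(shaded_parts, total_parts)

# Rational-number reading: 3/4 as a single point on the number line.
as_number = Fraction(3, 4)

print(part_whole == as_number)            # True: both readings name the same number
print(Fraction(6, 8) == Fraction(3, 4))   # True: 6/8 and 3/4 are equivalent fractions
```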

16 Fractions

17 Fractions

18 Equivalent Fractions

19 Equivalent Fractions

20 Fractions

21 ADDS Designer: Select Information Source

22 Screenshot from an Animation

23 Two Screenshots From an Animation

24 Screenshot from a Simulation

25 Results from Studies of Teachers Using ADDS
• Series of experimental studies
• In one study, 33 middle school science teachers randomly assigned to ADDS and non-ADDS groups
• ADDS teachers were more likely to focus on central ideas and developed more cognitively complex assessments.
• In another experimental study with 17 teachers, ADDS enhanced the ability of teachers to develop comparable assessments on the same topic.

26 What have we learned (in the effort to “make evidence-based practice a reality”)?
• Complex, research-based assessments:
• Can be instructionally sensitive, valid measures of challenging state standards, and predictive of performance on state tests
• Can provide information to guide and improve instruction
• Can be reliably scored.

27 What have we learned?
• Deep understanding is rare or non-existent, but understanding and use of big ideas can be improved in a relatively short time.
• Information on student understanding and complex performance is a powerful tool for change and can be used to help focus and propel capacity-building efforts.

