1
ASSESSMENT FOR BETTER LEARNING USING NAPLAN DATA
Presented by Philip Holmes-Smith, School Research Evaluation and Measurement Services
2
Overview
3
1. The National Scale
4
National Scale Scores or Estimated VELS Equivalent Scores?
You can choose to look at National Scale Scores or Estimated VELS Equivalent scores. What’s the difference?
5
Understanding the National Scale
The National Scale is an arbitrary scale – at this stage it is not related to points along a developmental curriculum. But it is highly likely that it will be mapped onto the National Curriculum at some time in the future.
6
Understanding the National Scale
The National Scale is an arbitrary scale – at this stage it is not related to points along a developmental curriculum. But it is highly likely that it will be mapped onto the National Curriculum at some time in the future.
The National Scale was fixed in 2008 as follows:
– Range: 0-1000
– Mean: 500
– Standard Deviation: 100 (i.e. 68% of students score between 400 and 600)
[Scale diagram: vertical axis from 100 to 900 marking the 2008 national mean (500) and ±1 standard deviation (400 and 600)]
7
National Averages (2008)
My School Website (2009) – national averages by Year Level and Dimension:
Year Level | Reading | Writing | Spelling | Grammar & Punctuation | Numeracy
3          | 400.4   | 414.2   | 399.3    | 402.9                 | 396.7
5          | 484.3   | 486.4   | 483.6    | 496.0                 | 475.7
7          | 536.6   | 533.7   | 538.6    | 529.0                 | 544.9
9          | 578.0   | 569.3   | 577.0    | 569.2                 | 582.2
[My School website screenshot legend: School’s Average; National Average; Average for schools with similar ICSEA. Screenshot note: “Only 20.1% of Australian school communities are more advantaged.”]
8
Understanding the National Scale
The National Scale is an arbitrary scale – at this stage it is not related to points along a developmental curriculum. But it is highly likely that it will be mapped onto the National Curriculum at some time in the future.
The National Scale was fixed in 2008 as follows:
– Range: 0-1000
– Mean: 500
– Standard Deviation: 100 (i.e. 68% of students score between 400 and 600)
The Scale has been divided into ten Bands which are used for reporting to parents.
– Band 1 covers all scores equal to or less than 270.
– Bands 2 – 9 increment by 52 score points each Band.
– Band 10 covers all scores above 686.
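The band arithmetic above can be turned into a simple lookup. The following is a minimal sketch, not part of the presentation; the function name and the decision to treat the upper edge of each band as inclusive are my own assumptions.

```python
def score_to_band(score: float) -> int:
    """Map a NAPLAN scale score to its reporting band (illustrative sketch).

    Follows the boundaries described above: Band 1 is 270 or below,
    Bands 2-9 are each 52 score points wide, Band 10 is above 686.
    The upper edge of each band is treated as inclusive (an assumption).
    """
    if score <= 270:
        return 1
    if score > 686:
        return 10
    # Bands 2-9 start just above 270 and each span 52 points.
    return 2 + int((score - 270 - 1e-9) // 52)

# The 2008 national mean of 500 falls in Band 6 (scores above 478 up to 530 here).
print(score_to_band(500))  # -> 6
```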
9
Parent Reports
[Sample parent reports for Year 3, Year 5, Year 7 and Year 9]
10
Understanding the National Scale
The National Scale is an arbitrary scale – at this stage it is not related to points along a developmental curriculum. But it is highly likely that it will be mapped onto the National Curriculum at some time in the future.
The National Scale was fixed in 2008 as follows:
– Range: 0-1000
– Mean: 500
– Standard Deviation: 100 (i.e. 68% of students score between 400 and 600)
The Scale has been divided into ten Bands which are used for reporting to parents.
– Band 1 covers all scores equal to or less than 270.
– Bands 2 – 9 increment by 52 score points each Band.
– Band 10 covers all scores above 686.
At this stage Bands have no explicit curriculum meaning, but results show that for Victorian students in 2009:
– A typical Yr3 level of performance is at the bottom of Band 5.
– A typical Yr5 level of performance is almost halfway into Band 6.
– A typical Yr7 level of performance is a third of the way into Band 7.
– A typical Yr9 level of performance is at the bottom of Band 8.
11
Victorian State Averages - 2009 (By Year Level and Dimension)
12
Comparing National Scale Scores to Estimated VELS Equivalent scores
The Victorian means for Year 3 and Year 9 Reading and Numeracy on the National Scale are compared to the estimated VELS equivalent scores below:
Victorian means:
Dimension | Yr 3 National Scale | Yr 3 VELS Equivalent | Yr 9 National Scale | Yr 9 VELS Equivalent
Reading   | 430.6               | 2.37                 | 587.9               | 5.28
Numeracy  | 411.0               | 1.83                 | 596.3               | 4.88
Compared to our expected curriculum outcome for Year 3 students (2.175), the State Reading mean is about 1½ terms ahead of where we expect a typical Year 3 student to be. However, the State Numeracy mean is about 2½ terms below where we expect a typical Year 3 student to be.
Compared to our expected curriculum outcome for Year 9 students (5.175), the State Reading mean is about one (1) term ahead of where we expect a typical Year 9 student to be. However, the State Numeracy mean is about 2½ terms below where we expect a typical Year 9 student to be.
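As a rough check on the “terms ahead/behind” statements above, here is a small sketch of the conversion. It assumes one VELS level corresponds to two school years (eight terms), i.e. roughly 0.125 VELS points per term – an assumption inferred from the figures quoted, not stated in the presentation.

```python
# Rough sketch (not from the presentation): convert the gap between an observed
# VELS-equivalent score and the expected score into school terms.
VELS_PER_TERM = 1.0 / 8  # assumption: one VELS level = 2 years = 8 terms

def gap_in_terms(observed_vels: float, expected_vels: float) -> float:
    """Positive = ahead of expectation, negative = behind (in school terms)."""
    return (observed_vels - expected_vels) / VELS_PER_TERM

# Year 3 (expected VELS score 2.175)
print(round(gap_in_terms(2.37, 2.175), 1))   # 1.6  -> Reading about 1.5 terms ahead
print(round(gap_in_terms(1.83, 2.175), 1))   # -2.8 -> Numeracy roughly 2.5-3 terms behind
# Year 9 (expected VELS score 5.175)
print(round(gap_in_terms(5.28, 5.175), 1))   # 0.8  -> Reading about one term ahead
print(round(gap_in_terms(4.88, 5.175), 1))   # -2.4 -> Numeracy roughly 2.5 terms behind
```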
13
Cautionary Note #1
Equal scores amongst different dimensions (on the National Scale) do not equate to equal levels of performance in terms of expected VELS levels. For example, a National Yr9 Reading score of 587.9 is equivalent to a VELS score of 5.28, but a higher National Yr9 Numeracy score of 596.3 is equivalent to a lower VELS score of 4.88.
14
Cautionary Note #2
Some “Estimated VELS Equivalent Scores” CANNOT be read as VELS scores. Specifically, it is doubtful that the “Estimated VELS Equivalent Scores” for Writing or Spelling are truly VELS scores*. On the other hand, “Estimated VELS Equivalent Scores” for Reading and Numeracy appear to be trustworthy*.
*As evidence, see the following AIM/NAPLAN results.
19
The National Minimum Standard (against Victoria’s State Averages)
[Charts for Year 3, Year 5, Year 7 and Year 9 comparing the National Minimum Standard with Victoria’s State averages]
Even students at lower levels of “above Minimum Standard” may be “at risk”.
20
Understanding Minimum Standards
Yr 3, 5, 7 & 9 “Minimum Standards” are all very low (more than 2 years below the expected level). In fact they are so low I think they offer no assistance in identifying “at risk” students, so consider the following alternative.
Use the “Holmes-Smith” Minimum Standard instead – anyone 0.5 of a VELS level below expected is in need of focused intervention or additional support. For example:
– Yr5: Expected VELS score ~ 3.175
– Yr5: Holmes-Smith “low achiever” ~ 2.675 or lower
Also, anyone 0.5 of a VELS level above expected is in need of further extension (to avoid boredom). For example:
– Yr7: Expected VELS score ~ 4.175
– Yr7: Holmes-Smith “high achiever” ~ 4.675 or higher
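A minimal sketch of how these ±0.5 thresholds could be applied to a class list. This is not from the presentation: the expected VELS scores per year level are taken from the examples on this and earlier slides, and the function name is my own.

```python
# Expected VELS scores by year level, as quoted in the slides.
EXPECTED_VELS = {3: 2.175, 5: 3.175, 7: 4.175, 9: 5.175}

def classify(year_level: int, vels_score: float) -> str:
    """Flag students against the 'Holmes-Smith' +/- 0.5 VELS thresholds (sketch)."""
    expected = EXPECTED_VELS[year_level]
    if vels_score <= expected - 0.5:
        return "low achiever - needs focused intervention/support"
    if vels_score >= expected + 0.5:
        return "high achiever - needs further extension"
    return "within expected range"

print(classify(5, 2.6))   # low achiever (at or below 2.675)
print(classify(7, 4.8))   # high achiever (at or above 4.675)
```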
22
The School Summary Report
Choose “School Summary Report”. Choose to look at All Students, or Girls vs. Boys, or LBOTE Students, or ATSI Students. Choose to look at National Scale Scores or Estimated VELS scores. Click on “Preview Report” to view the results.
23
The School Summary Report
24
Interpreting “box and whisker” graphs
There are 30 students in this Year Level. Therefore:
– 50% (or 15 students) are above the median.
– 50% (or 15 students) are below the median.
– 50% (or 15 students) are inside the “box”. Half of these (7-8 students) are above the median and half (7-8 students) are below the median.
– 10% (or 3 students) are at or below the 10th percentile “whisker”. (Note: all we know is that at least one student scored on the 10th percentile.)
– 10% (or 3 students) are at or above the 90th percentile “whisker”. (Note: all we know is that at least one student scored on the 90th percentile.)
– 15% (or 4-5 students) are spread between the 25th percentile down to the 10th percentile.
– 15% (or 4-5 students) are spread between the 75th percentile up to the 90th percentile.
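A rough sketch of the counting logic described above, using invented scores for a 30-student cohort (not real data); the exact counts around the box edges depend on how the percentiles fall.

```python
import numpy as np

# Invented scores purely for illustration of the 10th/25th/50th/75th/90th
# percentile landmarks used in the box-and-whisker report.
scores = np.random.default_rng(1).normal(500, 80, 30)

p10, p25, p50, p75, p90 = np.percentile(scores, [10, 25, 50, 75, 90])
print("above the median:", int(np.sum(scores > p50)))                                  # half the class
print("inside the box (25th-75th):", int(np.sum((scores >= p25) & (scores <= p75))))   # roughly half
print("at or below the 10th percentile whisker:", int(np.sum(scores <= p10)))          # ~3 students
print("at or above the 90th percentile whisker:", int(np.sum(scores >= p90)))          # ~3 students
```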
25
Interpreting the School Summary Report
1. Is the school’s median above, at or below the State median?
– In this example, the school’s median is about half a band below the State median.
26
Interpreting the School Summary Report
1. Is the school’s median above, at or below the State median?
– In this example, the school’s median is about half a band below the State median.
2. Is one or more dimension very different from the other dimensions?
– Writing and Spelling are lower than Reading and Grammar & Punctuation.
27
Interpreting the School Summary Report
1. Is the school’s median above, at or below the State median?
– In this example, the school’s median is about half a band below the State median.
2. Is one or more dimension very different from the other dimensions?
– Writing and Spelling are lower than Reading and Grammar & Punctuation.
3. How does the school’s spread compare to the State spread?
– The school has far fewer high performers than the State.
– The school has far more low performers than the State.
– The school’s median is at the State’s 25th percentile.
28
The School Summary Report
29
Interpreting the “Standard Error of the Mean” – se(mean)
Your reported school mean simply reflects the performance of the students who were present on the day and how they felt on that day. What if some of the smartest students had been away? What if there had been a bad accident in the schoolyard just prior to the test and the students’ minds weren’t 100% on task? Differences in results due to such events are referred to as measurement error.
[Chart: the School Mean plotted against the State Mean, with the standard error of the mean shown around the School Mean]
30
Interpreting the “Standard Error of the Mean” – se(mean)
Your reported school mean simply reflects the performance of the students who were present on the day and how they felt on that day. What if some of the smartest students had been away? What if there had been a bad accident in the schoolyard just prior to the test and the students’ minds weren’t 100% on task? Differences in results due to such events are referred to as measurement error.
Statistically, we can allow for such errors in measurement by building a “confidence interval” around the reported school mean using the “Standard error of the mean” – se(mean).
We can be 95% confident that the true school mean for Reading is no lower than 488.1 – 1.96 * 7.6 (= 473.2) and no higher than 488.1 + 1.96 * 7.6 (= 503.0). That is, the true school mean is somewhere between 473.2 and 503.0.
[Chart: the School Mean plotted against the State Mean, with the standard error of the mean shown around the School Mean]
31
Interpreting the “Standard Error of the Mean” – se(mean)
Your reported school mean simply reflects the performance of the students who were present on the day and how they felt on that day. What if some of the smartest students had been away? What if there had been a bad accident in the schoolyard just prior to the test and the students’ minds weren’t 100% on task? Differences in results due to such events are referred to as measurement error.
Statistically, we can allow for such errors in measurement by building a “confidence interval” around the reported school mean using the “Standard error of the mean” – se(mean).
We can be 95% confident that the true school mean for Reading is no lower than 488.1 – 1.96 * 7.6 (= 473.2) and no higher than 488.1 + 1.96 * 7.6 (= 503.0). That is, the true school mean is somewhere between 473.2 and 503.0.
Now, because the State mean for Reading (506.5) is above the highest estimate of the school Reading mean, we conclude that the school mean is significantly below the State mean.
[Chart: the School Mean plotted against the State Mean, with the standard error of the mean shown around the School Mean]
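A minimal sketch of the confidence-interval check described above, using the figures quoted on the slide (school mean 488.1, se(mean) 7.6, State mean 506.5); the variable names are my own.

```python
# 95% confidence interval around the reported school mean, then a comparison
# against the State mean, as described in the slide.
school_mean = 488.1
se_mean = 7.6
state_mean = 506.5

lower = school_mean - 1.96 * se_mean   # 473.2
upper = school_mean + 1.96 * se_mean   # 503.0

if state_mean > upper:
    verdict = "school mean is significantly below the State mean"
elif state_mean < lower:
    verdict = "school mean is significantly above the State mean"
else:
    verdict = "school mean is not significantly different from the State mean"

print(f"95% CI: {lower:.1f} - {upper:.1f}; {verdict}")
# -> 95% CI: 473.2 - 503.0; school mean is significantly below the State mean
```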
32
The Five Year Trend Report
Choose “Five Year Trend Report”, then choose one dimension.
33
The Five Year Trend Report
36
Using the Five Year Trend Report
Use the Five Year Trend Report to determine whether:
– this year’s result was a one-off or whether it is consistent with previous results,
– the trend over time is showing an improving trend, a steady trend or a declining trend relative to the State.
Remember, however, that each year’s data comes from a different cohort of students, and in some years students are just much better or much worse than the typical cohort of students.
Remember also that for small cohorts (< 10 students), a few extra high performing students can significantly increase your school mean. Likewise, a few extra low performing students can significantly decrease your school mean.
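To illustrate the small-cohort point, here is a sketch with invented scores (not from the presentation) showing how much a couple of extra high performers can move the mean of a cohort of fewer than ten students.

```python
from statistics import mean

# All scores below are invented purely for illustration.
cohort = [420, 440, 455, 470, 480, 495, 505, 520]      # 8 students
with_two_high = cohort + [640, 660]                    # two strong students join the cohort

print(round(mean(cohort), 1))          # 473.1
print(round(mean(with_two_high), 1))   # 508.5 - the mean jumps by about 35 scale points
```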
37
Analysing extracted data in SPA
38
A-E Grades: Year 3
39
A-E Grades: Year 7
40
Year 3 – Year 5 Growth
42
The Item Analysis Report
Roughly equal numbers selecting each of the wrong answers = guessing.
Nearly all students who got this wrong gave the same wrong answer = common misconception.
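The two patterns above can be checked mechanically from the distribution of wrong answers. The sketch below is not from the presentation; the function name and the 75%/45% cut-offs are arbitrary choices of mine for illustration.

```python
def diagnose(wrong_answer_counts: dict) -> str:
    """Distinguish 'guessing' from a 'common misconception' from the spread
    of wrong answers (illustrative sketch; thresholds are assumptions)."""
    total_wrong = sum(wrong_answer_counts.values())
    if total_wrong == 0:
        return "no incorrect responses"
    top_share = max(wrong_answer_counts.values()) / total_wrong
    if top_share >= 0.75:
        return "likely common misconception (most chose the same wrong option)"
    if top_share <= 0.45:
        return "likely guessing (wrong options chosen roughly equally)"
    return "mixed pattern"

print(diagnose({"A": 9, "C": 8, "D": 10}))   # likely guessing
print(diagnose({"A": 24, "C": 2, "D": 1}))   # likely common misconception
```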
43
Year 3 Reading (Q21) and Year 5 Reading (Q9)
Correct = ? Most common incorrect = ?
44
Year 3 Reading (Q21) and Year 5 Reading (Q9)
Correct = B (Yr3 – 36%; Yr5 – 56%). Most common incorrect = C (Yr3 – 49%; Yr5 – 34%). WHY?
45
Year 3 Numeracy
Correct = ? Most common incorrect = ?
46
Year 3 Numeracy
Correct = B (60%). Most common incorrect = A (28%). WHY?
47
Year 7 Numeracy
Correct = ? Most common incorrect = ?
48
Year 7 Numeracy
Correct = B (55%). Most common incorrect = A (26%). WHY?
49
Zone of Proximal Development
50
The Student Response Report (Reading or Numeracy – Difficulty Order)
Choose “Student Response Report”, then choose “Reading or Numeracy – Difficulty Order”.
51
The Student Response Report
52
Data sorted by Item Difficulty and Student Ability
[Matrix of student responses sorted by increasing level of difficulty (items) and increasing level of ability (students); responses are mostly correct in one region of the matrix and mostly incorrect in the other]
53
Zone of Proximal Development (Vygotsky)
The known: What students can already do independently.
Zone of Proximal Development: What students are capable of learning with the assistance of explicit instruction from the teacher (scaffolding).
The unknown: What students are incapable of learning before prior concepts are taught.
(As on the previous slide, items are sorted by increasing level of difficulty and students by increasing level of ability.)
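A sketch of the sorting that underlies this kind of display: students ordered by ability (total score) and items ordered by difficulty (how many answered correctly). The 0/1 response matrix here is randomly generated for illustration only and is not real NAPLAN data.

```python
import numpy as np

# Invented 0/1 response matrix: 10 students x 12 items.
responses = np.random.default_rng(7).integers(0, 2, size=(10, 12))

student_order = np.argsort(-responses.sum(axis=1))   # highest-ability student first
item_order = np.argsort(-responses.sum(axis=0))      # easiest item first
sorted_matrix = responses[student_order][:, item_order]

# With real test data, the easy-item/high-ability corner of the sorted matrix is
# mostly correct ("the known"), the hard-item/low-ability corner mostly incorrect
# ("the unknown"), and the band in between approximates each student's ZPD.
print(sorted_matrix)
```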
54
Summarising strengths and weaknesses
55
Year 3 Reading – The Known (What students can already do independently)
56
Year 3 Reading – The Unknown (What students are incapable of learning before prior concepts are taught)
57
Year 3 Reading – The Zone of Proximal Development (What students are capable of learning with the assistance of explicit instruction from the teacher [scaffolding])
59
The Writing Criteria Report
Choose “Writing Criteria Report”. Click on “Preview Report” to view the results.
60
The Writing Criteria Report
61
The Writing Task
62
The Writing Criteria Report
63
The Writing Marking Rubric
68
Interpreting the Writing Criteria Report In this school, about 20% of students received a score of “1” but the majority (nearly 60%) received a score of “2”. To improve, the teacher needs to move the 1s onto 2s, the 2s onto 3s, etc.
69
Assessment as Learning Students writing like a “1” could be shown examples of how they are currently writing (Dungaun, The casel, BMX, etc.) and shown examples of what is now expected of them to improve to a “2” (Living dead, Woodern box, etc.)
70
Another assessment as learning example Using the “Paragraphing” rubric and accompanying sample scripts, a student could be shown that their writing demonstrates no paragraphing. More importantly, the rubric shows what is expected next and gives examples that students could read to get an idea of what writing in paragraphs looks like.
71
The Student Response Report (Writing Test – by criteria)
Choose “Student Response Report”, then choose “Writing Test – by criteria”.
72
The Student Response Report (Writing)