Language Arts Day! October 24, 2011 Language Arts teachers are invited to gather at these sessions to study NeSA-Reading and NeSA-Writing data and use this information to consider how to continue to improve student achievement. In addition to the data analysis and NeSA updates, we will study recent research, pedagogy, and strategies directly related to language arts instruction.
Purposes
- Understand NeSA protocol, resources, and results
- Study NeSA results
- Use data to inform decisions for improving student achievement
- Study explicit instruction
- Share experiences, tips, ideas
Getting Started
- Introductions
- Norms
- Parking Lot
- Wikispace: http://esu6la.wikispaces.org
- Agenda
- Handouts & Copies
- Survey
2011 NeSA Testing
Think: What were the challenges of the NeSA-R or NeSA-W? How did (or might) these be overcome? What went well? Have you made any curricular adjustments?
Pair: Someone with similar responsibilities.
Share: 3-5 minutes.
Interaction Sequence
- Ask all students the question.
- Pause (3+ seconds).
- Put students on-the-clock: "You have 2 minutes to share your answer with your partner."
- Students share their thoughts with a partner. While they share, conference with 1 or 2 pairs: check student answers, probe, and provide answers when missing.
- Select student(s) to respond:
  1. Purposeful Selection: Call on students you have visited.
  2. Random Selection: Call on students so every student has an opportunity to be selected.
  3. Volunteer Selection: Allow volunteer responses.
(Sharer, Anastasio, & Perry, 2007, pp. 80-85)
NeSA Soup! DAC CAL DRC DRS AYP TOS PLD C4L
Understanding the NeSA-R & NeSA-W http://www.education.ne.gov/assessment/NeSA_Presentations.htm
Nebraska schools should use NeSA data to...
- Provide feedback to students, parents, and the community.
- Inform instructional decisions.
- Inform curriculum development and revision.
- Measure program success and effectiveness.
- Promote accountability to meet state and federal requirements.
NeSA is...
- A criterion-referenced summative test.
- A measurement of the revised Nebraska Reading Standards, specific to vocabulary and comprehension.
- A tool including 45 to 50 multiple-choice items.
- A test administered to students online OR paper/pencil.
What are Tables of Specifications?
What are Performance Level Descriptors?
NeSA...
- Produces a raw score that converts to a scale score of 0-200.
- Allows students to be classified into one of three categories: Below the Standards, Meets the Standards, Exceeds the Standards.
- Provides comparability across Nebraska school buildings and districts.
~NeSA Terminology~
SCALE SCORE – a transformation of the raw score a student earned on NeSA.

Performance Level        Reading Scale-Score Range
Exceeds the Standards    135-200
Meets the Standards      85-134
Below the Standards      84 and below
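To make the cut points concrete, here is a minimal sketch (in Python; not part of the original presentation) that classifies a NeSA-R scale score using the ranges above.

```python
def performance_level(scale_score: int) -> str:
    """Classify a NeSA-R scale score (0-200) using the published cut scores."""
    if scale_score >= 135:
        return "Exceeds the Standards"
    if scale_score >= 85:
        return "Meets the Standards"
    return "Below the Standards"

# A scale score of 126 falls in the 85-134 range.
print(performance_level(126))  # Meets the Standards
```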
How are performance levels determined?
Cut score processes:
- Contrasting Group Method – 400+ teachers
- Bookmark Method – 100+ teachers
State Board of Education review:
- Examined results of both processes
- Examined NAEP and ACT results for Nebraska
- Made decisions within the recommended range at a public meeting
~NeSA Terminology~
RAW SCORE – the number of items a student answers "right" on NeSA-R.

On NeSA Reports:
Content Area    Points Possible    Points Earned    Student's Scale Score
Reading         42                 21                126
Mathematics     42                 21                127

On Conversion Chart:
Raw Score    Scale Score    Performance Level
25           200            Exceeds
24           167            Exceeds
23           148            Exceeds
22           135            Exceeds
21           126            Meets
20           118            Meets
19           111            Meets
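As a quick illustration of reading such a conversion chart, here is a minimal sketch (Python; illustrative only, not part of the original presentation). The dictionary holds only the excerpt shown above, not a full NeSA conversion chart.

```python
# Excerpt of the raw-score-to-scale-score conversion chart shown above.
CONVERSION_CHART = {25: 200, 24: 167, 23: 148, 22: 135, 21: 126, 20: 118, 19: 111}

def report_line(raw_score: int) -> str:
    """Convert a raw score to its scale score and performance level (cuts: 85, 135)."""
    scale = CONVERSION_CHART[raw_score]
    level = "Exceeds" if scale >= 135 else "Meets" if scale >= 85 else "Below"
    return f"raw {raw_score} -> scale {scale} ({level})"

print(report_line(21))  # raw 21 -> scale 126 (Meets)
```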
~NeSA Terminology~
What is the difference between a raw score and a scale score?
What is a raw score? A raw score is the number of correct items. Raw scores have typically been used in classrooms as percentages: 18/20 = 90% correct.
~NeSA Terminology~
What is a scale score? A scale score is a "transformation" of the number of items answered correctly to a score that can be more easily interpreted between tests and over time. The scale score maintains the rank order of students (i.e., a student who answers more items correctly gets a higher scale score). For NeSA, we selected 0-200 and will use it for all NeSA tests, including writing.
~NeSA Terminology~
Why convert raw scores to scale scores? Raw scores are converted to scale scores in order to compare scores from year to year. Raw scores should not be compared over time because items vary in difficulty level. Additionally, raw scores should not be compared across different content area tests. Scale scores add stability to data collected over time that raw scores do not provide.
~NeSA Terminology~
On score reports, why is the SCALE SCORE CONVERTED TO PERCENTILE RANK?
The percentile rank was placed on the score reports because our Technical Advisory Committee felt that parents would want to know their child's position in relation to other test takers. A percentile rank of 84 means the child scored better than 84% of the students who took the test that year.
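For anyone who wants to see the arithmetic behind that last sentence, here is a minimal sketch (Python; illustrative only, the actual NeSA reporting procedure may handle ties and rounding differently, and the scores below are made up).

```python
def percentile_rank(student_score: int, all_scores: list[int]) -> float:
    """Percent of test takers whose scores fall below the student's score."""
    below = sum(1 for score in all_scores if score < student_score)
    return 100 * below / len(all_scores)

# Toy data: five hypothetical scale scores, one student scoring 126.
scores = [90, 101, 118, 126, 150]
print(percentile_rank(126, scores))  # 60.0
```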
What does the Scale Score Look Like in Action? Although the test items are comparable, they are different.
2010 Reading Grade 8
Raw Score    Scale Score    Level
43           135            Exceeds
42           128            Meets
...
32           85             Meets
31           81             Below

2011 Reading Grade 8
Raw Score    Scale Score    Level
43           145            Exceeds
42           139            Exceeds
41           133            Meets
...
32           96             Meets
31           92             Meets
30           89             Meets
29           86             Meets
28           83             Below
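A minimal sketch (Python; illustrative only, using just the rows that pair cleanly in the tables above) of why the same raw score should not be compared across years:

```python
# Raw-to-scale excerpts for Grade 8 Reading, copied from the tables above.
grade8_reading = {
    2010: {43: 135, 42: 128, 32: 85, 31: 81},
    2011: {43: 145, 42: 139, 32: 96, 31: 92},
}

def level(scale_score: int) -> str:
    """Performance level from the published cut scores (85 and 135)."""
    return "Exceeds" if scale_score >= 135 else "Meets" if scale_score >= 85 else "Below"

# The same raw score of 42 converts to different scale scores (and levels) each year.
for year in (2010, 2011):
    scale = grade8_reading[year][42]
    print(year, scale, level(scale))
# 2010 128 Meets
# 2011 139 Exceeds
```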
Scale Score
Think: What are the key points about scale scores that you would share with a parent who has questions about the NeSA-R?
Ink: Write 2-3 points.
Link: Find a partner; give one, and get one. Repeat.
Name That Concept! AKA Talk a Mile a Minute and Password
NeSA Related Terms
- Table of Specifications (TOS)
- District Assessment Contact (DAC)
- Scale Score
- Adequate Yearly Progress (AYP)
- Percentile Rank
NeSA Related Terms
- Check 4 Learning (C4L)
- Raw Score
- Proficiency Level Descriptions (PLDs)
- Data Reporting System (DRS)
- Criterion-Referenced Assessment
NeSA Reports What can we learn from this report? Do we have other data to support these results? What are the implications of this report?
NeSA Reports
- Individual Student Report
- School Student Roster
- School Indicator Summary
- School Performance Level Summary
- District Reading Indicator Summary
- District Performance Level Summary
- District Report of School Performance
Individual Student Report
School Student Roster
School Indicator Summary
School Performance Level Summary
District Reading Indicator Summary
District Performance Level Summary
District Report of School Performance
Step 1: Define the Situation What can we learn from each report? What is the data telling us (strengths and concerns)?
Step 2: Establish hypotheses Why are we getting these results?
Step 3: Verify / Refute Hypotheses Do we have other data to support these results?
Step 4: Create the Action Plan How can we use this NeSA data? What is the goal? (How much change is expected and by when?) What will be done to reach the goal(s), and how will progress toward goal(s) be measured?
Curriculum Alignment
Examine PLDs and Tables of Specification.
Are the tested indicators in our curriculum? Where? When are they taught? How are they instructed? At what DOK (Depth of Knowledge) level? By whom?
Instructional Effectiveness Examine PLDs and Tables of Specification. Do our students have opportunity to learn (and practice) the tested indicators? Is our instruction efficient and effective? How are students performing on the indicators on a day-to-day basis? Are we assessing them locally to find out what they know and can do?
Test Preparation Have our students used practice tests? Are our students familiar with the testing tools? Are we familiar with appropriate accommodations? Have we applied them?
2012 NeSA-R Testing
- Grades 3-8, 11
- Standardized, secure testing procedures
- Paper and pencil or online (already submitted)
- Two independent sessions
- Untimed
- Cuts remain the same (0-84, 85-134, 135-200)
- What can you do or not do? (pages 19-21 & 30-34 in SAA-8)
- March 26 - May 4, 2012
2012 NeSA-W Testing: Grade 4
- narrative
- two 40-minute sessions (timed)
- #2 pencil
- holistically scored in 2012 (analytically scored in 2013)
- same cut scores as previous years (new in 2013)
- January 23 – February 10, 2012
2012 NeSA-W Testing: Grades 8 & 11
- descriptive (8); persuasive (11)
- online test administration
- one ~90-minute session (untimed; 2011 avg. = 45-65 min.)
- analytically scored (composite + 4 weighted domains)
- online dictionary and thesaurus; no spell-check
- 6,000 character limit (approx. 3 pages)
- January 23 – February 10, 2012
(SAA-8, pp. 45-51)
2012 NeSA-W Testing: Grades 8 & 11 (cont.)
- software update (wrap, spaces, dictionary)
- no "tab" (advise students to use 3-5 spaces)
- font size, spacing, margins do NOT affect scoring
- new cut score set in April 2012
- composite score converted to scale of 0-70
- can print practice & operational tests
- January 23 – February 10, 2012
(SAA-8, pp. 45-51)
The way I see it…
We have 3 years of data to consider.
We can do some things right now…
- test procedures, format
- general test-taking skills
- motivation
- initial analysis, hypotheses, instructional change
- curriculum alignment
- effective instruction
…and we need to have a long-term, sustainable approach:
- analysis of trends
- hypotheses, instructional change, study results, etc. (PDSA cycle)
- diagnosis/intervention plan
2011 Statewide Results

Grade    Composite % Met    Avg. Scale Score    2010 AYP Goal    2011 AYP Goal
3        70.95%             101                 67%              78%
4        75.39%             104                 67%              78%
5        70.01%             101                 67%              78%
6        73.72%             101                 67%              78%
2011 Statewide Results (cont.)

Grade    Composite % Met    Avg. Scale Score    2009 AYP Goal    2010 AYP Goal    2011 AYP Goal
7        73.87%             110                 60%              67%              78%
8        71.43%             106                 60%              70%              80%
11       67.32%             102                 57%              68%              79%
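As one quick way to read these two tables, here is a minimal sketch (Python; illustrative only, and actual AYP determinations involve more than this simple comparison) that shows the gap between each grade's 2011 composite percent proficient and its 2011 AYP goal.

```python
# 2011 composite % met and 2011 AYP goal, copied from the statewide tables above.
results_2011 = {
    3: (70.95, 78), 4: (75.39, 78), 5: (70.01, 78), 6: (73.72, 78),
    7: (73.87, 78), 8: (71.43, 80), 11: (67.32, 79),
}

for grade, (percent_met, goal) in results_2011.items():
    gap = goal - percent_met
    print(f"Grade {grade}: {percent_met}% met vs. {goal}% goal (gap of {gap:.2f} points)")
```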
Improving Adolescent Literacy…
http://ies.ed.gov/ncee/wwc/PracticeGuide.aspx?sid=8
Explicit Instruction http://explicitinstruction.org/?page_id=80
4 A's Text Protocol
- What assumptions does the author of the text hold?
- What do you agree with in the text?
- What do you want to argue with in the text?
- To what parts of the text do you aspire?
Resource & Idea Sharing What fabulous resources do you depend on for your professional practice?
Summary of Learning When my administrator asks about today, I will say that I learned… The most important / relevant thing I learned or was reminded of today is…
Future Sessions
January 31
June 18
Please complete the evaluation! Start at the esu6la wikispace on the Language Arts Days page (toward the bottom of today's agenda).
Future Sessions: January 31 & June 18