Verifying Yearend Accountability Data
by Angie Crandall, MVECA
April 12, 2006, OEDSA
Presentation Goal
To provide information and insight about reading and interpreting accountability-related ODE/EMIS data verification reports.
NOTE: ODE/EMIS changes might occur in the future that could affect the content of this presentation. The ODE/EMIS web site and official ODE communications will contain the most accurate, updated information.
Objectives
To learn about:
- Resources that explain the rules for the Ohio Accountability System, and some important Accountability concepts;
- Reading and interpreting Yearend ODE/EMIS reports used to verify EMIS data that will appear on the Local Report Card.
Ohio Accountability System Rules
Why do I need to know the rules?
Law and policies are implemented in calculations performed on data reported by districts in EMIS. Knowing these rules helps to identify reporting issues and errors.
Who is responsible for doing this?
A tremendous amount of data is reported in EMIS; is it possible for one person to track all of it? It is important for someone in your district to know these rules and monitor for changes, since the rules can be adjusted based upon changes in law or policy, or upon clarification. For example, the full academic year (FAY) is now defined as continuous enrollment from October count week through March 19th; this is also the new definition of the Majority of Attendance IRN, so the FAY calculation will be done by local software systems and sent to EMIS as the MOA IRN.
Some EMIS coordinators would say, "I couldn't do my job without this information." Other EMIS coordinators might say, "This is not my responsibility." The assignment of responsibilities is determined within your district. If this has not been explicitly discussed, it may be helpful to have that conversation to make sure you know what is expected in your district.
Ohio Accountability System Rules (cont'd)
- ODE Office of Accountability web site
- LRC/Accountability information by year
Worth book-marking! Can be used to research information about the Ohio Accountability System from prior years, learn what is happening in the current year, and see what is planned for future years.
Ohio Accountability System Rules (cont'd)
Ohio Accountability System Rules (cont'd)
What do I need to know about the rules? For example, it would be helpful to know:
- What the general terms mean (e.g., what is "AYP"? What is "full academic year"?);
- Which subject/grade level tests are included: as indicators; in the performance index; and in AYP determinations;
- Whether there are any changes to previously established calculations or "business rules".
Ohio Accountability System Rules (cont'd)
- Which subject/grade level tests are included: as indicators; in the performance index; in AYP determinations;
- How district ratings are determined;
- Changes to previously established calculations or "business rules".
Ohio Accountability System Rules (cont'd)
Changes in Accountability System business rules/calculations can be found at:
Some examples of changes include:
- MR/DD students' scores will count in the resident district totals.
- The full academic year (FAY) definition is continuous enrollment from October count week through March 19th. This is the new definition of the "Majority of Attendance IRN" (MOA IRN), so the FAY calculation will be done by local software systems and sent to EMIS as the MOA IRN.
Other important changes related to the OGT and LRC calculations can be found by following the above link.
Testing Rules & Resources
- Ohio Statewide Testing Program Rules Book
- Monthly communications from the Offices of Curriculum, Instruction, and Assessment
- Achievement & OGT Test Score Conversion Tables (see Statistical Summaries)
EMIS Resources for 2005-06
- FY2006 EMIS manual
- FY2006 EMIS changes
- LRC/Accountability prototypes
- Testing Record Valid Combinations
Watch the EMIS web site for updated LRC and Accountability Report Reference Guides. (These are excellent resources!) Also, watch for EMIS Newsflashes.
ODE Accountability-related Reports
The following three ODE-generated accountability reports, produced at yearend, are designed to be used together:
- Accountability Workbooks
- LRC Workbooks
- Where Kids Count (WKC) files
How can I use the Accountability workbooks?
To identify, for districts/community schools and buildings:
- The number of indicators made;
- The performance index score;
- Whether or not adequate yearly progress was made; and
- The building/district's report card designation.
Design of the Accountability Workbook
This report provides an overview of district/building accountability status and statistics. For example:

Worksheet            | Information provided
Statewide_Indicators | Which of the state indicators has the district made?
AYP_Summary          | Did the district make AYP?
AYP_Proficiency      | What is the AYP % proficient calculation for all students (combined grades)? For subgroups?
How can I use the LRC workbooks?
They can be used to:
- Verify student counts on statewide tests;
- Validate the performance level counts;
- Pinpoint which student test records need to be verified individually; and
- Verify data that will be publicly released.
NOTE: Most of the information on the LRC workbook that does not appear on the Report Card will still be available to the public on the ODE web site.
Design of the LRC Workbook
The LRC Workbook contains disaggregated data that appear on the Accountability Report, and other information that will appear on the Local Report Card. For example:

Worksheet              | Information provided
LRC-Proficiency#       | What are the specific counts of students, by performance level on each test, that are included in the % Proficient calculations?
LRC-Grad_Withdrawal(D) | What was the graduation rate? How many dropouts are we reporting in grades 9-12 during the reporting year?
How can I use the WKC files?
This file includes student-level data that can be used to verify test data for students included in the % proficient calculations on the Accountability Worksheet. Specifically, to verify student results included in:
- The "% proficient" calculation for determining whether the building/district made AYP (for all students and by subgroup);
- The "% proficient" calculation for state indicators; and
- The attendance rate.
More about the WKC File
This is a district-level file that consists of several columns of data, including:
- Student identifying information;
- Subgroup membership;
- Where-kids-count information;
- Assessment information (by subject, for all tested subjects); and
- District Alternate Assessment Cap information (included or not).
More about the WKC File (cont'd)
Before using the WKC file to verify data, look to see if any students were not included in calculations due to the District Alternate Assessment Cap. This is column 4 on the Acct-Statewide_Indicators(D) tab in the District Accountability Workbook. If the district did not exceed the alternate assessment cap, the columns in the WKC file pertaining to the state and federal caps should be "N" (no).
More about the WKC File (cont'd)
The WKC files include more detailed information than the WKC reports (e.g., performance levels and scores). They can also be imported into Excel, or another program, for detailed verification/analysis. Information included:
- Data reported by your district during the current school year, for students who took one or more of the statewide subject/grade level tests, and for students in untested grade levels (for the attendance rate).
- Data reported by other districts that are educating residents of your district through a special education cooperative agreement.
More about the WKC File (cont'd)
It is helpful to know which students are included in each calculation. An important filter is "attending/home status".
NOTE: It is possible for students who are NOT included in the "% proficient" calculation to be included in the WKC file. For example, students with attending/home status "2E" (open-enrollment students educated elsewhere) may be included in the WKC file for a district. Those students would NOT be included in the "% proficient" calculation, but would still be included in the file.
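Once the WKC file is loaded into a spreadsheet or script, this filter is a one-liner. The sketch below is a minimal Python illustration; the column name, the SSID values, and the assumption that "2E" is the only status to exclude are all hypothetical, not the official file layout.

```python
# Illustrative WKC rows; field names and which statuses are excluded from
# the "% proficient" calculation are assumptions, not the official layout.
wkc_rows = [
    {"ssid": "AB1234567", "attend_home_status": "*", "math_level": "Proficient"},
    {"ssid": "CD2345678", "attend_home_status": "2E", "math_level": "Advanced"},
]

# 2E = open-enrollment student educated elsewhere: present in the file,
# but not part of this district's "% proficient" calculation.
EXCLUDED_STATUSES = {"2E"}

in_calculation = [
    r for r in wkc_rows if r["attend_home_status"] not in EXCLUDED_STATUSES
]
```

The point of the filter is simply to avoid chasing "missing" students who are correctly present in the file but correctly absent from the calculation.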
More about the WKC File (cont'd)
For 2006, because of changes in the full academic year definition, there will be fewer columns in this file. The full academic year will be based upon the Majority of Attendance IRN. Specifically, the following columns will no longer be included in the file:
- March_IRN
- Oct_IRN
More about the WKC File (cont'd)
ODE uses the data reported for each student to determine where a student counts, using the WKC business rules and the Full Academic Year criteria. Errors can happen when entering, loading, and/or reporting data, so it is very important for someone close to the data to verify its accuracy. It is important to understand what each data element means, and how it is being used, to determine whether or not the data are reported accurately.
Example #1
Let's say a district did not meet the standard for the 7th grade math indicator. What can they do?
1. Look at the state indicator data.
Check to see if the district met the indicator. Where to look:
- Report: District Accountability
- Tab: Acct-Statewide_Indicators(D)
- Row: 7th Grade Math Indicator (18th row)
- Column(s): 5-7
2. Review the data behind the state indicators.
If the indicator was missed, check to see by how many students. (Look at columns 1 & 2 in the same row, on the same worksheet.)
3. Verify whether the "District Alternate Assessment Cap" has been exceeded.
Look at columns 3 & 4 on the Acct-Statewide_Indicators(D) tab in the District Accountability Workbook. If the district exceeds the cap, ODE excludes students from the number of students considered to be at or above the proficient level, until the district is at or below the cap. If these are "0", the district did not exceed the cap.
4. Find individual 7th grade Math Achievement records.
Filter the data in the WKC file to find students who took the 7th grade math achievement test.
5. Look for records with scores below the proficient level.
Sort by "Math_Level", look for students who scored below the proficient level (Basic or Below-basic), and check:
- For students who counted in the district and/or a building: were they really enrolled in the district for a "Full Academic Year"? Maybe one or more of these students had breaks in enrollment between the end of October count week and March 19th.
- Were scores reported accurately? For example, check to see if there are students you thought should have passed but did not, and verify the results.
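Steps 4 and 5 can be sketched in a few lines of Python, whether the file is worked in Excel or in a script. The rows, the grade and level values, and the column names below are illustrative assumptions, not ODE's documented WKC layout.

```python
# Illustrative WKC rows; field names and values are assumptions, not the
# official ODE file layout.
wkc_rows = [
    {"ssid": "AA1111111", "grade": "07", "math_level": "Accelerated"},
    {"ssid": "BB2222222", "grade": "07", "math_level": "Basic"},
    {"ssid": "CC3333333", "grade": "08", "math_level": "Proficient"},
    {"ssid": "DD4444444", "grade": "07", "math_level": "Below-basic"},
]

BELOW_PROFICIENT = {"Basic", "Below-basic"}

# Step 4: filter to students who took the 7th grade math test.
seventh_math = [r for r in wkc_rows if r["grade"] == "07"]

# Step 5: sort by performance level, then pull the below-proficient records
# whose enrollment (FAY) and reported scores should be verified first.
seventh_math.sort(key=lambda r: r["math_level"])
to_verify = [r for r in seventh_math if r["math_level"] in BELOW_PROFICIENT]
```

The `to_verify` list is the short worklist for the two manual checks above: enrollment breaks between October count week and March 19th, and scores that look wrong for a particular student.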
6. Look for records with scores at or above the proficient level.
For students scoring at the Proficient, Accelerated, or Advanced levels, find students in the "State Total", and check:
- Should any of those students be counting at the district level? Specifically, are there students who are only counted at the state level, who are reported as not meeting the "full academic year" criteria (MOA IRN=******), but who were continuously enrolled in the district from the end of October count week through March 19th?
- Are there students not being counted at your district because the student status or attending/home IRN indicator was reported inaccurately for the student's situation?
- Are there students missing from the WKC file because there are fatal errors on the aggregation reports?
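The first check above can also be run as a rough automated screen. The sketch assumes, hypothetically, that a masked MOA IRN appears as "******" in the file; the rows and field names are illustrative.

```python
# Illustrative rows; field names and the "******" masking convention are
# assumptions based on this slide, not a documented file format.
wkc_rows = [
    {"ssid": "EE5555555", "math_level": "Advanced", "moa_irn": "123456"},
    {"ssid": "FF6666666", "math_level": "Proficient", "moa_irn": "******"},
    {"ssid": "GG7777777", "math_level": "Basic", "moa_irn": "******"},
]

AT_OR_ABOVE = {"Proficient", "Accelerated", "Advanced"}

# Passing students counted only at the state level (no MOA IRN): if any were
# actually enrolled continuously from October count week through March 19th,
# their enrollment records should be re-checked.
candidates = [
    r for r in wkc_rows
    if r["math_level"] in AT_OR_ABOVE and r["moa_irn"] == "******"
]
```

Each record in `candidates` represents a passing score the district may be losing credit for, so these are worth checking against local enrollment records first.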
Other Places to Verify Data (in Addition to the WKC File)
Are there students who took the test for whom results are not being reported? Make sure students reported in a "tested grade level":
- Have a test record reported; and
- Have the information on the testing record reported accurately (e.g., the appropriate "Required Test Type").
NOTE: The ODE/EMIS Team has published a list of valid testing combinations that can be reported to EMIS. If you run into a situation that you think is being reported accurately, but the combination is not on the list, contact your designated ITC to see if there is another way to code the situation.
Other Places to Verify Data (cont'd)
Check with your EMIS coordinator to see if there are errors appearing on the aggregation reports for a student because s/he does not have a testing record. ODE also typically sends a file with a list of SSIDs for which a test record was expected but not received. (It is really important to review that list, because there could be funding implications.) If that list includes any students that you think should not be required to have a testing record reported, verify that you are reporting them accurately. If you are reporting them according to EMIS guidelines, but they are still appearing, inform your ITC.
Example #2
Now, let's say the district missed AYP in math. What can they do?
1. Check the AYP Summary.
In the District Accountability Workbook, look at the Acct-AYP_Summary(D) tab, columns O-S. Let's say AYP was "Not Met" in the Math "% proficient".
2. Identify areas where AYP was "Not Met".
Look at the AYP Summary table on the Acct-AYP_Summary(D) tab, columns A-M. In order for a district to meet AYP, all students (as a group) and all subgroups that are included in the district calculation must be at or above the AYP goals. This means that if "Not Met" appears anywhere in this table, the district has failed to meet AYP.
3. Identify which subgroup(s) did not meet AYP.
In this case, let's say the district did not make AYP in Math because the "Economically disadvantaged" subgroup did not make AYP. Look more closely at that subgroup to see how close that group is to meeting the target.
4. Identify which subgroups are closest to meeting AYP for combined grade levels.
Look at the Math % Proficient results on the "Acct-AYP_Proficiency(D)" tab. Let's say that group was 2 percentage points away from meeting AYP.
5. Examine subgroup results by grade level.
To identify where to start verifying that data are reported accurately, we need to look more closely at how students in that subgroup performed, by grade level, on the Math tests. We are trying to find out: "Which grade levels did not meet the % Proficient target in Math?"
5. Examine subgroup results by grade level. (cont'd)
Look at the Math results for the Economically Disadvantaged subgroup on the Acct-AYP_ProficiencyDetail(D) tab in the District Accountability Workbook, to identify which grade levels did not meet the Current Year Target % Proficient or the 2-Year Average Target % Proficient.
6. Review subgroup results within grade level, by test type/accommodations.
Next, you could look at the "LRC-Proficiency#(D)" worksheet in the District LRC Workbook to find out the performance levels achieved by students in the "economically disadvantaged" subgroup, by type of test (standard or alternate assessment). This tells you where student performance appears to be lowest. (These are the data to verify, using the WKC files.) This step is optional.
7. Use the WKC files to verify individual subgroup records.
We can use the WKC file (in Excel) to verify the accuracy of the 8th grade Math results reported in EMIS for students in the "economically disadvantaged" subgroup.
8. Use the WKC files to verify individual records for results below the standard.
For students who score at the Basic or Below-basic levels, check:
- Were students really enrolled in the building for a "Full Academic Year"? Maybe one or more of these students had breaks in enrollment between the end of October count week and March 19th.
- Were scores reported accurately? For example, check to see if there are students you thought should have passed but did not, and verify the results.
- Is the disadvantagement element correctly reported? Has this information been updated for yearend?
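Steps 7 and 8 amount to filtering the file to the subgroup and recomputing its % proficient for comparison against the workbook. The sketch below does this in Python; the rows, the `econ_disadvantaged` flag, and the level names are illustrative assumptions, not the official WKC layout.

```python
# Illustrative rows; the econ_disadvantaged flag and level names are
# assumptions, not the official WKC file format.
wkc_rows = [
    {"ssid": "HH8888888", "grade": "08", "econ_disadvantaged": "Y", "math_level": "Basic"},
    {"ssid": "JJ9999999", "grade": "08", "econ_disadvantaged": "N", "math_level": "Below-basic"},
    {"ssid": "KK1111111", "grade": "08", "econ_disadvantaged": "Y", "math_level": "Proficient"},
    {"ssid": "LL2222222", "grade": "08", "econ_disadvantaged": "Y", "math_level": "Advanced"},
    {"ssid": "MM3333333", "grade": "08", "econ_disadvantaged": "Y", "math_level": "Below-basic"},
]

AT_OR_ABOVE = {"Proficient", "Accelerated", "Advanced"}

# 8th grade math records for the "economically disadvantaged" subgroup.
subgroup = [
    r for r in wkc_rows
    if r["grade"] == "08" and r["econ_disadvantaged"] == "Y"
]

# Recompute the subgroup's % proficient to compare against the workbook.
pct_proficient = 100.0 * sum(
    r["math_level"] in AT_OR_ABOVE for r in subgroup
) / len(subgroup)
```

If the recomputed figure does not match the workbook, the difference points at specific records (enrollment, score, or disadvantagement flag) to verify by hand.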
9. Use the WKC files to verify individual records for results at or above the standard.
For students scoring at the Proficient, Accelerated, or Advanced levels, check:
- Are there students who currently count at the state level who should be counted at the district/building level? Specifically, are there students who are only counted at the state or district level, who are reported as not meeting the "full academic year" criteria, but who were continuously enrolled in the district from the end of October count week through March 19th?
- Are there other students who were not reported as "economically disadvantaged" that should have been?
Other Items to Verify in Addition to Analyzing WKC Files
The "% proficient" subgroups are only evaluated for AYP if the group size exceeds 30 (45 for students with disabilities). Check to see if any subgroups were evaluated for AYP at the district, or at buildings, where the subgroup size should not be that large.
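The minimum-N rule is easy to encode as a sanity check. The function below follows this slide's phrasing ("exceeds 30; 45 for students with disabilities"); the exact comparison and the official subgroup names should be confirmed against ODE's published business rules.

```python
def subgroup_evaluated(subgroup_name, size):
    """Rough screen: is this subgroup large enough to be evaluated for AYP,
    per the thresholds stated on this slide?"""
    threshold = 45 if subgroup_name == "Students with Disabilities" else 30
    return size > threshold

# A 31-student economically disadvantaged group is evaluated;
# a 40-student students-with-disabilities group is not.
```

Running such a check against each building's subgroup counts flags cases where an unexpectedly large (or small) subgroup was evaluated, which usually traces back to a reporting error in the subgroup flags.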
Other Items to Verify (cont'd)
Were students reported as enrolled in the correct buildings? Were there any transfers between buildings not reflected in the data? This could affect AYP and state test indicators for buildings. For example, if students transferred between buildings between October count week and March 19th, then the student would not meet the full academic year criteria at the building level, and would not be included in building-level statistics subject to FAY.
Other Items to Verify (cont'd)
Have buildings correctly identified students who are court-placed or parent-placed into institutions? These students are reported with student status = P or T. (These students only count at the state level.) If court-placement information is not reported by buildings, is there a process for sharing this information with the EMIS coordinator, or with someone at the district level who can update the student status?
Other Items to Verify (cont'd)
Have students participating in the autism scholarship been identified and reported correctly? These students do not count in the building/district for accountability purposes.
Other Items to Verify (cont'd)
Verify the accuracy of data reported by districts educating resident students through special education cooperative agreements. The data reported by the educating district count in the "% proficient" statistic at the resident/sending district, if the students meet the "full academic year" criteria.
Other Items to Verify (cont'd)
Have assessment results been reported accurately for students attending an MR/DD facility? MR/DD students' scores will now count in the resident district totals. ODE indicates that these students should be included when calculating the 1% cap for alternate assessments allowed to count as proficient in the accountability calculations.
Source:
Other Items to Verify (cont'd)
Have you verified the WKC files to make sure OGT results are accounted for appropriately? Information about calculations that include OGT results can be found at:
Other Items to Verify (cont'd)
The Accountability Workbook includes the graduation rate. This is based upon information from previous reporting periods and cannot be changed. Remember to verify the information on the Graduation Rate verification report, which will affect the graduation rate that appears on the 2007 LRC.
NOTE: The information on the Graduation Rate data verification report will also affect the LRCs released in 2008, 2009, and 2010.
Other Items to Verify (cont'd)
Attendance Rate: media and others are increasingly scrutinizing attendance rate data. Districts may want to verify that the following information is reported accurately:
- # of Attendance Days
- # of Excused Absence Days
- # of Unexcused Absence Days
If there have been any significant changes in these numbers from last year, be ready to explain why.
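As a rough cross-check, the attendance rate can be recomputed from these three figures. The sketch below assumes the usual definition (attendance days divided by attendance plus all absence days, as a percentage) and made-up district totals; confirm the exact formula against the EMIS documentation before relying on it.

```python
# Illustrative district-level totals (made-up numbers).
attendance_days = 162_000.0
excused_absence_days = 5_000.0
unexcused_absence_days = 3_000.0

# Assumed formula: attendance / (attendance + all absences), as a percent.
total_days = attendance_days + excused_absence_days + unexcused_absence_days
attendance_rate = 100.0 * attendance_days / total_days
```

A recomputed rate that differs noticeably from the reported one, or that swings sharply from last year's, is exactly the kind of change the district should be ready to explain.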
Other Accountability-related Information
The tests used to determine whether or not a district/building has made AYP have changed from FY2005 to FY2006:

Tests included | Reading                 | Math
FY2005         | Grades 3, 6, and OGT    | Grades 4, 6, and OGT
FY2006         | Grades 3, 4, 5, 8, & 10 | Grades 3, 7, 8, & 10
Accountability EMIS Data Concepts
Because of the way Weighted AYP Targets are calculated, the targets could change during the current reporting period as data are corrected and resubmitted. This is because, as additional assessment records are reported, the distribution of students taking each subject/grade level test may change.
Accountability EMIS Data Concepts (cont'd)
One student could be included in multiple subgroups. For example: if a student is LEP, on an IEP, reported as Economically Disadvantaged, and reported as Multi-Racial, that student's test results will be included in multiple subgroups.
Reading the LRC/Accountability Report Explanations (Key)
Preparing Questions for ODE
In order to have questions ready, you might:
- Review prototypes before tomorrow;
- Review your local report cards and/or EMIS accountability reports from last year; and
- Maybe even talk with the test coordinator or accountability manager to see if they have EMIS-related questions, based upon upcoming changes.
NOTE: For district-specific questions, you might seek individual assistance before/after presentations. If you don't have a question ready by tomorrow, you can take these same steps prior to the Spring OAEP conference.
Preparing for FY2007
This year is an opportunity to get ready for next year, when 3-8 achievement test administration moves to May. After FY2006 Yearend reporting closes, districts and Information Technology Centers (ITCs) could evaluate how things went, specifically:
- What worked;
- What didn't; and
- Where changes might be needed.
Do you have any questions?
Thank you for your time and attention.