1
2013 STAR Interpreting and Using Results
August 7, 2013 Webcast. The webcast starts at 9 a.m.
2
Objectives Workshop participants will be able to:
Describe the purposes of STAR reports
Interpret STAR results
Explain key statistics
Compare and contrast types of reports
Identify proper uses of reports
August 2013 Post-Test Workshop
3
Agenda
What’s New?
Results and Statistical Analysis
Using Results
Summary and Internet Reports
Data CDs
Individual Student Reports
Early Assessment Program
August 2013 Post-Test Workshop
4
What’s New in 2013
Standards-based Tests in Spanish (STS) performance levels are now also reported for students in grades 8–11 who took the grade-level STS for RLA, the STS for Algebra I, and the STS for Geometry.
The score for the California Standards Test (CST) for Writing is no longer doubled; possible scores are 1, 2, 3, and 4.
Manual (M) 2 – 3 August 2013 Post-Test Workshop
5
What’s New in 2013
ELA scale scores for the grades 4 and 7 CSTs and the California Modified Assessment (CMA) are now based on the multiple-choice items only.
The reporting cluster “Writing Applications” is no longer part of the CST and CMA ELA cluster groups; instead, the writing score is provided as a standalone score called the “Writing Response Score.”
M 2 – 3 August 2013 Post-Test Workshop
6
What’s New in 2013
Enrollment and exit code data used to determine which students are counted as “continuously enrolled” for accountability purposes, previously collected for the California Basic Educational Data System (CBEDS), were not collected on STAR answer documents or in Pre-ID. Instead, these data were extracted from the California Longitudinal Pupil Achievement Data System (CALPADS).
Please refer to pages 2 and 3 of the 2013 Post-Test Guide for a list of all changes.
M 2 – 3 August 2013 Post-Test Workshop
7
Quiz Question 1
Which of these tests had scale scores reported for the first time in 2013?
CST for World History
CAPA for Science
CMA for Algebra I
STS for Algebra I
August 2013 Post-Test Workshop
8
Quiz Question 1
Which of these tests had scale scores reported for the first time in 2013?
CST for World History
CAPA for Science
CMA for Algebra I
STS for Algebra I
Answer: STS for Algebra I
August 2013 Post-Test Workshop
9
Purposes of STAR Reports
Results: Purposes of STAR Reports
Report progress toward proficiency on the state’s academic content standards
Identify where improvement is needed:
To help improve students’ achievement
To improve educational programs
Provide data for state and federal accountability programs
M 4 August 2013 Post-Test Workshop
10
Results: Performance Levels
State goal: All students score at proficient or above
CST, CMA, STS proficient: scale score of 350 or higher
CAPA proficient: scale score of 35 or higher
M 8 – 11 August 2013 Post-Test Workshop
11
Other Performance Levels
Results: Other Performance Levels
Advanced
Basic (cut score: CST 300, CMA 300, STS 300, CAPA 30)
Below basic
Far below basic
For each testing program, cut points for advanced and below basic vary by subject and grade.
M 8 – 11; Appendix B August 2013 Post-Test Workshop
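The logic of these cut scores can be shown with a small sketch (illustrative only): the proficient cuts (350, or 35 for CAPA) and the basic cuts (300, or 30 for CAPA) are taken from the slides above, while the advanced and below-basic cuts in the example call are hypothetical placeholders, since the real values vary by subject and grade (see Appendix B).

```python
# Illustrative sketch: classify a scale score into a STAR performance level.
# Proficient and basic cuts are from the slides; the advanced and below-basic
# cuts passed in the example are hypothetical (actual values vary by test).

def performance_level(scale_score, advanced_cut, proficient_cut, basic_cut, below_basic_cut):
    """Return the performance level for a single scale score."""
    if scale_score >= advanced_cut:
        return "advanced"
    if scale_score >= proficient_cut:
        return "proficient"
    if scale_score >= basic_cut:
        return "basic"
    if scale_score >= below_basic_cut:
        return "below basic"
    return "far below basic"

# Example: a CST score of 372 with made-up advanced (418) and below-basic (263) cuts.
print(performance_level(372, advanced_cut=418, proficient_cut=350,
                        basic_cut=300, below_basic_cut=263))  # -> "proficient"
```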
12
Results: Scale Scores
Scale scores allow the same score to mean the same thing across test versions within a grade and content area.
Scale scores account for differences in difficulty.
Scale score ranges by program:
CST, CMA, STS: 150–600 for each grade and subject
CAPA: 15–60 for each level and subject
M 8 – 11 August 2013 Post-Test Workshop
13
Results: Equating Psychometric procedure Adjusts for test difficulty
Additional information in the technical report on the CDE Web site M 8 August 2013 Post-Test Workshop
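As a minimal sketch of what equating means for reported scores: a raw (number-correct) score is converted to a scale score through a conversion table produced by the equating procedure, not by a simple percent-correct formula. The table below is entirely hypothetical; actual raw-to-scale conversions differ by test, grade, and year and are documented in the technical report.

```python
# Illustrative sketch: scale scores come from an equated raw-to-scale
# conversion table, not from percent correct. This table is hypothetical.

HYPOTHETICAL_CONVERSION = {  # raw score -> scale score (150-600 range)
    40: 325, 41: 332, 42: 339, 43: 346, 44: 350, 45: 357,
}

def to_scale_score(raw_score, conversion_table):
    """Look up the equated scale score for a raw (number-correct) score."""
    return conversion_table[raw_score]

print(to_scale_score(44, HYPOTHETICAL_CONVERSION))  # -> 350 (lowest proficient scale score for CSTs)
```

On a harder form of the same test, the same raw score would map to a somewhat higher scale score; that adjustment is what keeps 350 meaning proficient across forms and years.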
14
Results: Reporting Clusters (Content Area)
Three to six clusters for each subject
May be useful as indicators of individual or group strengths and weaknesses
But reporting clusters should be interpreted with caution:
Cluster scores are sub-scores. The number of items per cluster can vary considerably within a test, from as few as 4 (probability and statistics, grades 4 and 5) to as many as 28 (the earth’s energy, Earth Science). The reliabilities of these cluster scores range from .46 (probability and statistics, grade 4) to .84 (word analysis and vocabulary, ELA grade 3). Because of the small number of items in some clusters, it is questionable whether these sub-scores sample enough items from the domain of interest for the results to be generalizable.
Cluster scores are NOT equated, so the percent correct depends on the difficulty of the items; they cannot be compared across years. They can, however, be compared within a year. A student who gets only a few items correct on a sub-score with a fair number of items (probably not 4) might be encouraged to concentrate on that area.
M 9 – 11; Appendix A August 2013 Post-Test Workshop
15
Results: Reporting Clusters
Cautions
Cluster percent correct is available for the CSTs, CMA, and STS
Based on small numbers of items; therefore, may not be reliable or generalizable
NOT equated from year to year
Do not compare reporting cluster percent correct from year to year
M 9 – 11; Appendix A August 2013 Post-Test Workshop
16
Interpreting Reporting Clusters or Content Areas in the Same Year
Compare to percent-correct range of proficient students statewide M 9 – 11; Appendix A August 2013 Post-Test Workshop
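A minimal sketch of this comparison, assuming a made-up statewide proficient range: compute the student's cluster percent correct and place it relative to the average percent-correct range for students statewide who scored proficient on the total test (the real ranges are in Appendix A).

```python
# Illustrative sketch: compare a cluster percent correct to the statewide
# proficient range for that cluster. The 60-75% range here is a placeholder,
# not a published Appendix A value.

def cluster_flag(items_correct, items_in_cluster, proficient_low, proficient_high):
    pct = 100.0 * items_correct / items_in_cluster
    if pct < proficient_low:
        return pct, "below the statewide proficient range"
    if pct > proficient_high:
        return pct, "above the statewide proficient range"
    return pct, "within the statewide proficient range"

pct, note = cluster_flag(items_correct=9, items_in_cluster=16,
                         proficient_low=60, proficient_high=75)
print(f"{pct:.0f}% correct: {note}")  # -> "56% correct: below the statewide proficient range"
```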
17
2013 CST Reporting Clusters: Number of Questions and Average Percent Correct
The 2013 Post-Test Guide, Appendixes A and C, as posted on startest.org, will be finalized with complete data on August 15.
M Appendix A August 2013 Post-Test Workshop
18
Examples—Interpreting Reporting Clusters for the CST for Geometry
August 2013 Post-Test Workshop
19
Quiz Question 2 What is a scale score?
Percent correct of all questions
Mean percent correct of all questions
An adjustment of this year’s and last year’s raw scores to show changes
An adjustment of the raw score to account for differences in difficulty
August 2013 Post-Test Workshop
20
Quiz Question 2 What is a scale score?
Percent correct of all questions
Mean percent correct of all questions
An adjustment of this year’s and last year’s raw scores to show changes
An adjustment of the raw score to account for differences in difficulty
Answer: An adjustment of the raw score to account for differences in difficulty
August 2013 Post-Test Workshop
21
Using Results
For instructional decisions, in conjunction with other data
Used in Academic Performance Index (API) calculations, all grades and subjects: CST, CMA, CAPA
Used in adequate yearly progress (AYP) calculations, ELA and mathematics: CST grades 2–8; CMA grades 3–8; CAPA grades 2–8 and 10
API
CMA: ELA grades 3–9; mathematics grades 3–7 and Algebra I; science grades 5, 8, and 10
CSTs: ELA grades 2–11, including a writing assessment in grades 4 and 7; mathematics grades 2–7, plus general mathematics (grades 8 and 9 only), Algebra I, Geometry, Algebra II, Integrated Mathematics 1, 2, or 3, and the High School Summative Mathematics Test (students in grade 7 may take the Algebra I test if they completed an Algebra I course); history–social science: grade 8, World History (grades 9–11), U.S. History (grade 11); science: grades 5, 8, and 10, plus the following course-specific tests for grades 9–11: Biology/Life Sciences, Earth Science, Chemistry, Physics, Integrated/Coordinated Science 1, 2, 3, or 4
CMA: ELA grades 3–11; mathematics grades 3–11 (Algebra I for grades 7–11, Geometry for grades 8–11); science grades 5, 8, and 10
CAPA: ELA and mathematics, grades 2–11
AYP
CST: ELA grades 2–8; mathematics grades 2–7, plus Algebra I for grade 8 only
CMA: ELA grades 3–8; mathematics grades 3–7, plus Algebra I for grade 8 only
CAPA: ELA and mathematics, grades 2–8 and grade 10
M 4 August 2013 Post-Test Workshop
22
Year-to-Year Comparisons
Do Compare CSTs: Same Grade and Same Content Area
Mean scale score: same content and grade, varying years
Percent in each performance level: same content by grade across years, e.g., 2012 grade 10 ELA with 2013 grade 10 ELA
M 12 – 15 August 2013 Post-Test Workshop
23
Year-to-Year Comparisons
Do Compare CSTs: Percent Proficient and Advanced
Percentage of students scoring at PROFICIENT and above:
For a given grade and subject, e.g., percent proficient and above for grade 3 mathematics in 2012 and 2013
For a given subject and aggregated grades, e.g., percent proficient and above for grades 2–6 mathematics in 2012 and 2013
Across grades and a subject, e.g., percent proficient and above in all courses and all grades
M 12 – 15 August 2013 Post-Test Workshop
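A minimal sketch of this calculation with hypothetical performance-level counts for one grade and subject in two years: the percent proficient and above is the advanced plus proficient counts divided by all valid scores, and the year-to-year change is simply the difference between the two percentages.

```python
# Illustrative sketch: percent proficient and above for the same grade and
# subject in two years, and the difference. Counts are hypothetical.

def pct_proficient_or_above(level_counts):
    """level_counts: dict mapping performance level -> number of valid scores."""
    total = sum(level_counts.values())
    proficient_plus = level_counts.get("advanced", 0) + level_counts.get("proficient", 0)
    return 100.0 * proficient_plus / total

grade3_math_2012 = {"advanced": 120, "proficient": 180, "basic": 150,
                    "below basic": 80, "far below basic": 70}
grade3_math_2013 = {"advanced": 140, "proficient": 190, "basic": 140,
                    "below basic": 75, "far below basic": 55}

p12 = pct_proficient_or_above(grade3_math_2012)
p13 = pct_proficient_or_above(grade3_math_2013)
print(f"2012: {p12:.0f}%, 2013: {p13:.0f}%, change: {p13 - p12:+.0f} points")
# -> "2012: 50%, 2013: 55%, change: +5 points"
```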
24
Year-to-Year Comparisons
Don’t Compare
Individual scale scores, or statistics based on scale scores, for different grades or content areas:
Subjects by grade are independently scaled
Different content standards are measured in different grades
Cohorts across grades
Across tests
Scale scores to percent-correct scores
CAPA to years prior to 2009, due to the new standard setting that year
CAPA, CMA grades 3–5, and STS grades 2–4: DON’T compare to years before 2009
CMA grades 6–8 ELA and grades 6–7 mathematics, and STS grades 5–7: DON’T compare to years before 2010
CMA grades 9–10 ELA and Algebra I: DON’T compare to years before 2011
M 12 – 15 August 2013 Post-Test Workshop
25
Example—Using CST Results to Compare Grade Results from Year to Year
CST for ELA, percent proficient or above (Table 3, p. 11):
Grade       2012          2013          Difference
Grade 2     31%           35%           4%
Grade 3     33%           (not shown)   0%
Grade 4     29%           (not shown)   2%
Grade 5     34%           32%           –2%
Grade 6     (not shown)   (not shown)   1%
All Grades  (not shown)   (not shown)   (not shown)
A major question for test users is how to measure growth. What they think they want is to compare scale scores across time; however, since the tests are not vertically scaled, this is not possible. They can compare mean scale scores across cohorts (e.g., this year’s grade 3 class with last year’s grade 3 class), but they cannot validly compare how a group of students did in grade 3 with how the same students did in grade 4 (longitudinally). This type of comparison is not valid because the constructs of the tests may not be continuous across grades (the skills learned in grade 4 may be very different from the skills learned in grade 3) and because each grade is independently scaled, so a 300 in grade 3 means something different than a 300 in grade 4; there is no link between the scales. The best information for comparison purposes is the number or proportion of students in each category formed by the cut scores. These can be readily aggregated even across unlike content (e.g., Algebra and Physics), since the concern is with the number of students at or above some category (e.g., proficient). Note that the categories are not independent: if a student moves from proficient to advanced, the advanced figure rises while the proficient figure falls.
M 14 August 2013 Post-Test Workshop
26
Quiz Question 3
Which is the best comparison for CST scores of students within a middle school?
2012 mean scale scores for ELA of a cohort of grade 7 students with 2013 scale scores for ELA of the same students in grade 8
2012 mean scale scores for ELA for grade 8 students with 2013 mean scale scores for ELA for grade 8 students
2012 mean percent correct scores for ELA with 2013 mean percent correct scores for ELA for the same students
2012 mean percent correct scores for ELA for grade 8 students with 2013 mean percent correct for ELA for grade 8 students
August 2013 Post-Test Workshop
27
Quiz Question 3
Which is the best comparison for CST scores of students within a middle school?
2012 mean scale scores for ELA of a cohort of grade 7 students with 2013 scale scores for ELA of the same students in grade 8
2012 mean scale scores for ELA for grade 8 students with 2013 mean scale scores for ELA for grade 8 students
2012 mean percent correct scores for ELA with 2013 mean percent correct scores for ELA for the same students
2012 mean percent correct scores for ELA for grade 8 students with 2013 mean percent correct for ELA for grade 8 students
Answer: 2012 mean scale scores for ELA for grade 8 students with 2013 mean scale scores for ELA for grade 8 students
August 2013 Post-Test Workshop
28
Quiz Question 4
Which is the best comparison of cluster scores for a single student? Compare . . .
To proficient students statewide
One cluster to another, same year
The same cluster to the same cluster, different years
To the average percent correct of all students in a class
August 2013 Post-Test Workshop
29
Quiz Question 4
Which is the best comparison of cluster scores for a single student? Compare . . .
To proficient students statewide
One cluster to another, same year
The same cluster to the same cluster, different years
To the average percent correct of all students in a class
Answer: To proficient students statewide
August 2013 Post-Test Workshop
30
Aggregate (Summary) Reports
What are they?
Student Master List Summary
Student Master List Summary, End-of-Course (EOC)
Subgroup Summary
Report emphasis: CSTs
Criterion-referenced tests: progress is measured in the percent of students scoring proficient and advanced
Note: The back of each report provides a guide to abbreviations and score codes
M 18 – 22 August 2013 Post-Test Workshop
31
Student Master List Summary
By grade
Results by program (CSTs, CMA, CAPA, and STS) and subject:
Number and percent at each performance level
Mean scale score
Reporting cluster: mean percent correct (except CAPA)
M 18–19; M 27 – 33 August 2013 Post-Test Workshop
32
Student Master List Summary
End-of-Course
By program (CSTs, CMA, and STS) and subject
Results for each grade, and for all grades combined:
Number and percent at each performance level
Mean scale score
Reporting cluster: mean percent correct
M 19–20; M 34 – 39 August 2013 Post-Test Workshop
33
Student Master List Summary
Grade 7 Example M 33 August 2013 Post-Test Workshop
34
Student Master List Summary
Basic Statistics M 28 – 31 August 2013 Post-Test Workshop
35
Who Counts? Number Enrolled
Total CST/CMA and CAPA multiple-choice answer documents submitted as scorable
Minus: documents marked as “Student enrolled after the first day of testing and was given this test”
M 29 August 2013 Post-Test Workshop
36
Who Counts? Number Tested
All CST, CMA, CAPA, and STS answer documents with one or more answers
Plus: Z = Tested but marked no answers
Not included:
A = Student absent
E = Not tested due to significant medical emergency
P = Parent/guardian exemption
T = Enrolled first day, not tested, tested at previous school
Students with inconsistent grades
Non–English learners who took the STS
M 29 August 2013 Post-Test Workshop
37
Who Counts? Number and Percent Valid Scores
Number Valid Scores: for the subject, the number of students tested at grade level who received a score for the test.
Not included:
Incomplete tests
Modified tests
Non–English learners who took the STS
Unknown EOC mathematics (except grade 7 mathematics) or science tests
Inconsistent grades
Percent Valid Scores: for the subject, the number of valid scores divided by the number of students tested.
M 29 August 2013 Post-Test Workshop
38
Number Tested with Scores
Who Counts? Number Tested with Scores
All tests taken, including those taken with modifications, that result in a score
Not included:
Incomplete tests
Non–English learners who took the STS
Unknown EOC mathematics or science tests
Inconsistent grades
M 29 August 2013 Post-Test Workshop
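A minimal sketch that consolidates the counting rules on the preceding slides, applied to a few hypothetical answer-document records. The field names here are invented for illustration; the authoritative definitions are in the Post-Test Guide (M 29).

```python
# Illustrative sketch: simplified versions of Number Enrolled, Number Tested,
# Number Valid Scores, and Percent Valid Scores. Records and fields are hypothetical.

docs = [
    {"answered_any": True,  "condition": None, "valid_score": True,  "moved_in_after_start": False},
    {"answered_any": False, "condition": "Z",  "valid_score": False, "moved_in_after_start": False},
    {"answered_any": False, "condition": "A",  "valid_score": False, "moved_in_after_start": False},
    {"answered_any": True,  "condition": None, "valid_score": True,  "moved_in_after_start": True},
]

# Enrolled: scorable documents minus students who enrolled after the first day of testing.
number_enrolled = sum(1 for d in docs if not d["moved_in_after_start"])
# Tested: documents with one or more answers, plus condition Z (tested but marked no answers).
number_tested = sum(1 for d in docs if d["answered_any"] or d["condition"] == "Z")
# Valid scores: documents that actually received a score for the test.
number_valid_scores = sum(1 for d in docs if d["valid_score"])
percent_valid = 100.0 * number_valid_scores / number_tested

print(number_enrolled, number_tested, number_valid_scores, f"{percent_valid:.0f}%")
# -> 3 3 2 67%
```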
39
Student Master List Summary
Performance Levels M 30 August 2013 Post-Test Workshop
40
Who Counts? Performance Levels
All CSTs, CAPA, CMA, and STS
Advanced, proficient, basic, below basic: all valid scores falling in the performance level
Far below basic: also includes CSTs taken with modifications (in aggregate reporting [CSTs and STS] and accountability [CSTs] only)
M 30 August 2013 Post-Test Workshop
41
Who Counts? Mean Scale Scores
Average of valid scale scores Can be used to compare results for the same content/grade across years M 30 August 2013 Post-Test Workshop
42
Student Master List Summary: Reporting Clusters
Compare to: the average percent-correct range for students statewide who scored proficient on the total test (see the Post-Test Guide, Appendix A)
Users could also compare cluster averages to those of the state or of similar schools.
M 30 August 2013 Post-Test Workshop
43
Student Master List Summary: Writing
B = Blank C = Copied prompt I = Illegible L = Language other than English R = Refusal T = Off topic W = Wrong prompt (prompt from an earlier administration) M 30 – 31 August 2013 Post-Test Workshop
44
Subgroup Summary: CSTs, CMA, CAPA, and STS
Disability status: based on disability status for the CSTs, CMA, and STS; for CAPA, each disability type
Economic status: based on NSLP eligibility or parent education level
Gender
English proficiency
Ethnicity
Ethnicity for Economic Status (only for CSTs, CMA, and CAPA)
Economic Status:
Not economically disadvantaged: NSLP = NO AND PEL not equal to 14 (not a high school grad)
Economically disadvantaged: NSLP = YES OR PEL = 14 (not a high school grad)
Unknown: NSLP not equal to Y or N AND PEL not equal to 14 (not a high school grad)
Language Fluency:
English Only and Fluent English Proficient: “Language Fluency” (field 42) equal to 1, 2, or 4
English Learner: “Language Fluency” (field 42) equal to 3
English Learner Less Than 12 Months: “Language Fluency” (field 42) equal to 3 AND “EL in Public Schools <12 mos” (field 55) equal to Y
English Learner 12 Months or More: “Language Fluency” (field 42) equal to 3 AND “EL in Public Schools <12 mos” (field 55) not equal to Y (should be blank)
Unknown Fluency: “Language Fluency” (field 42) either blank or equal to “+”
M 40 – 54 August 2013 Post-Test Workshop
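The subgroup rules above translate directly into code; the following is an illustrative sketch, not the actual scoring logic. The field numbers (42 and 55) and code values are quoted from the slide, while the function signatures and record handling are simplifications.

```python
# Illustrative sketch of the Economic Status and Language Fluency rules.
# Field 42 = "Language Fluency", field 55 = "EL in Public Schools <12 mos" (per the slide).

def economic_status(nslp, parent_education_level):
    """nslp is 'Y', 'N', or other; PEL code 14 = not a high school graduate."""
    if nslp == "Y" or parent_education_level == 14:
        return "economically disadvantaged"
    if nslp == "N" and parent_education_level != 14:
        return "not economically disadvantaged"
    return "unknown"

def language_fluency(field_42, field_55):
    if field_42 in ("1", "2", "4"):
        return "English only / fluent English proficient"
    if field_42 == "3":
        if field_55 == "Y":
            return "English learner, less than 12 months"
        return "English learner, 12 months or more"
    return "unknown fluency"

print(economic_status("N", 10))    # -> "not economically disadvantaged"
print(language_fluency("3", "Y"))  # -> "English learner, less than 12 months"
```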
45
Ethnicity for Economic Status
Subgroup Summary: Ethnicity for Economic Status M 51 – 54 August 2013 Post-Test Workshop
46
Ethnicity for Economic Status
Subgroup Summary: Ethnicity for Economic Status Example: Economically disadvantaged for each ethnicity M 51 – 54 August 2013 Post-Test Workshop
47
Ethnicity for Economic Status
Subgroup Summary: Ethnicity for Economic Status M 51 – 54 August 2013 Post-Test Workshop
48
Break — 10 minutes
49
Internet Reports
Summaries based on the same data as the paper reports: CSTs, CMA, CAPA, STS
Available to the public online for school, district, county, and state
“Students with Scores” = number tested with scores
CST summaries of percent advanced and proficient
More subgroups than the paper reports: parent education, special program participation
Access from
M 95 – 108 August 2013 Post-Test Workshop
50
Internet Demonstration
Demo Also show individual races August 2013 Post-Test Workshop
51
Internet Reports: CST Sample
August 2013 Post-Test Workshop
52
Internet Reports: CST Summary Sample
Only available for CSTs M 99 – 100 August 2013 Post-Test Workshop
53
Other Internet Reports
CST (M 98−100)
CMA (M 100–101): same as CST
CAPA (M 101−104): state level: same as CST, with a separate Level I; county, district, and school levels: mean scale score, percent proficient or above
STS (M 105−106)
August 2013 Post-Test Workshop
54
Quiz Question 5 Which subgroup can only be accessed from the Internet?
Parent Education Level
CAPA by individual disability status
Ethnicity for Economic Status
English Proficiency
August 2013 Post-Test Workshop
55
Quiz Question 5 Which subgroup can only be accessed from the Internet?
Parent Education Level
CAPA by individual disability status
Ethnicity for Economic Status
English Proficiency
Answer: Parent Education Level
August 2013 Post-Test Workshop
56
Data CDs
What are they? Lists of information from the answer documents and scores of every student in the district, in .txt format
What are they used for? Searching for specific data, creating unique reports, verifying paper reports
What else is needed? A text editor, a desktop application, or a Student Information System (this sample uses TextPro)
August 2013 Post-Test Workshop
57
View of Data: as .txt with word wrap on; with a text editor, word wrap off
August 2013 Post-Test Workshop
58
Organization of Data
Two files:
Demographics, special conditions, and test scores
Accommodations, modifications, English learners, and irregularities
Data Layout = guide to the location of data in the files: position, number of characters, whether numeric or alpha
August 2013 Post-Test Workshop
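A minimal sketch of using the Data Layout to read a fixed-width record from the .txt file: each field is taken from its start position for its stated number of characters and treated as numeric or alpha. The layout entries and sample record below are hypothetical, not the actual STAR layout; always work from the Data Layout shipped with the CD.

```python
# Illustrative sketch: parse one fixed-width data CD record using a layout of
# (field name, start position, length, type). Layout and record are hypothetical.

HYPOTHETICAL_LAYOUT = [
    ("ssid",         1, 10, "alpha"),
    ("grade",       11,  2, "numeric"),
    ("scale_score", 13,  3, "numeric"),
]

def parse_record(line, layout):
    record = {}
    for name, start, length, kind in layout:
        raw = line[start - 1:start - 1 + length].strip()  # positions are 1-based
        record[name] = int(raw) if kind == "numeric" and raw else raw
    return record

sample_line = "1234567890" + "07" + "382"
print(parse_record(sample_line, HYPOTHETICAL_LAYOUT))
# -> {'ssid': '1234567890', 'grade': 7, 'scale_score': 382}
```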
59
Data Layout Sample August 2013 Post-Test Workshop
60
Individual Reports
STAR Student Record Label: adhesive label to affix to the student’s permanent school record
STAR Student Master List: alphabetical list of students and their scores; tests listed in order within grade (CSTs, CMA, CAPA, STS)
STAR Student Report: individual’s scores; two two-sided color copies for each test, for parents/guardians and the school; per regulations, the district must forward them within 20 business days
Performance (Student Report) = parent report; Record Label = for the cumulative file; Master List = roster, per wave
M 55 – 93 August 2013 Post-Test Workshop
61
Student Record Label Grade 10 Sample: Student Name and Identification
August 2013 Post-Test Workshop
62
Student Record Label CST/CMA Grade 10 Example M 56
If tested with modifications, actual performance level is still given. M 56 August 2013 Post-Test Workshop
63
Student Master List CST/CMA Grade 3 Example M 57 – 60 August 2013
Post-Test Workshop
64
Student Report M 61 – 93; 71 – 72 CST Grade 11 Example August 2013
Post-Test Workshop
65
Student Report CST Grade 11 Example M 70 August 2013
Post-Test Workshop
66
Student Report CST Grade 11 Example
Student’s name on back
Grade 10 report not in manual
♦ = Percent correct obtained by the student on the reporting cluster/content area
▬ = Average percent-correct range on the reporting cluster for students statewide who scored proficient on the total test
M 71 August 2013 Post-Test Workshop
67
Explain to Parents
A scale score is not: the average percent-correct cluster score × 600; the average percent correct of the clusters; the percent correct on the total test (or total number of test items)
Reporting clusters are not comparable to one another: different difficulty, varying numbers of questions
Scale scores come from conversion tables resulting from statistical procedures; equating allows scores to have similar meaning (e.g., 350 = lowest score for proficient on the CSTs)
M 8 – 11 August 2013 Post-Test Workshop
68
Other Student Reports
CMA (M 72−80): front: performance levels and scale scores; back: cluster percent correct, About the CMA
CAPA (M 81−85): back: About the CAPA
STS (M 86−93): in Spanish; back: cluster percent correct, how to use the report
August 2013 Post-Test Workshop
69
Unmatched Reports
Writing (CST, CMA, grades 4 and 7): only multiple-choice, or only writing; students receive two reports if the writing score is not matched to the multiple-choice score
CST, CMA (grades 8 and 11): only CMA and no CST; grades 8 and 11 require the CST for History–Social Science; only in grades 9 and 10 may a student take only the CMA
If a student took only the CMA and the school later realized the student also needed to take a CST, the student may have been given a blank CST answer document with minimal demographics completed: name, SSID, ID, grade, gender. Pearson should have matched the documents and treated the CMA answer document as the main one, but may not have.
August 2013 Post-Test Workshop
70
Summary
Do’s:
Do compare mean scale scores and the percent at each performance level within the same grade and same content area
Do compare cluster scores to the corresponding proficient range provided for this year
Don’ts:
Don’t compare cluster scores to each other within a test, or across years, grades, or content areas
Don’t compare mean scale scores across grades or content areas
Report types: Summary and Internet reports, Data CDs, Individual reports
August 2013 Post-Test Workshop
71
Upcoming Dates August 2013 Post-Test Workshop
72
2014 Setting Up Administration
Return the 2014 Superintendent’s Designation form and Security Agreement.
Finalize the instruction schedule.
Set up the administration in the STAR management system prior to December 1.
Enter and approve orders prior to December 1.
After order approval, test administration dates are NOT changeable.
August 2013 Post-Test Workshop
73
For more information see:
STAR Technical Assistance Center CDE Accountability CDE STAR office: August 2013 Post-Test Workshop
74
Next Webcasts
2013 Demographic Data Corrections Webcast: September 25, 2013, 9 a.m.
2014 STAR for Students with Disabilities Webcast: September 25, 2013, 1 p.m.
August 2013 Post-Test Workshop
75
Early Assessment Program Results
August 7, 2013 Connie Grueter
76
Early Assessment Program (EAP)
EAP results are being accepted at participating California Community Colleges (CCC). The list of participating colleges is available on the CCC website; check the site often for an updated list of campuses.
Talking points: Many CCC campuses are excited about accepting EAP results for exemption. As of July 21, 75 campuses had agreed to accept EAP results, and other campuses are considering using EAP results, so check the website often. The list of participating CCC campuses is available at a joint CSU/CCC EAP website. You can find these slides at CollegeEAP.org under the Educators link and then under the About EAP tab.
PowerPoint slides – Educators, About EAP. August 2013 Post-Test Workshop
77
Early Assessment Program (EAP)
An EAP informational web site has been developed: The site provides EAP information for both CSU and CCC It is a resource for Students, Parents, and Educators PowerPoint slides – Educators, About EAP. August 2013 Post-Test Workshop
78
www.CollegeEAP.org This is the front page of the CollegeEAP.org site.
August 2013 Post-Test Workshop
79
Early Assessment Program (EAP)
Reporting Results
New this year: Conditional Exemptions for EAP English
“Ready for CSU and participating CCC college-level English/mathematics courses – Conditional”: students with this status who earn a grade of “C” or better in an approved English/math course or a supervised e-learning program can be exempt from the EPT or ELM for CSU, or from other English or math placement tests at participating California Community Colleges (CCC). A list of courses and e-learning programs can be found at: and
The EAP results are located on the STAR Grade 11 Student Report, on the back page in the bottom left-hand corner.
We have been receiving questions regarding satisfying the conditional exemption for EAP math. The CSU Math Success website provides a list of the courses and e-learning programs; you can find this information under the Student tab, under EAP statuses.
PowerPoint slides – Educators, About EAP. August 2013 Post-Test Workshop
80
Early Assessment Program (EAP)
Sample EAP Box on the STAR Student Report
This slide also shows the EAP box that students and their parents will see on the Student Report. A clear picture of the box can be found in the STAR Post-Test Guide on page 78. The EAP box can be found on page 67 of the Post-Test Guide.
PowerPoint slides – Educators, About EAP. August 2013 Post-Test Workshop
81
Early Assessment Program (EAP)
Web Sites
CollegeEAP.org: links to CSU and CCC informational sites for students, parents, and educators
CalState.edu/EAP: provides a list of EAP Coordinators, EAP test blueprints, and informational materials
CCCCO.edu/EAP: provides a list of participating community colleges
CollegeEAP.org is a portal to information for both CSU and CCC for students, parents, and educators. CalState.edu/EAP has a list of the EAP Coordinators at all 23 CSU campuses; you can also find the test blueprints for the EAP and CSU’s placement tests, the EPT and ELM, as well as other informational materials. CCCCO.edu/EAP provides a list of community colleges that are accepting EAP results or considering accepting them. Each of the websites includes contact information for the systems should you have additional questions.
PowerPoint slides – Educators, About EAP. August 2013 Post-Test Workshop