35th Annual National Conference on Large-Scale Assessment June 18, 2005 How to compare NAEP and State Assessment Results NAEP State Analysis Project Don McLaughlin, Victor Bandeira de Mello



overview: the questions
 how do NAEP and state assessment trend results compare to each other?
 how do NAEP and state assessment gap results compare to each other?
 do NAEP and state assessments identify the same schools as high-performing and low-performing?

overview: the differences
 results are different because
 standards are different
 students are different
 time of testing is different
 motivation is different
 manner of administration is different
 item formats are different
 test content is different
 tests have measurement error

overview: the focus
 the problem of different standards
 how we addressed it
 the problem of different students
 how we addressed it
 factors that affect validation

 the problem of different standards

the different standards
 trends and gaps are being reported in terms of percentages of students meeting standards
 the standards are different in every state and in NAEP
 comparisons of percentages meeting different standards are not valid

the different standards
 concept of population profile
 a population profile is a graph of the achievement of each percentile of a population
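The concept can be illustrated numerically; the score distribution and cutpoint below are invented for illustration, with NumPy's percentile function standing in for the profile graph:

```python
import numpy as np

# Hypothetical score distribution for a population of students
# (the scale location and spread are invented for illustration).
rng = np.random.default_rng(7)
scores = rng.normal(loc=225, scale=35, size=10_000)

# A population profile maps each percentile (1..99) of the population
# to the score achieved at that percentile.
percentiles = np.arange(1, 100)
profile = np.percentile(scores, percentiles)

# Reading the profile against a cutpoint answers "what percent of the
# population meets this standard?"
cutpoint = 250  # hypothetical standard
pct_meeting = 100 * (scores >= cutpoint).mean()
```

Comparing the same profile against several cutpoints is exactly what the achievement-profile figures on the following slides do.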

 a population achievement profile the different standards

the different standards
 a population achievement profile (figure marks 76%, 32%, and 5% of students above three cutpoints)

 a population trend profile the different standards

the different standards
 a population trend profile (figure marks gains of +9%, +13%, and +5%)

the different standards
 a population gap profile (figure annotates the gaps)

the different standards
 a population gap profile (figure: gaps after a 20-point gain)

the different standards
 a population gap profile (figure: gap changes of 8 points smaller, the same, and 6 points larger)


the different standards
 the solution to the problem is to compare results at comparable standards
 for comparing NAEP and state assessment gains and gaps in a state, score NAEP at the state’s standard

the different standards
 NAEP
 individual plausible values for 4th and 8th grade reading in 1998, 2002, and 2003 and mathematics in 2000 and 2003
 state assessment scores
 school percentages meeting standards linked to NCES school codes, in 2003 and some earlier years

 a school-level population gap profile the different standards

 comparing school-level population gap profiles the different standards

 comparing school-level population gap profiles the different standards

the different standards
 scoring NAEP at the state assessment standard
 determine the cutpoint on the NAEP scale that best matches the percentages of students meeting the state’s standard
 compute the percentage of the NAEP plausible value distribution that is above that cutpoint
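A minimal sketch of these two steps, assuming the NAEP plausible values are available as a flat array (the function names are ours, not the project's):

```python
import numpy as np

def naep_cutpoint(plausible_values, pct_meeting_state):
    """Step 1: find the NAEP scale score above which `pct_meeting_state`
    percent of the plausible-value distribution lies; this is the
    (100 - p)-th percentile of the distribution."""
    return float(np.percentile(plausible_values, 100 - pct_meeting_state))

def pct_above(plausible_values, cutpoint):
    """Step 2: score NAEP at the state standard by computing the percent
    of the plausible-value distribution at or above the cutpoint."""
    return float(100 * (np.asarray(plausible_values) >= cutpoint).mean())
```

By construction, `pct_above(pv, naep_cutpoint(pv, p))` returns approximately `p`, up to interpolation and ties at the cutpoint.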

 equipercentile equating the different standards

the different standards
 equipercentile equating
 hypothetical NAEP results in four schools, A through D, in a state, with their average NAEP scale scores (actual samples have about 100 schools)

the different standards
 equipercentile equating

                                     A      B      C      D    average
 percent meeting state standard     10%    20%    40%    50%     30%

 in school A, the state reported that 10% of the students met the standard

the different standards
 equipercentile equating

                                                 A      B      C      D    average
 percent meeting state standard                 10%    20%    40%    50%     30%
 NAEP scale score corresponding to
 percent meeting state standard                 225     …      …      …      …

 in school A, 10% of the NAEP plausible value distribution was above 225

the different standards
 equipercentile equating

                                     A      B      C      D    average
 percent meeting state standard     10%    20%    40%    50%     30%
 percent above 230 on NAEP           5%    10%    45%    60%     30%

 if the equating is accurate, we should be able to reproduce the percentages meeting the state’s standard from the NAEP sample

the different standards
 equipercentile equating

                                     A      B      C      D    average
 percent meeting state standard     10%    20%    40%    50%     30%
 percent above 230 on NAEP           5%    10%    45%    60%     30%
 error                              -5%   -10%    +5%   +10%

the different standards
 relative error
 in estimating cutpoints for state standards, relative error is the ratio of the observed error in reproducing school-level percentages meeting standards to that expected due to sampling and measurement error
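A sketch of the computation, under the simplifying assumption that both errors are summarized as root-mean-squares (the project's exact formula may differ); the example reuses the hypothetical four-school errors of -5, -10, +5, and +10 points:

```python
import numpy as np

def relative_error(state_pct, naep_pct, expected_se):
    """Ratio of the observed RMS error in reproducing school-level
    percentages meeting standards to the RMS error expected from
    sampling and measurement error alone (a sketch, not the project's
    exact computation)."""
    observed = np.asarray(state_pct, float) - np.asarray(naep_pct, float)
    rms_observed = np.sqrt(np.mean(observed ** 2))
    rms_expected = np.sqrt(np.mean(np.asarray(expected_se, float) ** 2))
    return rms_observed / rms_expected

# Hypothetical four-school example, with an assumed expected error of
# 7 percentage points per school:
r = relative_error([10, 20, 40, 50], [5, 10, 45, 60], [7, 7, 7, 7])
```

A ratio near 1 means the equating reproduces the state percentages about as well as sampling and measurement error allow.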

 mapping of primary state standards on the NAEP scale: math grade 8 in 2003 the different standards

 mapping of primary state standards on the NAEP scale: math grade 8 in 2003 the different standards

 mapping of primary state standards on the NAEP scale: math grade 8 in 2003 the different standards

 national percentile ranks corresponding to state grade 4 reading standards in 2003 the different standards

the different standards
 states have set widely varying standards
 does it matter?
 standards should be set where they will motivate increased achievement
 surely some are too high and some are too low

 states with higher standards have lower percentages of students meeting them the different standards

the different standards
 on NAEP, states with higher standards do about as well as other states

 the problem of different students

the different students
 the problem of different students:
 different school coverage
 different grade tested
 absent students
 excluded SD/ELLs

the different students
 different school coverage
 our comparisons between NAEP and state assessment results are for the same schools. NAEP weights these schools to represent the public school population in each state
 we matched schools serving more than 99 percent of the public school population. However, especially for gap comparisons, we were missing state assessment results for small groups whose scores were suppressed for confidentiality reasons
 the median percentage of the NAEP student population included in the analyses was about 96 percent

the different students
 different grades tested
 in some states, assessments were administered in grades 3, 5, or 7, and we compared these results to NAEP results in grades 4 and 8
 the difference in grades involved a different cohort of students, as well as a difference in curriculum content. These effects combined to reduce NAEP-state assessment correlations in some states by about 0.05 to 0.10

the different students
 absent students
 some students are absent from NAEP sessions, and some of these absences are not made up in extra sessions. NAEP imputes the achievement of the absent students to be similar to that of similar students who were not absent
 a study by the NAEP Validity Studies Panel found that these imputations leave negligible (if any) bias in NAEP results due to absences. That study compared the state assessment scores of students absent from NAEP to the scores of students who were not absent

the different students
 excluded SD/ELLs
 some students with disabilities and English language learners are excluded from NAEP and others are included. A teacher questionnaire is completed for each SD/ELL selected for NAEP
 in the past, NAEP has ignored this exclusion, and there is clear evidence that as a result, states in which NAEP exclusions increased had corresponding reports of larger NAEP achievement gains (and vice versa)

the different students
 full population estimates
 the trend distortions caused by changing exclusion rates can be minimized by imputing the achievement of excluded students. In this project, comparisons between NAEP and state assessment results are based on the NAEP full population estimates [1]
 imputations for excluded SD/ELLs are based on the achievement of included SD/ELLs with similar questionnaire and demographic profiles in the same state
 [1] an appendix includes comparisons using standard NAEP estimates
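As a toy sketch of the imputation idea (the actual full population estimates condition on richer questionnaire data and draw plausible values rather than cell means; the profile keys and scores below are invented):

```python
import numpy as np
from collections import defaultdict

def impute_excluded(included, excluded_profiles):
    """Impute scores for excluded SD/ELL students as the mean score of
    included SD/ELL students sharing the same questionnaire/demographic
    profile, falling back to the overall included mean when no profile
    matches.  `included` is a list of (profile_key, score) pairs."""
    cells = defaultdict(list)
    for key, score in included:
        cells[key].append(score)
    overall = float(np.mean([s for _, s in included]))
    return [float(np.mean(cells[k])) if k in cells else overall
            for k in excluded_profiles]

# Invented example: profile keys might encode accommodation type, etc.
included = [("IEP-read-aloud", 200), ("IEP-read-aloud", 210), ("ELL-2yr", 240)]
imputed = impute_excluded(included, ["IEP-read-aloud", "ELL-2yr", "unseen"])
```

The point of the method is that state estimates then cover the full population, so a change in exclusion rates no longer masquerades as an achievement trend.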

the different students
 statistically significant state NAEP gains from 1996 to 2000

                                grade 4      grade 8
 ignoring excluded students     17 of …      …
 full population estimates      … of 37      7 of 35

the different students
 statistically significant state NAEP gains and losses from 1998 to 2002

                                grade 4            grade 8
                                gains    losses    gains    losses
 ignoring excluded students       …        …         …        …
 full population estimates        …        …         …        …

 factors that affect validation

validation
 the question
 do state assessments and NAEP agree on which schools are doing better than others?
 the measure
 correlation between state assessment and NAEP school-level results
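The measure can be sketched as a weighted school-level Pearson correlation; weighting schools by their NAEP sample sizes is our assumption here, as is the function name:

```python
import numpy as np

def weighted_corr(naep_means, state_means, weights):
    """Pearson correlation between school-level NAEP and state assessment
    results, weighting each school (e.g., by its NAEP sample size)."""
    x = np.asarray(naep_means, float)
    y = np.asarray(state_means, float)
    w = np.asarray(weights, float)
    w = w / w.sum()                       # normalize weights
    mx, my = w @ x, w @ y                 # weighted means
    cov = w @ ((x - mx) * (y - my))       # weighted covariance
    return float(cov / np.sqrt((w @ (x - mx) ** 2) * (w @ (y - my) ** 2)))
```

Schools where the two assessments rank achievement similarly push the correlation toward 1; disagreement about which schools are high- and low-performing pulls it toward 0.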

validation
 factors that specifically affect NAEP-state assessment correlations of school-level statistics
 size of school NAEP samples
 grade level the same or different
 extremeness of the standard

validation
 median school-level correlations between NAEP and state assessment results

             grade 4           grade 8
             math   reading    math   reading
 original      …       …         …       …
 adjusted      …       …         …       …

 NAEP and state assessment school means validation

summary
 two reports have been produced on 2003 NAEP-state assessment comparisons, one for mathematics and one for reading
 each report has an appendix with multi-page comparison profiles for all of the states. The following are examples of the kinds of information included

state profiles
 state profiles of NAEP-state assessment comparisons
 test score descriptions and results summary
 standards relative to NAEP
 correlations with NAEP
 changes in NAEP exclusion/accommodation rates
 trends (NAEP vs. state assessment)
 gaps (NAEP vs. state assessment)
 gap trends (NAEP vs. state assessment)

 a state’s standards relative to its achievement distribution state profiles

 a state’s math trends comparison state profiles

 poverty gap comparison state profiles

state profiles
 poverty gap comparison: state assessment results

state profiles
 poverty gap comparison: NAEP results

state profiles
 poverty gap comparison: NAEP - state assessment

 a state’s poverty gap comparison state profiles

other results
 trends
 gaps
 overall coverage
 subpopulation coverage
 school analyses sample

other results: trends
 comparison of trends reported by NAEP and state assessments (number of states)

                                            grade 4                    grade 8
                                            math 00-03   read 98-03    math 00-03   read 98-03
 state assessment reported greater gains       …             …             …             …
 no significant difference                     …             …             …             …
 NAEP reported greater gains                   …             …             …             …

other results: gaps
 reading 2003
 NAEP and state assessments tended to find similar achievement gaps
 math 2003
 NAEP tended to find slightly larger gaps than state assessments did

other results: coverage
 median state percentages of NAEP schools and student population matched and included in analyses

                                              grade 4         grade 8
                                              math    read    math    read
 percent of schools matched                     …       …       …       …
 percent of student population matched          …       …       …       …
 percent of schools included in analyses        …       …       …       …
 percent of students included in analyses       …       …       …       …

other results: coverage
 number of states and percent minority students included in the 2003 reading gap analyses

                             grade 4    grade 8
 black
   number of states             26         20
   students included (%)         …          …
 hispanic
   number of states             14         13
   students included (%)         …          …
 disadvantage
   number of states             31         28
   students included (%)         …          …

 percent meeting standards from state tests in NAEP schools and from state reports, 2003 other results: school sample

 producing the report

SAS programs
 the process
 find state scores for NAEP sample
 score NAEP in terms of state standards
 compute inverse CDF pair for subpopulation profiles
 compute mean NAEP-state gap differences and standard errors
 compute trends and gains
 compute smoothed frequency distribution of plausible values
 compute NAEP-state correlations


SAS programs
 programs
 makefiles.sas
 standards.sas
 gaps.sas
 gaps_g.sas
 trends.sas
 trends_r.sas
 trends_g.sas
 distribution.sas
 correlation.sas


SAS programs: setup
 makefiles.sas
 for state st, get NAEP plausible values for subject s, grade g, and year y
 get state assessment data for NAEP schools (from NLSLASD)
 merge files to get example02.sas7bdat and example03.sas7bdat

SAS programs: setup
 makefiles.sas

*******************************************************************************;
* Project : NAEP State Analysis                                               *;
* Program : MakeFiles.SAS                                                     *;
* Purpose : make source file for workshop at LSAC 2005                        *;
*                                                                             *;
* input   : naep_r403   NAEP Reading grade 4, 2003 data                       *;
*           naep_r402   NAEP Reading grade 4, 2002 data                       *;
*           XX          state XX assessment data                              *;
*           YY          state YY assessment data                              *;
*                                                                             *;
* output  : example02 data                                                    *;
*           example03 data                                                    *;
*                                                                             *;
* Author  : NAEP State Analysis Project Staff                                 *;
*           American Institutes for Research                                  *;
*******************************************************************************;

SAS programs: setup  standards.sas  compute NAEP equivalents of state standards based on school level state assessment scores in NAEP schools  macro %stan(s,g,y,nlevs) output Stansgy file with state standard cutpoints on NAEP sample  standards.sas  compute NAEP equivalents of state standards based on school level state assessment scores in NAEP schools  macro %stan(s,g,y,nlevs) output Stansgy file with state standard cutpoints on NAEP sample sgyvarnamelevelcutstderrorpercent R403Rs5t R403Rs5t R403Rs5t StanR403

SAS programs: setup  standards.sas  generate school level file with percentages meeting levels by reporting category, with jackknife statistics  macro %StateLev(file,s,g,y)  macro %NAEP_State_Pcts(s,g,y,group)  macro %Sch_State_Pcts(standard,s,g,y) output StPcts_standard_sgy with school stats for first/recent standard, by category  macro %Criterion(standard,s,g,y) output criterion_ standard_sgy with criterion values for cutpoints  standards.sas  generate school level file with percentages meeting levels by reporting category, with jackknife statistics  macro %StateLev(file,s,g,y)  macro %NAEP_State_Pcts(s,g,y,group)  macro %Sch_State_Pcts(standard,s,g,y) output StPcts_standard_sgy with school stats for first/recent standard, by category  macro %Criterion(standard,s,g,y) output criterion_ standard_sgy with criterion values for cutpoints

SAS programs: gaps  gaps.sas  compute and plot subpopulation profiles (inverse CDF) and compute mean NAEP-state gap differences and respective standard errors, by regions of the percentile distribution  macro %gap(s,g,lev,y1,y2,group1,group2) where y1is the earlier years (need not be present) y2is the later year levis the standard for which the gap is being compared group1is the 5-char name of the focal group group2is the 5-char name of the comparison group  gaps.sas  compute and plot subpopulation profiles (inverse CDF) and compute mean NAEP-state gap differences and respective standard errors, by regions of the percentile distribution  macro %gap(s,g,lev,y1,y2,group1,group2) where y1is the earlier years (need not be present) y2is the later year levis the standard for which the gap is being compared group1is the 5-char name of the focal group group2is the 5-char name of the comparison group

SAS programs: gaps  gaps.sas  output:inverse CDF for comparison pairs ICDFr4__03group1group2  mean NAEP-State gap differences and SEs by regions of the percentile distribution DiffGapsMINtoMAXgroup1group2R4__03.XLS DiffGapsMINtoMEDgroup1group2R4__03.XLS DiffGapsMEDtoMAXgroup1group2R4__03.XLS DiffGapsMINtoQ1_group1group2R4__03.XLS DiffGapsQ1_toQ3_group1group2R4__03.XLS DiffGapsQ3_toMAXgroup1group2R4__03.XLS  gaps.sas  output:inverse CDF for comparison pairs ICDFr4__03group1group2  mean NAEP-State gap differences and SEs by regions of the percentile distribution DiffGapsMINtoMAXgroup1group2R4__03.XLS DiffGapsMINtoMEDgroup1group2R4__03.XLS DiffGapsMEDtoMAXgroup1group2R4__03.XLS DiffGapsMINtoQ1_group1group2R4__03.XLS DiffGapsQ1_toQ3_group1group2R4__03.XLS DiffGapsQ3_toMAXgroup1group2R4__03.XLS

SAS programs: gaps  gaps.sas  output:population profiles STATE_PV_03.gif state achievement profile STATE_BW_03.gif state achievement profile NAEP_PV_03.gif NAEP achievement profile NAEP_BW_03.gif NAEP achievement profile NAEP_STATE_PV_03.gif NAEP/state gap profile NAEP_STATE_BW_03.gif NAEP/state gap profile d  gaps.sas  output:population profiles STATE_PV_03.gif state achievement profile STATE_BW_03.gif state achievement profile NAEP_PV_03.gif NAEP achievement profile NAEP_BW_03.gif NAEP achievement profile NAEP_STATE_PV_03.gif NAEP/state gap profile NAEP_STATE_BW_03.gif NAEP/state gap profile d

SAS programs: gaps  gaps.sas  output:population profiles NAEP_PV_03.gif NAEP achievement profile  gaps.sas  output:population profiles NAEP_PV_03.gif NAEP achievement profile

SAS programs: gaps  gaps_g.sas  plot subpopulation profiles and place them on a four-panel template to include in report  macro %pop_profileset SAS/Graph options  macro %plot_gapsplot graphs using options  macro %createtemplatecreate four-panel template  macro %replaygapsplace graphs in template  gaps_g.sas  plot subpopulation profiles and place them on a four-panel template to include in report  macro %pop_profileset SAS/Graph options  macro %plot_gapsplot graphs using options  macro %createtemplatecreate four-panel template  macro %replaygapsplace graphs in template

SAS programs: gaps  gaps_g.sas

SAS programs: trends  trends.sas  compute difference between state and NAEP and respective standard errors  output data file trends_sy including both NAEP and NAEP state standard measures  trends_r.sas  compute gains and respective standard errors  output data file summary_s  trends.sas  compute difference between state and NAEP and respective standard errors  output data file trends_sy including both NAEP and NAEP state standard measures  trends_r.sas  compute gains and respective standard errors  output data file summary_s

SAS programs: trends  trends_g.sas  plot NAEP and state assessment trends by grade and place them on a two-panel template to include in report  compute t for testing significance of differences in gains between NAEP and state assessment  trends_g.sas  plot NAEP and state assessment trends by grade and place them on a two-panel template to include in report  compute t for testing significance of differences in gains between NAEP and state assessment

SAS programs: correlations  correlation.sas  compute NAEP-state correlations and standard errors  macro %corrs(standard,s,g,y,group) output CorrsY_standard_groupsgy file with state standard  correlation.sas  compute NAEP-state correlations and standard errors  macro %corrs(standard,s,g,y,group) output CorrsY_standard_groupsgy file with state standard RtR4032 correlation standard error

SAS programs: distribution  distribution.sas  create file with plausible value frequency distribution output distribution_sgy file  distribution.sas  create file with plausible value frequency distribution output distribution_sgy file

SAS programs  all programs and data files are available for download at including files with the imputed scale scores for excluded students we used in the report  all programs and data files are available for download at including files with the imputed scale scores for excluded students we used in the report

NAEP State Analysis Project
 American Institutes for Research
  Victor Bandeira de Mello
  Don McLaughlin
 National Center for Education Statistics
  Taslima