Accessing and Reporting State Student Achievement Data for GPRA Purposes
Amy A. Germuth, Ph.D., Compass Consulting Group, LLC

2 Overview of Session
The GPRA Performance System
- Overview of the Government Performance and Results Act (GPRA)
- MSP GPRA Performance Indicators
Acquiring Student Achievement Data
- Potential issues related to using student achievement data
- Acquiring student achievement data
Reporting Student Achievement Data
- Goals and Issues

3 The GPRA Performance Indicator System
The Government Performance and Results Act of 1993 requires federal agencies to develop and implement an accountability system based on performance measurement. Agencies must:
1. state clearly what they intend to accomplish,
2. identify the resources required, and
3. periodically report their progress.

4 The GPRA Performance Indicator System
Expected outcomes:
- Increased accountability for the expenditure of public funds.
- Improved congressional decision-making through more objective information on the effectiveness of federal programs.
- Promotion of a new governmental focus on results, service delivery, and customer satisfaction.
Excerpted from: Demonstrating Results: An Introduction to the Government Performance and Results Act, Spring 1999

5 Annual Performance Report
Information on a program's outcomes on the GPRA indicators is collected via the Annual Performance Report (APR). Project impacts, as assessed via individual project evaluations using control or comparison groups, are also collected via the APR. Congress uses this information in determining future funding, so it is critical that both be reported as consistently as possible.

6 MSP GPRA Performance Indicators
Under GPRA, the Department of Education (ED) must account for the impact of ED-funded activities, usually via grantees' evaluations. ED is also required to develop performance measures that focus on what is happening within and among programs. Thus, the following two GPRA performance indicators have been adopted for MSPs:
- The percentage of students who score at the basic level or above on state assessments of mathematics or science.
- The percentage of students who score at the proficient level or above on state assessments of mathematics or science.

7 GPRA Data Needs From Grantees
For GPRA purposes, MSP projects are asked to report student achievement data aggregated at the project level. Note that most projects will also want data aggregated at the teacher/class level, and possibly student-level data linked to teachers, for their own evaluation needs.
For GPRA, only the most recent year of data is reported, so new data are reported each year. This means the GPRA measures are not longitudinal and do not assess change or improvement over time; instead, they provide a snapshot of the MSP program at a given point in time.

8 End Overview of GPRA Questions?

9 Issues with Use of Student Achievement Data
State testing data for math and science are available based on NCLB requirements:
- Currently, all states should have an annual mathematics assessment in each of grades 3 through 8 and one given at least once in grades 10 through 12.
- By the 2007-08 school year, an annual science assessment should be given at least once in grades 3 through 5, once in grades 6 through 9, and once in grades 10 through 12.

10 Issues with Use of Student Achievement Data
Family Educational Rights and Privacy Act (FERPA), 1974:
- Requires that schools obtain written permission from parents or eligible students before releasing individual educational records.
- Aggregate (statistical) data containing no personally identifiable information about students do not require written permission.
The information MSPs will be required to submit does not fall under FERPA and thus does not require parental permission.

11 Issues with Use of Student Achievement Data
Anonymity vs. confidentiality:
- Anonymity: the person's identity is not known to the data user.
- Confidentiality: the person's identity is known to the data user, but no identifiers are included when data are publicly reported.
The information MSPs will be required to submit does not require that data users know teachers' identities; thus, anonymity is maintained.

12 Data To Request On All Participating Teachers
1. Total number of students taught in the main subject (math or science), summed across all classes and grade levels.
2. Total number of students of participating teachers who were tested on the state standardized test in math/science (summed across classes/grades).
3. Total number of students of participating teachers scoring at the basic level or above (summed across classes/grades).
4. Total number of students of participating teachers scoring at the proficient level or above (summed across classes/grades).
NOTE: Math and science should be reported separately!

13 Example
Mr. Jones is a participating teacher and teaches the following:
- 7th grade math: 30 students, all tested, 28 at or above basic, 24 at or above proficient
- 8th grade math: 31 students, 29 tested, 21 at or above basic, 17 at or above proficient
- 8th grade elective (science and civilization): 32 students
Because Mr. Jones participates in a math MSP, only his math classes count. The following numbers would be calculated for Mr. Jones (and similarly for all participating teachers):
- Number taught: 30 + 31 = 61
- Number tested: 30 + 29 = 59
- Number at or above basic: 28 + 21 = 49
- Number at or above proficient: 24 + 17 = 41
These numbers would be summed across all participating teachers, schools, and districts to produce the project-level data reported on the APR.
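The per-teacher calculation above can be sketched in a few lines of code. This is a minimal illustration using Mr. Jones's (hypothetical) class data; only classes in the teacher's MSP subject are included, which is why the science elective does not appear.

```python
# Per-class tuples for Mr. Jones's math classes only:
# (students taught, students tested, at/above basic, at/above proficient)
math_classes = [
    (30, 30, 28, 24),  # 7th grade math
    (31, 29, 21, 17),  # 8th grade math
]

# Sum each quantity across all of the teacher's classes and grade levels
taught = sum(c[0] for c in math_classes)
tested = sum(c[1] for c in math_classes)
basic = sum(c[2] for c in math_classes)
proficient = sum(c[3] for c in math_classes)

print(taught, tested, basic, proficient)  # 61 59 49 41
```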

14 Notes on Data Requests
Districts may provide these data to grantees in various forms, which affects how grantees must transform them into what needs to be reported:
- In some cases, districts are only able to report actual scores. Grantees will then need to convert scores to performance levels.
- In other cases, districts might provide data aggregated to the classroom level. Grantees will then have to aggregate further to produce project-level data.
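The first case, converting raw scores to performance levels, might look like the sketch below. The cut scores here are invented for illustration only; in practice you would use the cut scores your state publishes for its assessment.

```python
# Hypothetical cut scores, ordered highest first (NOT real state values)
CUTS = [(400, "advanced"), (350, "proficient"), (300, "basic")]

def level(score):
    """Map a scale score to a performance level using the cut scores."""
    for cut, name in CUTS:
        if score >= cut:
            return name
    return "below basic"

# Example district extract of raw scores for one class
scores = [412, 355, 340, 298, 301]

at_or_above_basic = sum(level(s) != "below basic" for s in scores)
at_or_above_prof = sum(level(s) in ("proficient", "advanced") for s in scores)
print(at_or_above_basic, at_or_above_prof)  # 4 2
```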

15 Removing Roadblocks
It is important to know:
- what you need to ask for,
- in what form you should request it,
- whom to ask for it, and
- when to ask for it.
Always keep in mind that obtaining data for the MSP GPRA indicators is both legal and required.

16 Acquiring Student Data
- Identify who has access to the needed data (usually someone in the district).
- Inform stakeholders and require that the anonymity and confidentiality of the data be maintained.
- For GPRA purposes, only assessment data that cannot be linked to individual teachers or students are being requested. (However, for individual project evaluations these data may need to be linked…)

17 Acquiring Student Data
Find out district requirements for requesting data, including:
- whom specifically to request data from,
- timelines for requesting data,
- the form(s) in which data are available, and
- when data may no longer be available!
You want aggregated data in electronic form, not linked to individual teachers:
- Realize this may require a long lead time, so request data early.
- Secure cooperation during the proposal design phase.
- You may want to create an MOU to ensure access to data.

18 Selecting Teachers On Which To Aggregate Data
- Create a list of all teachers who participated in the MSP at any time, regardless of when they participated.
- You may also want to request the same data for a comparison group of matched teachers who did not participate in the MSP project.
- Request data without IDs.
- Sum data across teachers, classes, schools, and districts to report project-level data.
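The final step, rolling teacher-level totals up to project level, is a straightforward sum. The records below are hypothetical, standing in for the per-teacher totals (with no IDs) returned by the district.

```python
# One record per participating teacher, already summed across that
# teacher's classes and grade levels (hypothetical values)
teacher_records = [
    {"taught": 61, "tested": 59, "basic": 49, "proficient": 41},
    {"taught": 85, "tested": 80, "basic": 70, "proficient": 52},
]

# Sum each quantity across all teachers (and, in practice, across
# schools and districts) to get the project-level numbers for the APR
project = {key: sum(r[key] for r in teacher_records)
           for key in ("taught", "tested", "basic", "proficient")}
print(project)  # {'taught': 146, 'tested': 139, 'basic': 119, 'proficient': 93}
```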

19 Notes on Selecting Teachers
In some instances, districts have needed lists of students' names in order to access their achievement data because it was not available by teacher.

20 End Acquiring Student Achievement Data
What have been your experiences trying to obtain student achievement data? What roadblocks have you faced? How have you overcome them?

21 Reporting Student Achievement Data
Goal: To report uniform student achievement data for all participating MSP teachers in the project.
Reporting Issues:
- Data availability
- Dealing with missing data: report on all teachers even if test data are missing or unavailable, and note the number of teachers and students for whom data are missing.
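The missing-data guidance above can be sketched as follows: every participating teacher stays in the report, and the teachers and students without test data are counted separately. The records are hypothetical, with `tested=None` standing in for missing data.

```python
# One record per participating teacher; tested=None marks missing data
records = [
    {"teacher": "A", "students": 61, "tested": 59},
    {"teacher": "B", "students": 45, "tested": None},  # district supplied no data
    {"teacher": "C", "students": 50, "tested": 48},
]

# Report on ALL teachers, but note how much data is missing
missing = [r for r in records if r["tested"] is None]
teachers_reported = len(records)
teachers_missing = len(missing)
students_missing = sum(r["students"] for r in missing)

print(teachers_reported, teachers_missing, students_missing)  # 3 1 45
```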

22 What Should Be Reported
Information aggregated to the project level, including:
- Number of districts
- Number of schools
- Number of MSP teachers
- Number of students of those MSP teachers
- Number of students with assessment data in the relevant subject (math or science)
- Number of students who scored at basic or above
- Number of students who scored at proficient or above

23 Wrap-Up
What we covered:
- Understanding GPRA, PART, and the APR, and how they relate to each other.
- Reviewing the MSP performance indicators and what data must be reported.
- Issues around obtaining student achievement data:
  - Knowing in what form to request the data
  - Knowing from whom, when, and how to get student achievement data
- How to report student achievement results.

24 Wrap-Up Continued Questions?