Using Data for Program Quality Improvement Stephanie Lampron, Deputy Director

2 Session Overview
  • The Title I, Part D Data Collection
  • Importance of Data Quality and Data Use
  • Actively Using Data for Program Improvement

3 The Title I, Part D Data Collection

4 What Are Title I, Part D and NDTAC?
  • Title I, Part D (TIPD) of the Elementary and Secondary Education Act, as amended in 2001
    − Subpart 1: State Agency
    − Subpart 2: LEA
  • National Evaluation and Technical Assistance Center for the Education of Children and Youth Who Are Neglected, Delinquent, or At-Risk (NDTAC)

5 NDTAC's Mission Related to Data and Evaluation
  • Develop a uniform evaluation model for State Education Agency (SEA) Title I, Part D, programs
  • Provide technical assistance (TA) to States in order to increase their capacity for data collection and their ability to use those data to improve educational programming for N & D youth

6 Background: NDTAC’s Role in Reporting and Evaluation
  • Specific to Title I, Part D, collections
    − TA prior to collection: webinars, guides, and tip sheets
    − TA during collection: data reviews, direct calls, and summary reports for ED
    − Data analysis and dissemination: GPRA, the Annual Report, and online Fast Facts
  • Related TA
    − Data use and program evaluation

7 TIPD Basic Reporting and Evaluation Requirements
Where do the requirements come from?
  • Elementary and Secondary Education Act, as amended in 2001 (No Child Left Behind)
    − Purpose of Title I, Part D (Sec. 1401)
    − Program evaluation for Title I, Part D (Subpart 3)
How does ED use the data?
  • Government Performance and Results Act (GPRA) reporting
  • Federal budget requests for Title I, Part D
  • Federal monitoring
  • Data provided to NDTAC for dissemination

8 Collection Categories for TIPD in the Consolidated State Performance Report (CSPR)
  • Types/number of students and programs funded
  • Demographics of students within programs
  • Academic and vocational outcomes
  • Pre- and post-testing results in reading and math

9 Title I, Part D in Pennsylvania (State Agency [S1] and Local Agency [S2], three school years)
  • Number of programs (US): …,712 / 2,889 / 2,689
  • Number of programs (PA): …
  • Number of students served (US, S1): 125,456 / 109,146 / 106,…
  • Number of students served (US, S2): … / … / …,591
  • Number of students served (PA, S1): 1,643 (1%) / 1,189 (1%) / 1,123 (1%)
  • Number of students served (PA, S2): 24,863 (7%) / 24,562 (7%) / 26,510 (7%)
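
The percentages in parentheses are Pennsylvania's share of the corresponding national counts. A minimal sketch of that arithmetic, using the first-year Subpart 1 figures above (rounding to a whole percent matches the slide):

```python
# Pennsylvania's share of students served nationally under Subpart 1 (first year shown above).
us_s1_students = 125_456
pa_s1_students = 1_643

share = pa_s1_students / us_s1_students
print(f"PA share of Subpart 1 students: {share:.1%}")  # about 1.3%, reported as 1% on the slide
```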

10 Local Education Agency (S2) Academic Outcomes * data are preliminary

11 Long-Term Students' Improvement in Reading (Subpart 2) * data are preliminary

12 Long-Term Students' Improvement in Math (Subpart 2) * data are preliminary

13 Data Quality & Data Use

14 Functions of Data
  • Help us identify whether goals are being met (accountability)
  • Tell our departments, delegates, and communities about the value of our programs and the return on their investments (marketing)
  • Help us replace hunches and hypotheses with facts concerning the changes that are needed (program management and improvement)
  • Help us identify root causes of problems and monitor the success of changes implemented (program management and improvement)

15 Why Is Data Quality Important?
You need to TRUST your data as it informs:
  • Funding decisions
  • Technical assistance (TA) needs
  • Student/facility programming

16 What Is “High Data Quality”?
If data quality is high, the data can be used in the manner intended because they are:
  • Accurate
  • Consistent
  • Unbiased
  • Understandable
  • Transparent

17 What Data Are the Most Useful?
Useful data are those that can be used to answer critical questions and are:
  • Longitudinal
  • Actionable (current, user-friendly)
  • Contextual (comparable, part of bigger picture)
  • Interoperable (matched, linked, shared)
Source: Data Quality Campaign

18 Should You Use Data of Lower Quality?
YES!! You can use these data to:
  • Become familiar with the data and readily identify problems
  • Know when the data are ready to be used more broadly, or how they can be used
  • Incentivize and motivate others

19 Data Quality Support Systems
  • Ensure systems, practices, processes, and/or policies are in place
    − Understand the collection process
    − Provide/request TA in advance
    − Develop relationships
    − Develop multilevel verification processes
    − Track problems over time
    − Use the data (even when problematic)
    − Link decisions (funding, hiring, etc.) to data evidence
  • Indicate needs to others

20 Using Data Actively

21 Essential Steps Related to Data Use
  1. Identify problem or goal to address
  2. Explore & analyze existing data
  3. Develop and implement change
    − Set targets and goals
  4. Develop processes to monitor and review data

22 Step 1: Identify Concerns or Goals
Identify your level of interest:
  • State
  • Facility/school
  • Classroom
Define issues, priorities, or goals:
  • Upcoming decisions
  • State or district goals or initiatives
  • Information from needs assessments (or conduct one)
Identify how the data will be used and what questions to answer
Resource: NDTAC Program Administration Planning Guide, Tool 3, on needs assessments

23 Program Components by Data Function
Student demographics
  − Program accountability: Are the appropriate students being served?
  − Program marketing/promotion: How are you addressing the needs of diverse learners?
  − Program improvement: Which students need to be better served?
Student achievement
  − Program accountability: Are students learning?
  − Program marketing/promotion: What are students learning? What gains have they made?
  − Program improvement: How can we help improve student achievement?
Student academic outcomes
  − Program accountability: Are students continuing their education?
  − Program marketing/promotion: What are students doing to continue their education?
  − Program improvement: How can we help improve student academic outcomes?

24 Focusing the Questions
Break the question into inputs and outcomes:
  • Inputs (what your program contributes):
    − Teacher education, experience, full-time/part-time
    − Instructional curriculum
    − Hours of instruction per week
  • Outcomes (indicators of results):
    − Improved posttest scores
    − Completed high school
    − Earned GED credentials

25 Focusing/Refining the Question
Weak question: Does my school have good teachers?
Good question: Does student learning differ by teacher?
Better question: Do students in classes taught by instructors who have more teaching experience have higher test scores than those taught by new teachers?
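
One way to make the "better" question answerable is to compute each student's pre/post test-score gain and compare average gains across teacher-experience groups. Below is a minimal sketch, assuming a hypothetical student-level file; every column name and value is illustrative.

```python
import pandas as pd

# Hypothetical student-level extract; in practice these records would come from
# the SEA/LEA data system. All names and values are illustrative.
students = pd.DataFrame({
    "student_id": [1, 2, 3, 4, 5, 6],
    "teacher_years_experience": [1, 2, 2, 8, 10, 12],
    "pretest_score": [410, 395, 430, 405, 420, 400],
    "posttest_score": [420, 400, 445, 440, 455, 430],
})

# Compute each student's pre/post gain and group teachers into "new" vs. "experienced".
students["gain"] = students["posttest_score"] - students["pretest_score"]
students["teacher_group"] = students["teacher_years_experience"].apply(
    lambda years: "new (<3 yrs)" if years < 3 else "experienced (3+ yrs)"
)

# Compare average gains across the two groups.
print(students.groupby("teacher_group")["gain"].agg(["mean", "count"]))
```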

26 Step 2: Explore Existing Data
  • Locate the data you do have
  • Put the data in a useful format
    − Trends, comparisons
  • What story are the data telling you?
    − What jumps out at you about the data?
    − Are the data telling you something that is timely and actionable?
    − What questions arise? What are the data not telling you that you wish you knew?
    − What data could help answer those questions?
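
As a small illustration of putting data in a useful format, the sketch below turns yearly counts into a simple trend table. The year labels are generic, and the counts reuse the Pennsylvania Subpart 1 figures shown earlier; this is only a sketch of the idea.

```python
import pandas as pd

# Yearly counts put into trend form (labels generic; counts echo the PA Subpart 1 figures).
trend = pd.DataFrame({
    "school_year": ["Year 1", "Year 2", "Year 3"],
    "students_served": [1_643, 1_189, 1_123],
})

# Year-over-year change makes the direction and size of the trend easy to see.
trend["change_from_prior_year"] = trend["students_served"].diff()
print(trend)
```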

27 Local Education Agency (S2) Academic Outcomes

28 LEA 1: Comparison Data (1): Percent of Students Earning HS Course Credits (State Average vs. LEA Average)

29 Comparison Data (2): Context
Facility | Per-Pupil Expenditure | Earning HS Course Credits | FT Teachers | Entering Below Grade Level | % LEP
Facility A | $500 | 70% | 5 | 65% | 25%
Facility B | $450 | 40% | 5 | 10% | 40%
Facility C | $550 | 20% | 5 | 91% | 70%
Facility D | $600 | 33% | 5 | 50% | 30%
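
A minimal sketch of working with these context columns, assuming the table above is loaded as a small data frame (the column names are illustrative, not from an actual collection):

```python
import pandas as pd

# Facility-level outcome and context, transcribed from the table above.
facilities = pd.DataFrame({
    "facility": ["A", "B", "C", "D"],
    "per_pupil_expenditure": [500, 450, 550, 600],
    "pct_earning_hs_credits": [70, 40, 20, 33],
    "ft_teachers": [5, 5, 5, 5],
    "pct_entering_below_grade_level": [65, 10, 91, 50],
    "pct_lep": [25, 40, 70, 30],
})

# Sort by the outcome of interest while keeping the context columns alongside it,
# so lower performers can be read in light of whom they serve and what they spend.
print(facilities.sort_values("pct_earning_hs_credits", ascending=False))
```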

30 Longitudinal data: more context

31 Do You Know Enough?
Sometimes the data will lead to more questions and a need for more information:
  • Compare to other LEAs' facilities
  • Use student-level data and disaggregate
  • Look at monitoring information and applications
  • Collect additional information: surveys, interviews
*Keep data quality in mind

32 Step 3: Implement Improvement Plan
  • Implement new programming, change, etc.
  • Set benchmarks, performance targets
    − In terms of your priorities, where do you want your subgrantees and facilities to be in one year? Two years? Three years?
    − What performance benchmarks might you set to measure progress along the way?
    − How will you know when to target a subgrantee or facility for technical assistance? At what point might you sound the alarm?
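
As a sketch of the last two questions, a State might flag any facility that falls below a chosen performance target as a candidate for technical assistance. The target value, column names, and data below are illustrative assumptions.

```python
import pandas as pd

# Illustrative performance target: percent of students earning HS course credits.
TARGET_PCT_EARNING_CREDITS = 50

# Hypothetical annual facility results (values reuse the comparison table above).
results = pd.DataFrame({
    "facility": ["A", "B", "C", "D"],
    "pct_earning_hs_credits": [70, 40, 20, 33],
})

# Facilities below the target become candidates for technical assistance follow-up.
results["flag_for_ta"] = results["pct_earning_hs_credits"] < TARGET_PCT_EARNING_CREDITS
print(results[results["flag_for_ta"]])
```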

33 Step 4: Develop Processes for Reviewing Data
Keep using it!
  • Monitor change and compare against benchmarks
  • Review data in real time
  • Share it and discuss it

34 Keep in Mind
  • Data use is not easy*
  • Data should be a flashlight, not a hammer*
  • Change takes time; set realistic goals
  • “No outcome” can be a useful finding
  • Aggregated data can usually be shared
*Source: Data Quality Campaign

35 Data Capacity Exists! (Data Quality Campaign, 2011 Report)
10 Essential Elements of Longitudinal Data Systems (# of States):
  1. A unique student identifier (52)
  2. Student-level enrollment, demographic, and program participation information (52)
  3. The ability to match individual students’ test records from year to year to measure academic growth (52)
  4. Information on untested students and the reasons why they were not tested (51)
  5. A teacher identifier system with the ability to match teachers to students (44)
  6. Student-level transcript data, including information on courses completed and grades earned (41)
  7. Student-level college readiness test scores (50)
  8. Student-level graduation and dropout data (52)
  9. The ability to match student records between the P–12 and postsecondary systems (49)
  10. A state data audit system assessing data quality, validity, and reliability (52)

36 Next Step: Data Use (DQC-2011)
  1. Link State K-12 data systems with early learning, postsecondary education, workforce, social services, and other critical agencies.
  2. Create stable, sustained support for robust state longitudinal data systems.
  3. Develop governance structures to guide data collection, sharing, and use.
  4. Build state data repositories that integrate student, staff, financial, and facility data.
  5. Implement systems to provide all stakeholders with timely access to the information they need while protecting student privacy. (2 States)
  6. Create progress reports with individual student data that provide information educators, parents, and students can use to improve student performance.
  7. Create reports that include longitudinal statistics on school systems and groups of students to guide school-, district-, and state-level improvement efforts.
  8. Develop a purposeful research agenda and collaborate with universities, researchers, and intermediary groups to explore the data for useful information.
  9. Implement policies and promote practices, including professional development and credentialing, to ensure that educators know how to access, analyze, and use data appropriately.
  10. Promote strategies to raise awareness of available data and ensure that all key stakeholders, including state policymakers, know how to access, analyze, and use the information. (23 States)

37 Accessible Data – N or D Related
Title I, Part D data:
  • ED Data Express
  • NDTAC State Fast Facts pages
  • Title I, Part D, Annual Report
  • Civil Rights Data Collection (district and school)

38 Accessible Data – N or D Related
  • OSEP Data Collection
  • Youth Risk Behavior Survey (CDC)
  • OJJDP Juvenile Justice Surveys/Data Book

39 Resources
  • NDTAC reporting and evaluation resources: delinquent.org/nd/topics/index2.php?id=9
  • Data Quality Campaign: Data for Action 2011—Empower With Data

40 Questions?
Stephanie Lampron, NDTAC Deputy Director
NDTAC Data Team:
  • Dory Seidel
  • Liann Seiter