Making the Most of Your Data: Strategies for Evaluating Your Program Greta Colombi, NDTAC; and John McLaughlin, ED.


2 How Do You Use Your Title I, Part D, Data?

Do you:
 Look at your Consolidated State Performance Reports (CSPRs) each year?
 Incorporate what you know into applications for funding?
 Make statements about how these tables do not really reflect what goes on in the classroom?
 Attempt to explain them to your stakeholders?
 Put results away in your files?
 Hope that the CSPR will go away before next year?

3 Agenda

 Why using data is important
 How data can be used
 Strategies for using data
 Activities

4 Outcomes

 Consider ways you can better use your data
 Consider what technical assistance (TA) and support you could provide to your subgrantees to encourage data use

5 Data Tell You and Your Programs...

 Where you’ve been
 Where you are
 Where you’re going
 How to get there

6 Barriers to Using Data

 Your program’s data are handled separately from your program.
 Your program’s culture does not focus on data.
 Gathering data is perceived to be a waste of time.
 Staff lack adequate orientation and training in the value of data collection.
 Staff have had negative experiences with data collection.
 Staff are not aware of other programs’ successes in using data.
 Staff think that data are collected “just for the State or the Feds.”

7 Working With What You Have

The same data you collect and report…
 Demographics of students (race/ethnicity, age, and gender)
 Academic performance in reading and mathematics
 Academic and vocational outcomes
 Student and facility counts
 Program spending*

…can be used for…
 Accountability
 Program promotion/marketing
 Program management and improvement

* States do not report program spending within the CSPR but should have this information at hand.

8 Functions of Data

 Help us identify whether goals are being met (accountability)
 Tell our departments, delegates, and communities about the value of our programs and the return on their investments (marketing)
 Help us replace hunches and hypotheses with facts concerning the changes that are needed (program management and improvement)
 Help us identify root causes of problems (program management and improvement)

9 Program Components by Data Function

Student demographics
 Accountability: Are the appropriate students being served?
 Marketing/promotion: How are you addressing the needs of diverse learners?
 Improvement: Which students need to be better served?

Student achievement
 Accountability: Are students learning?
 Marketing/promotion: What are students learning? What gains have they made?
 Improvement: How can we help improve student achievement?

Student academic outcomes
 Accountability: Are students continuing their education?
 Marketing/promotion: What are students doing to continue their education?
 Improvement: How can we help improve student academic outcomes?

10 Strategies for Improving Data Use

 Accountability
–Monitor data based on national benchmarks
–Set State benchmarks and monitor program performance
 Program Improvement
–Evaluate the program (formative and/or summative)
 Marketing
–Develop and distribute State/program report cards

11 Data Use Improvement Activities

 Meet with SEA staff (data, programmatic) to analyze the data you have
 Request disaggregated data from subgrantees/programs to improve data use
 Communicate findings with subgrantees
 Support subgrantee/program evaluations
–Communicate allowability of funding
–Include evaluation requirements in program applications/formal agreements
–Provide TA at conferences/meetings or during monitoring on the benefits of using data and how to do so

12 Data Use Model

Two components of the model:
 Data Analysis
 Program Improvement

Model developed by the National Reporting System for Adult Education support project at the American Institutes for Research.

13 [Diagram of the Data Use Model; image not included in the transcript]

14 Focusing the Question

Break the question into inputs and outcomes:
 Inputs (what your program contributes):
–Teacher education, experience, and full-time/part-time status
–Instructional curriculum
–Hours of instruction per week
 Outcomes (indicators of results):
–Improved posttest scores
–Completed high school
–Earned GED credentials

15 Focusing/Refining the Question (1)

Poor question: Does my program have good teachers?
Good question: Does student learning differ by teacher?
Better question: Do students in classes taught by instructors who have more teaching experience have higher test scores than those taught by new teachers?

16 Focusing/Refining the Question (2)

Poor question: Is my program helping the neediest students?
Good question: Are students who are below grade level learning less in my program than other students?
Better question: Are students who are below grade level advancing levels at the same rate as students at grade level?
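The "better question" here can be answered with only a few lines of analysis. The sketch below is a minimal, hypothetical example: the student records and the below/at-grade-level flags are invented for illustration, and only the Python standard library is used.

```python
# Illustrative sketch (invented data): comparing advancement rates for
# students who entered below grade level vs. at grade level.
from statistics import mean

# Each record: (entered_below_grade_level, advanced_a_level)
students = [
    (True, True), (True, False), (True, True), (True, False),
    (False, True), (False, True), (False, False), (False, True),
]

def advancement_rate(records, below_grade_level):
    """Share of students in the subgroup who advanced a level."""
    subgroup = [advanced for below, advanced in records
                if below == below_grade_level]
    return mean(subgroup)  # True counts as 1, False as 0

below_rate = advancement_rate(students, below_grade_level=True)
at_rate = advancement_rate(students, below_grade_level=False)
print(f"Below grade level: {below_rate:.0%}; at grade level: {at_rate:.0%}")
```

With real data, the same comparison tells you directly whether the neediest students are keeping pace, which is exactly what the refined question asks.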

17 Developing a Data Analysis Plan

 What data do you already have that will answer your question?
 What additional data, if any, will you need to answer your question?
 If you need additional data, where will you get them?
 What’s your plan for obtaining the data you need, and what’s your timeline?

18 Analyzing and Interpreting Your Data

 Keep your original question in mind.
 Look for patterns and differences.
 Use appropriate data and statistics.
 Disaggregate the data.
 Consider data quality.
 Draw appropriate conclusion(s).
 Remember serendipity: Be open to the unexpected.
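Two of the habits above, disaggregating the data and considering data quality, can be illustrated in a few lines. The subgroups and gain scores below are invented for illustration; only the standard library is used.

```python
# Minimal sketch (invented data): check data quality, then disaggregate
# posttest gains by subgroup before drawing conclusions.
from collections import defaultdict
from statistics import mean

# Each record: (subgroup, posttest_gain); None marks a missing score.
records = [
    ("age 14-15", 12), ("age 14-15", 8), ("age 14-15", None),
    ("age 16-17", 5), ("age 16-17", 7), ("age 16-17", 6),
]

# Data quality: how many records are unusable?
missing = sum(1 for _, gain in records if gain is None)
print(f"Missing posttest scores: {missing} of {len(records)}")

# Disaggregate: average gain by subgroup, skipping missing values.
gains = defaultdict(list)
for subgroup, gain in records:
    if gain is not None:
        gains[subgroup].append(gain)

for subgroup, values in sorted(gains.items()):
    print(f"{subgroup}: mean gain {mean(values):.1f}")
```

An overall average can hide exactly the differences between subgroups that this step is meant to surface, so disaggregating before interpreting is the safer habit.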

19 Presenting Your Data (1): Frequency Tables

 Show numbers and percentages by category, e.g., ethnicity, gender, age.
 Provide crosstabulations, e.g., ethnicity by age.
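A frequency table and a crosstabulation of the kind described here can be built with the standard library's `collections.Counter`. The student records and category labels below are hypothetical.

```python
# Sketch (invented records): frequency table and crosstabulation.
from collections import Counter

# Each record: (ethnicity, age_band) -- illustrative categories only.
students = [
    ("Hispanic", "14-15"), ("Hispanic", "16-17"), ("Black", "14-15"),
    ("White", "16-17"), ("Black", "16-17"), ("Hispanic", "14-15"),
]

# Frequency table: counts and percentages by one category.
by_ethnicity = Counter(ethnicity for ethnicity, _ in students)
total = len(students)
for ethnicity, count in by_ethnicity.most_common():
    print(f"{ethnicity}: {count} ({count / total:.0%})")

# Crosstabulation: counts by two categories at once.
crosstab = Counter(students)  # keys are (ethnicity, age_band) pairs
print(crosstab[("Hispanic", "14-15")])  # Hispanic students aged 14-15 → 2
```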

20 Presenting Your Data (2): Graphs and Charts

 Bar chart: Categories are displayed as bars, e.g., students by age.
 Pie chart: Each slice shows a proportion of the whole, e.g., each ethnicity’s share of total students.
 Line chart: Data form a continuous measure or trend (not categories), e.g., posttest scores over time.

21 Presenting Your Data (3): Communication Strategies

 Article by an education reporter in the local newspaper
 Public meeting or news conference presented by the superintendent or dean
 Newsletters
 Special events, e.g., an open house
 Web sites
 Annual report

22 Conclusion

 Using your data can help you (1) ensure accountability, (2) make program improvements, and (3) market your program.
 The key is to look at your data, involve others, and consider how you can use the data.
 You can integrate data use activities into regularly scheduled activities.

23 Activities

 Activity 1
–Discuss possible reasons for the scenarios included in the handout
–Consider: (1) What additional data would you need to better understand the root cause of the problem? (2) What could be done about it?

24 Activities

 Activity 2
–Review the scenario
–Consider how you could address the issue(s) in regularly scheduled activities