Who We Are
Naomi Hupert and Tomoe Kanaya
Education Development Center/Center for Children and Technology, a non-profit education research institute. www.edc.org/CCT.

Some things we’ll touch on in this presentation:
– Why we think talking about Reading First evaluation is important
– What we’ve learned (informally) about states’ Reading First evaluations
– Some interesting challenges we’ve encountered in planning a Reading First evaluation
– Some early trends we are seeing in our evaluation work

Federal guidance for Reading First evaluation*
1. Implementation Evidence
2. Achievement Gains
3. Program Effectiveness
4. Reducing Students Reading Below Grade Level Statewide
* From Guidance for the Reading First Program, US Department of Education, April 2002

What we’ve been hearing about Reading First and evaluation
Through informal conversations with evaluators, technical assistance providers, RF directors, and others, we’ve collected the following information:
– States are postponing their evaluation activities until their programs are underway
– States are finding they are not equipped to collect and analyze the information necessary for their evaluations
– Many states intend to identify outside evaluators
– While most states’ RF proposals describe summative evaluations only, our conversations indicate states are interested in formative evaluation as well

A few things to consider when putting an evaluation plan into action:
– What approach best meets the needs of the implementation plan: summative only, or both formative and summative?
– What kinds of data are available? (And will you need to supplement them with additional data collection activities?)
– What is the data collection and reporting capacity of your participating schools and districts?

Summative Evaluation
– Focuses on program impact
– Measures success based on criteria set forth by the federal government and a state’s Reading First proposal
Keep in mind: “impact” can mean more than change in student achievement; it can also include change in classroom practice, change in teacher knowledge, and change in school and district approaches to reading.

Formative Evaluation
– Primary goal is to support and inform program growth, development, and improvement
– Provides an ongoing feedback loop from schools to state Reading First program staff
– Focuses on implementation
– Requires a team effort: evaluators, program staff, and participants form a partnership

Where to Find Data
Some possible sources of data to inform the evaluation process:
– Literacy assessment tools
– State-developed early screening or assessment tools
– Publisher-generated measures of student literacy skill
– Student data collected by school, district, or state:
   Attendance
   Library use
   Referral to special education
   English Language Learner status in relation to literacy acquisition
   Student mobility

Other sources of data
Interviews with relevant educators:
– Teachers
– Principals
– Superintendents
– Coaches/professional development staff
– Literacy specialists
Observations of relevant activities:
– Professional development events
– Classroom activities
– Coaching

Data Collection: Options for data collection assistance
In school:
– Reading First coordinators
– Teachers
– School-level administration
– District-level administration
Outside school:
– Outside evaluator
– Graduate students/program collaboration with preservice programs
– Parents/school volunteers

Some interesting challenges we’ve encountered in planning a Reading First evaluation
The challenges can be grouped into two categories:
1. The data-analysis perspective
2. Other things…

Our Goal: To monitor students’ progress over time ACCURATELY
1. We need a large number of students (sample size).
2. We need to “measure” as many contextual variables as possible.
3. We need the ability to follow individual students over time.
4. We need to be patient.

1. Sample Size
When the sample size is small, the results may be due to just a few unique individuals. The more students we have, the more confident we can be that the results are real and generalizable.
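To make the sample-size point concrete, here is a small sketch (hypothetical numbers, using the standard normal-approximation formula for a proportion) of how the uncertainty around an observed percentage shrinks as more students are assessed:

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a proportion p
    estimated from n students (normal approximation)."""
    return z * math.sqrt(p * (1 - p) / n)

# An observed 60% "at benchmark" rate is far less certain when it
# comes from 30 students than when it comes from 3,000 students.
small_sample = margin_of_error(0.60, 30)    # about +/- 17.5 points
large_sample = margin_of_error(0.60, 3000)  # about +/- 1.8 points
```

The same observed rate is roughly ten times more precise with the larger sample, which is why classroom-sized groups alone cannot support strong claims.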

2. Contextual Variables
Students’ improvement may be due to the various contexts that surround the child rather than the Reading First initiative. Therefore, it is important to “measure” and account for all potential contextual variables that could affect literacy. Examples:
– Student-teacher ratio
– Exact age of student
– Baseline performance of student
– Other initiatives that may support literacy skill development

3. Following Students
– Measuring classroom- or school-level performance is not enough.
– It is difficult to determine how students are improving if we cannot follow individual students at each testing.
– We cannot determine the nature of the improvement (e.g., which contextual variables matter for which particular students) without following individual students over time.

Example: Classrooms A and B both improved 10% after one year in Reading First.
Initial conclusion: Classrooms A and B performed equally well.
Conclusion after following students:
– In Classroom A, all girls improved 20% while boys showed 0% improvement.
– In Classroom B, every student improved 10%.
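The arithmetic in this example can be checked with a short sketch (the per-student records below are invented for illustration):

```python
# Hypothetical per-student gains (percentage points) after one year.
classroom_a = [("girl", 20), ("girl", 20), ("boy", 0), ("boy", 0)]
classroom_b = [("girl", 10), ("girl", 10), ("boy", 10), ("boy", 10)]

def mean_gain(students):
    """Classroom-level average gain: the only number visible
    without individual student records."""
    return sum(gain for _, gain in students) / len(students)

def gain_by_group(students):
    """Average gain per subgroup: visible only when individual
    students can be followed over time."""
    groups = {}
    for group, gain in students:
        groups.setdefault(group, []).append(gain)
    return {g: sum(v) / len(v) for g, v in groups.items()}

# Both classrooms look identical in the aggregate (10 points each),
# but the subgroup breakdown tells a very different story.
```

This is exactly why the slide argues that classroom-level averages alone cannot reveal the nature of an improvement.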

4. Patience, Patience, Patience!
– Changes and improvements do not happen overnight, or over one testing cycle.
– Small changes at an early age can eventually turn into large differences later in life.

Evaluation planning: other things to keep in mind
– Student mobility
– Issues around student and teacher privacy
– Issues around data collection and data entry

A few more things:
– Too many interviews, questionnaires, and surveys
– Students assessed too often
– Who administers assessments to students
– The challenge of creating an evaluation design that is rigorous yet compatible with a Reading First program design

More things…
– Developing partnerships with teachers and schools
– Making evaluation “safe”
– The formative loop

Our work in New Mexico
– Year 1: 32 schools in 10 districts (approximately 30 schools will be added in Year 2 and 30 more in Year 3)
– Year 1: 491 K–3 teachers administered DIBELS assessments during November 2003 and February 2004
   7,097 students were assessed in November 2003
   7,586 students were assessed in February 2004
   4,924 students for whom we have complete and matched data from both assessment periods
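The drop from roughly 7,000 students assessed in each period to 4,924 with complete, matched data reflects the record-linkage step: only students assessed in both periods can show individual growth. A minimal sketch of that matching (the student IDs and scores below are hypothetical):

```python
# Hypothetical student_id -> score records for each assessment period.
november = {"S001": 12, "S002": 30, "S004": 25}
february = {"S001": 22, "S003": 18, "S004": 31}

# Keep only students assessed in BOTH periods; pair their scores
# so each matched student contributes a (fall, winter) observation.
matched_ids = sorted(set(november) & set(february))
matched = {sid: (november[sid], february[sid]) for sid in matched_ids}
```

Students who appear in only one period (here S002 and S003, standing in for mobile or absent students) drop out of the matched set, which is how mobility shrinks a longitudinal sample.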

Assessment analysis plan
– Fall, winter, and spring assessments
– First assessment may be inaccurate/unreliable
– Look at movement of individual students from one support recommendation level to another (e.g., from Benchmark to Strategic Support)
– Examine demographic data in relation to student assessment findings
– Match DIBELS findings with school and/or district program and implementation
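Tracking movement between support recommendation levels amounts to tallying transitions across assessment periods. A sketch of that tally (the level names follow DIBELS conventions; the student IDs and assignments are hypothetical):

```python
from collections import Counter

# Hypothetical support recommendation per student at each period.
fall   = {"S001": "Intensive", "S002": "Strategic", "S003": "Benchmark"}
winter = {"S001": "Strategic", "S002": "Benchmark", "S003": "Benchmark"}

# Count (fall_level, winter_level) pairs for students seen in both periods.
transitions = Counter(
    (fall[sid], winter[sid]) for sid in fall if sid in winter
)
```

A table of these transition counts shows at a glance how many students moved toward less intensive support, stayed put, or slipped back.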

Assessment: Some trends
Nearly all districts have implemented some kind of professional development to support DIBELS and TPRI:
– Administering the DIBELS and TPRI was nearly universal by the first assessment period
– Using the Palm Pilot to administer the DIBELS and TPRI was adopted by nearly every RF teacher
– Using the DIBELS and TPRI data in some way to inform classroom instruction, or to raise questions about past practice, is rapidly spreading to all RF schools

Assessment: Trends Professional development appears to be more targeted to teacher need in schools with a dedicated instructional or literacy coach, or with a geographically close cadre member.

Assessment: Trends
– Teachers are changing what they are doing in classrooms
– Teachers are looking at their students’ assessment data
– Teachers are making instructional decisions based on assessment data
– Professional development providers (cadre members, coaches, or staff developers) are tailoring their presentations to address teachers’ questions about using assessment data

Professional Development
The majority of Reading First teachers had access to a wide range of professional development activities provided by:
– The state
– District-level staff
– School-level staff
– Outside consultants/experts in literacy
– Publishers/vendors

Professional Development: Trends
– In some schools/districts, the systematic use of assessments and the expectation that assessment data will guide instruction have begun to shape professional development
– Teachers are asking for more information about how to teach those students most in need of support
– Schools are reaching out for help in addressing the needs of their most struggling readers

Professional Development: Implications
– Interest in cross-school and cross-district collaboration and information sharing
– Interest among principals in meeting with other principals
– Interest among Reading First coordinators in regional meetings to share different approaches to Reading First with other schools
– Interest among sites in increased professional development offerings from the state, particularly among smaller districts, and preferably offered regionally

Role of NM Cadre Member (literacy expert assigned to an NMRF school): Some Trends
– Geographic proximity greatly strengthened a coach/cadre member’s role in a school
– Schools with an active cadre member/coach appeared to be more likely to discuss using assessment data to inform classroom practice

School-Level Leadership
– Principals express high levels of support but report a lack of adequate knowledge about the program
– Planning time: the majority of schools have instituted new or extended planning times for Reading First teachers, either school-wide or across grade levels (depending on school size)
– Study groups: many schools have instituted a range of activities that engage teachers in discussion about key issues of Reading First

School-Level Leadership: Professional development at the classroom level
A combination of:
1) support from the principal,
2) time to plan, and
3) time to engage in pedagogical discussion
supports an atmosphere of ongoing professional development focused on students’ reading development.

Issues particular to states with large numbers of very small school districts
– Equity in program supports
– Funding issues in relation to school size, location, and resource availability
– Access to professional development opportunities, resources, information, and exposure to new ideas

Some technical assistance resources for Reading First:
– NCREL
– SEDL
– RMC Research Corporation:

Education Development Center, Inc. / Center for Children and Technology
To learn more about our organization, go to our web site: –
Contact information: – Naomi Hupert