
1 The Research Design
Research for Better Schools, Philadelphia, PA
Jill Feldman, Ph.D., Director of Evaluation

2 What research questions will we ask about MSRP impact?
1. Does MCLA affect core subject teachers' knowledge and use of research-based literacy strategies?
2. What are the separate and combined effects of MCLA and Read 180 on students' reading achievement levels, especially for students identified as struggling readers?
3. What are the separate and combined effects of MCLA and Read 180 on students' achievement in core subjects, especially for students identified as struggling readers?

3 What outcome measures will we use?
1. Iowa Test of Basic Skills (ITBS): Vocabulary, Fluency, Comprehension
2. TCAP: Reading, Social Studies, Science, Mathematics
3. Gateway and End of Course Assessments: ELA, Mathematics, and Science

4 What research questions will we ask about MSRP implementation?
1. To what degree do the implemented MCLA & R180 treatments match the intended program standards and features?
2. What contextual district- and school-level factors may be influencing the implementation of MCLA & R180?
3. How do the professional development events, materials, or structures present in the control schools compare to what is present in the treatment schools?

5 Research Design for MCLA
4 matched pairs of schools (N=8) randomly assigned to treatment (MCLA) or control (no MCLA) condition
Content area teachers in cohort 1 to participate in MCLA in Years 1 and 2
Control group teachers (cohort 2) to participate in MCLA in Years 3 and 4
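A minimal sketch of this pair-wise random assignment is shown below, assuming hypothetical school names and an arbitrary seed; the slides do not describe the actual MSRP assignment procedure.

```python
# Minimal sketch of pair-wise random assignment, assuming hypothetical
# school names; this is NOT the actual MSRP assignment code.
import random

# Four matched pairs of schools (N=8); names are placeholders.
matched_pairs = [
    ("School A1", "School A2"),
    ("School B1", "School B2"),
    ("School C1", "School C2"),
    ("School D1", "School D2"),
]

rng = random.Random(2006)  # fixed seed so the lottery is reproducible
for first, second in matched_pairs:
    # Within each matched pair, one school is randomly assigned to the
    # MCLA treatment; its partner becomes the no-MCLA control.
    treatment = rng.choice((first, second))
    control = second if treatment == first else first
    print(f"{treatment}: MCLA (treatment)  |  {control}: no MCLA (control)")
```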

6 MCLA: Random Assignment of Schools

7 MCLA: Exploring Efficacy
Attempts to address questions about whether or not MCLA can work
Depends upon rapid turnaround of data collected
Relies upon formative feedback to guide program revisions
Requires close collaboration among project stakeholders
–To develop measures
–To share information and data
–To communicate regularly about changes and challenges
–To troubleshoot and cooperatively address challenges

8 Research Design for Read 180™
Random assignment of "eligible" students enrolled at 8 SR schools, where eligibility means:
–No prior participation in READ 180™
–Two or more grade levels behind in reading
–Scores in bottom quartile on state assessment (TCAP)
READ 180™ is the treatment
Counterfactual (business as usual) is the control
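A sketch of how this eligibility screen and student-level lottery could be implemented follows. The three criteria come from the slide; the field names (prior_read180, grade_levels_behind, tcap_percentile) and the even treatment/control split are assumptions.

```python
# Sketch of the READ 180 eligibility screen and student lottery. The three
# criteria are from the slide; field names and the even split are assumed.
import random

def is_eligible(student: dict) -> bool:
    """Eligible only if: no prior READ 180 participation, two or more grade
    levels behind in reading, and a bottom-quartile TCAP score."""
    return (not student["prior_read180"]
            and student["grade_levels_behind"] >= 2
            and student["tcap_percentile"] <= 25)

def lottery(students: list[dict], seed: int = 0) -> tuple[list, list]:
    """Randomly split the eligible pool into READ 180 (treatment) and
    business-as-usual (control) groups."""
    eligible = [s for s in students if is_eligible(s)]
    random.Random(seed).shuffle(eligible)
    half = len(eligible) // 2
    return eligible[:half], eligible[half:]
```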

9 Read 180: Random Assignment of Students

10 Read 180: Exploring Effectiveness
Attempts to address questions about whether or not Read 180 will work…
Provides evidence about what happens when R180 is implemented "off the shelf" (without formative evaluation support)
Requires MCS to set aside the local need for feedback to address questions of importance to the field
Establishes a one-way firewall between MCS and RBS

11 Please review the safety card in the seat pocket…
1. Balance local knowledge of students' needs within the identified "eligible pool" without creating selection bias
2. Address high rates of student mobility
3. Accurately describe the counterfactual
4. Obtain parental consent (and students' assent) to administer the ITBS
5. Design procedures to prevent crossover
6. Deal with (inevitable) startup delays

12 Air Traffic Control: Did Random Assignment Work?

13 Are the student groups comparable?
Students eligible for READ 180™: N = 2,277
Total students in 8 SR schools: N = 6,170
Students eligible as % of total: 36.9%
No differences in race, gender, ethnicity, or poverty level between conditions
Higher % of ELLs in control group (87 of 1,337 students, or 6.5%) than in R180™ group (35 of 940 students, or 3.7%)
Higher % of Sp Ed 8th graders in R180™ group (28.2%) vs. control (20.9%)
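The slide does not say how these group comparisons were tested; the sketch below shows one conventional check, a two-proportion z-test, applied to the reported ELL counts.

```python
# One conventional way to test a difference like the ELL imbalance above:
# a two-proportion z-test. The slide does not specify which test was used.
from math import sqrt

def two_proportion_z(x1: int, n1: int, x2: int, n2: int) -> float:
    """z statistic for H0: p1 == p2, using the pooled proportion."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# ELL counts from the slide: 87 of 1,337 (control) vs. 35 of 940 (READ 180).
z = two_proportion_z(87, 1337, 35, 940)
print(f"z = {z:.2f}")  # about 2.9, i.e. a difference unlikely by chance
```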

14 What, how, and from whom should data be collected?
Use multiple measures and methods
–Interview developers, instructors, coaches, & principals
–Surveys of teacher knowledge and attitudes
–Focus group discussions with teachers
–Evaluator observations of PD sessions
–Evaluator observations of classroom implementation
Use data to challenge/confirm findings from single sources
Share findings with key stakeholders to determine whether:
–data collected are appropriate to support decision making
–evaluation findings reflect actual experiences
–revisions to the logic model, IC map, and/or instruments are needed

15 Helen/Bob’s piece here…

16 The Flight Plan: The MCLA Program Logic Model

17 Memphis Content Literacy Academy Evaluation Logic Model

Inputs: Funding, staff, curriculum resource center, facilities, incentives, research materials

Activities
Principals
–Attend four three-hour principal fellowship sessions each year for two (or four?) years
–Participate in motivational, recruitment, and celebratory events
–Discuss MCLA at faculty meetings
–Conduct walkthrough observations
–Provide opportunities for teacher collaboration
–Allocate space for CRC materials
Teachers
–Attend # weekly MCLA training
–Develop and implement 8 CAPs per year?
–Meet with coaches for feedback to improve implementation of MCLA strategies
–Integrate use of leveled texts to support development of content literacy among struggling readers
Students
–Use MCLA strategies to read/react to content-related text (independently? In collaborative groups? Neither? Both?)

Outputs
Principals
–# hours of Principal Fellowship participation
–# of MCLA events attended
Teachers
–# of hours of MCLA training attended
–# hours of coaching (contacts)
–# of CAPs implemented? Observed? Videotaped?
–# of new lesson plans integrating literacy in content area lessons
–# and type of materials checked out of CRC
Students
–# classes taught by teachers participating in MCLA
–# MCLA strategies students learn
–# (freq?) of MCLA strategy use

Short-term Outcomes
Principals
–Awareness of and interest in staff implementation of MCLA concepts and strategies
Teachers
–Increased knowledge of MCLA strategies
–Improved preparedness to use research-based literacy strategies to teach core academic content
–Increased use of direct, explicit instruction to teach research-based comprehension, fluency, and vocabulary strategies in content area classes
–Integrated use of MCLA strategies to support development of content literacy
Students
–Increased familiarity with and use of MCLA strategies when engaging with text
–Increased internalization of literacy strategies
–Increased interest in school/learning

Long-term Outcomes (higher quality teaching & student achievement)
Principals
–Improved school climate
–School-wide plans include focus on content literacy
–Improved instructional leadership
Teachers
–Increased effectiveness supporting students' content literacy development
–Continued collaboration among community of teachers to develop and implement CAPs
Students
–Improved reading achievement and content literacy:
–10% increase in students scoring proficient in Reading/LA and other subject areas of TCAP
–Mean increase of five NCEs on ITBS (comprehension? vocab?)
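The student outcome target above is stated in Normal Curve Equivalents (NCEs). As a point of reference, an NCE rescales a percentile rank onto an equal-interval 1-99 scale via NCE = 50 + 21.06z; the sketch below illustrates what a five-NCE gain looks like near the middle of the distribution. This is the standard NCE definition, not code from the MSRP evaluation.

```python
# Reference sketch: converting percentile ranks to Normal Curve Equivalents
# (NCE = 50 + 21.06 * z). Standard definition, not MSRP evaluation code.
from statistics import NormalDist

def percentile_to_nce(percentile: float) -> float:
    """Map a percentile rank (0 < p < 100) onto the 1-99 NCE scale."""
    z = NormalDist().inv_cdf(percentile / 100.0)
    return 50.0 + 21.06 * z

# A five-NCE gain near the median is roughly a 50th-to-59th percentile move:
print(round(percentile_to_nce(50), 1))  # 50.0
print(round(percentile_to_nce(59), 1))  # ~54.8
```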

18 Defining what will be evaluated: Developing the MCLA Innovation Configuration (IC) Map
Involve diverse groups of stakeholders
–The development team
–The implementation team (MCS administrators & coaches)
–Experienced users
–Evaluators
Identify major components of MCLA
Provide observable descriptions of each component
Describe a range of implementation levels

19 MCLA: The Conceptual Framework

20 Wheels Up: Resisting Premature Use of "Auto Pilot"
With the IC map guiding development, the following measures were designed to collect data about MCLA implementation:
Surveys
–Teacher knowledge about & preparedness to use MCLA strategies
–Teacher demographic characteristics
–Teachers' MCLA Feedback
Interviews
–Principals, coaches, development team, and MCS administrators
Teacher focus group discussions

21 Operationally defining components: “Job Definition”

22 Aligning the IC Map and Instrument Development: “Job Definition” – Teacher Survey

23 “Job Definition” - Principal Interviews

24 Where the rubber hits the runway… Classroom Implementation

25 Operationally defining components: Implementation of Lesson Plans

26 Implementation of lesson plans: Collecting classroom observation data

27

28 Please remain seated with your seatbelts fastened…
Timely turnaround of data summaries
Team meetings to debrief/interpret findings
Testing what you think you "know":
–Productive (& challenging) conversations
–Data-driven decision making
–Taking action
–Following up (ongoing formative evaluation feedback)

29 Elizabeth’s piece here

30 Complimentary Refreshments: CRC Materials

31

32 Percentage Distribution of Planned Coaching Activities Logged in Year 1 (N = 4,233 entries logged)

Activity                              Frequency   Percentage
Coach's administrative tasks              1,358         32.2
Conferencing with teachers                  716         17.0
Observation                                 698         16.5
School administrative tasks                 339          8.0
Collaborative teacher support               330          7.8
Coach's professional development            303          7.2
Assisting teachers in class                 138          3.3
Striving Readers evaluation tasks           138          3.3
Helping teachers prepare                     71          1.7
Modeling                                     59          1.4
Videotaping/other                            73          1.7
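A distribution like the one above can be tallied directly from raw log entries; a minimal sketch follows, assuming each entry is simply an activity label (the actual log format is not shown on the slide).

```python
# Minimal sketch: tallying a percentage distribution from coaching-log
# entries, assuming each entry is an activity label. Log format is assumed.
from collections import Counter

def activity_distribution(entries):
    """Return (activity, count, percentage) rows, most frequent first."""
    counts = Counter(entries)
    total = len(entries)
    return [(activity, n, 100.0 * n / total)
            for activity, n in counts.most_common()]

# e.g. 1,358 "Coach's administrative tasks" entries out of 4,233 logged
# gives 100 * 1358 / 4233, about 32%, as in the table's top row.
```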

33 Ground Transportation: The Coaching Role
Trust between coach and teacher(s) is critical:
To provision of CAP implementation support
–Pre-conference meeting
–CAP observation
–Co-teaching; modeling
–Videotapes for use to train teachers, coaches, evaluators
–Post-observation conference
To effective and strategic selection of CRC & supplemental resources

34 Avoiding Wind Shear…
Team's unwavering commitment to helping teachers support the success of struggling adolescent readers
The sum > the individual parts

35 …and we have the data to prove it!

36 Across grade levels, the picture is the same…

37 8th Graders' Reading Levels

38 School-wide comparisons with schools nation-wide

39

