
1 CREP: The Center for Research in Educational Policy
Best Practices in Program Evaluation: Strategies for Increasing Survey Response Rates
Presentation for the CREATE Conference, October 8, 2009, Louisville, KY

2 Background
– Established in 1989
– State of TN Center of Excellence
– Interim Director: Dr. Marty Alberg
– Staff includes:
  – 23 research/research support
  – 5 statisticians
  – 8 GA/student workers
  – 12 administrative/accounting/technical support

3 What do we do?
Educational research and evaluation in a wide variety of areas in PK–12 education:
– Scientifically-based research
– Program evaluation
– Formative evaluation
– Data collection training
– Instrument development
– Leadership academies

4 Who do we work with?
– Federal government
– State departments of education
– Regional Education Laboratories
– Higher education institutions
– Evaluation organizations
– School districts
– Individual schools
– Community-based organizations
– Program developers

5 Project Areas
– Literacy and early literacy
– Charter schools
– Supplemental Educational Services
– Educational technology
– Teacher education and mentoring
– Principal training/development
– Urban school reform
– Psychometrics

6 Supplemental Educational Services
– CREP has been involved with SES evaluations since 2004
– Conducted multi-year evaluations in 13 states
– Worked with the Center on Innovation and Improvement on:
  – Evaluating Supplemental Educational Service Providers: Suggested Strategies for States
  – Improving SES Quality: State Approval, Monitoring, and Evaluation of SES Providers

7 Why Evaluate?
Meet federal requirements: states must remove providers from the approved list if they fail to:
– Increase students' achievement for 2 consecutive years
– Provide services consistent with applicable federal, state, and local health, safety, and civil rights requirements

8 Why Evaluate?
– Formalize the accountability system
– Communicate plan, expectations, and results
– Identify strengths and weaknesses
– Base improvement planning on objective data
– Document successes as supportive evidence

9 Figure 1. Components of a Comprehensive SES Evaluation Plan: Student Achievement (state tests and additional tests), Service Delivery, Customer Satisfaction (provider, district coordinator, principal/liaison, teacher, and parent surveys), and an Overall Provider Assessment.

10 Possible Research Questions
Some of the questions currently used to address issues concerning:
– Provider effectiveness
– NCLB compliance
– District- and state-level implementation

11 Possible Research Questions
– What are the effects of provider services on students in reading/language arts and mathematics?
– Do districts make SES available to eligible students?
– Do schools and providers work together to integrate services to meet the needs of eligible SES students?
– Are providers communicating regularly with stakeholders?
– Are providers adapting tutoring services to align with each school's and/or classroom's curriculum?
– Are providers aligning curriculum with local and state academic standards?
– Are providers offering services to students designated as special education or English Language Learners (ELL)?
– What are stakeholders' (non-providers') overall assessments of provider performance?
– What are providers' experiences with and assessments of SES interventions?

12 Why Survey?
– Stakeholder perceptions are vital in understanding implementation
– Federal guidelines strongly encourage parental feedback
– Survey results can inform decisions when achievement results are insignificant or negligible
– Provide a fuller picture of the quality of service and implementation

13 Paper-Based and Online Surveys
SES evaluations: stakeholder feedback
– Paper-based survey for parents
– Online survey for district coordinators
– Online survey for principals/site coordinators
– Online survey for teachers
– Online survey for providers

14 An SES Evaluation Contains:
An overall statewide assessment of SES:
– Aggregated stakeholder results
– Student achievement results for SES providers (reading/language arts and mathematics)
Individual provider assessments:
– Stakeholder results
– Student achievement results (reading/language arts and mathematics)

15 Rubric of Overall Evaluation of Provider Effectiveness
Each outcome is rated at one of five levels: Insufficient Information, Below Standards, Marginal Quality, Acceptable, or Above Standards.

1. Student Achievement
– Insufficient Information: insufficient sample size, non-significant results, or no achievement data
– Below Standards: Students have not shown gains related to tutoring; results are statistically significant and favor non-SES students
– Marginal Quality: There is evidence that some tutored students are making achievement gains; the overall comparison is statistically significant, with an effect size up to +.17
– Acceptable: There is evidence that some tutored students are making achievement gains; the overall comparison is statistically significant, with an effect size from +.18 to +.25
– Above Standards: There is evidence that some tutored students are making substantive achievement gains; the overall comparison is statistically significant, with an effect size greater than +.25

2. Communication
– Below Standards: Provider communication weak or nonexistent
– Marginal Quality: Provider communication inconsistent
– Acceptable: Provider is adequately communicating with key stakeholders
– Above Standards: Provider regularly and frequently communicates with key stakeholders

3. Instructional Plans
– Below Standards: Instructional plans not geared to student needs or reinforcement of the regular academic program
– Marginal Quality: Provider inconsistently planned instruction geared to student needs or reinforcement of the regular academic program
– Acceptable: Provider made attempts to plan instruction geared to student needs or reinforcement of the regular academic program
– Above Standards: Provider instructional plans geared to student needs or reinforcement of the regular academic program

4. Local and State Standards
– Below Standards: Provider services not in alignment with local and state academic standards
– Marginal Quality: Provider services inconsistently aligned with local and state academic standards
– Acceptable: Provider services sometimes aligned with local and state academic standards
– Above Standards: Provider services in alignment with local and state academic standards

5. Special Education and ELL Students
– Below Standards: Provider did not offer accommodations to special education or ELL students
– Marginal Quality: Provider inconsistently offered accommodations to special education or ELL students
– Acceptable: Provider sometimes offered accommodations to special education or ELL students
– Above Standards: Provider offered accommodations to special education or ELL students

6. Assessment of Provider Overall
– Below Standards: Dissatisfaction with provider overall
– Marginal Quality: Inconsistent satisfaction with provider overall
– Acceptable: Some satisfaction with provider overall
– Above Standards: Satisfaction with provider overall

For outcomes 2–6, a rating of Insufficient Information is assigned when no usable data are available.
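The student achievement row of this rubric amounts to a small decision rule over statistical significance and effect size. The sketch below is a minimal, hypothetical illustration of that rule, not CREP's actual scoring procedure; the function name, parameters, and the handling of boundary values and missing data are assumptions made for illustration only.

```python
def rate_student_achievement(effect_size, significant, favors_non_ses=False,
                             sufficient_data=True):
    """Map a provider's achievement comparison onto the rubric levels.

    Hypothetical illustration of the slide 15 thresholds:
      up to +.17        -> Marginal Quality
      +.18 to +.25      -> Acceptable
      greater than +.25 -> Above Standards
    """
    if not sufficient_data or effect_size is None or not significant:
        # Insufficient sample size, no achievement data, or non-significant results
        return "Insufficient Information"
    if favors_non_ses or effect_size <= 0:
        # Statistically significant result favoring non-SES students
        return "Below Standards"
    if effect_size <= 0.17:
        return "Marginal Quality"
    if effect_size <= 0.25:
        return "Acceptable"
    return "Above Standards"


# Example: a significant comparison with an effect size of +.21 rates as Acceptable
print(rate_student_achievement(effect_size=0.21, significant=True))
```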

16 Parent Paper Survey Distribution
During the 2008–2009 school year:
– Over 31,000 paper parent surveys were distributed in 9 states
– In 8 states, surveys were printed in English on one side and Spanish on the other
– Surveys were packaged and sent in bundles to SES directors to deliver to SES schools

17 Parent Paper Survey Distribution
Traditional distribution method:
1. Surveys with instructions are printed, packaged, and sent to SES districts
2. Between 40 and 60 survey packets are sent, boxed, for the district coordinator to deliver to SES schools
3. Principals/site coordinators distribute surveys to SES students
4. Students bring them home, and parents complete them
5. Students bring them back to school
6. After several weeks, principals/site coordinators mail surveys to CREP in postage-paid envelopes provided by CREP

18 Parent Paper Survey Distribution
During the 2008–2009 school year, 2 states opted for a different delivery mode:
– One district chose to mail surveys directly to the homes of a sample of parents; the other districts in the state used the traditional method
– One state opted for district coordinators to address packet envelopes directly with parents' names, deliver the packets to schools with a cover letter from the district, and collect the surveys from the schools

19 2008–2009 Parent Response Rates
Overall response rates for 2008–2009, based on the number of surveys sent to schools:
– Range was from 8% to 38%
– Median of 17%
– An increase from the previous year's median (12%)

20 Paper Survey Distribution
States that distributed traditionally:
– Response rates ranged from 10% to 33%
– The median rate was 17%
The district that mailed directly to a sample of parents:
– Response rate was 8%; the overall response rate for that state was 11%
The state in which surveys were addressed to parents and delivered by district coordinators:
– Response rate was 38%
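For reference, these response rates are simple proportions of surveys returned over surveys distributed, summarized across states by the median. The snippet below is a minimal sketch of that arithmetic using made-up placeholder counts; it is not CREP's analysis code.

```python
from statistics import median

# Hypothetical per-state counts: (surveys returned, surveys sent to schools)
state_counts = {
    "State A": (448, 5600),    # placeholder numbers, roughly an 8% rate
    "State B": (1020, 6000),   # roughly 17%
    "State C": (1520, 4000),   # roughly 38%
}

rates = {state: returned / sent for state, (returned, sent) in state_counts.items()}

for state, rate in rates.items():
    print(f"{state}: {rate:.0%}")

# Summary figures of the kind reported for 2008-2009
print(f"Range: {min(rates.values()):.0%} to {max(rates.values()):.0%}")
print(f"Median: {median(rates.values()):.0%}")
```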

21 Challenges in Reaching Parents
Timing:
– Tight survey window due to standardized testing
– Contractual agreements can delay the survey process
Distribution:
– Communicating with schools and districts can be challenging
– Difficulty in determining the number of parents:
  – It can be hard to determine how many students were served at each school prior to mailing surveys
  – Response rates may change once achievement data are received

22 Challenges in Reaching Parents
Lessons learned:
– Communication is the key:
  – Between evaluator and state and districts
  – Between state and districts
  – Between districts and schools
– Pre-coding information, if possible, may increase response rates
– Involvement of the district coordinator is crucial
– Earlier distribution is better

23 Challenges in Reaching Parents
For consideration:
– Sample surveys for non-participating parents
– Student surveys
– Focus groups
– Online surveys for parents

24 Online Survey Distribution
Online surveys are used for the other SES stakeholders.
During the 2008–2009 school year, CREP disseminated login information for:
– 9 SES state directors
– 491 SES providers
– 204 SES district coordinators
– 588 SES school personnel (sent to district coordinators)

25 Online Survey Distribution
User identification codes and passwords are needed to access the online system.
Login information is sent in the spring via email.
Test emails are sent to:
– Verify the correct participant and email address
– Introduce the study and inform the participant

26 Online Survey Distribution
Login information is sent to:
– State SES directors
– SES providers
– SES district coordinators
Login information for school personnel is sent to district coordinators to forward to the SES schools.
– Periodic reminders are sent in the weeks following

27 2008–2009 Online Response Rates
Response rates vary among stakeholder groups:
– SES providers: ranged from 45% to 100%; median of 79%
– SES district coordinators: ranged from 47% to 100%; median of 79%
– SES principals/site coordinators: ranged from 15% to 88%; median of 35%
– SES teachers: responses representative of 7% to 50% of SES schools; median of 24%

28 Online Survey Response Rates
Wide variance across stakeholder groups:
– Providers and district coordinators are the most likely to respond
– Principals/site coordinators and teachers are less likely to respond:
  – Communication goes through the district
  – They have less time

29 Challenges in Online Response Rates
Lessons learned:
– Communication is the key, between evaluator and districts and schools
– Reminders are essential
– Involvement of the district coordinator is crucial
– Earlier distribution of login information is better:
  – Would allow for reminders to school personnel
  – Would allow feedback to state and district coordinators concerning lack of representation from school personnel

30 The Process of Evaluating SES is Most Effective When:
– Districts and school personnel are invested in the process
– Communication between all stakeholders is strong and meaningful
– Feedback regarding the impact of providers is timely and understandable

