Developing an Evaluation of Professional Development
Overview session: Critical elements of professional development planning and evaluation
Information and materials mentioned or shown during this presentation are provided as resources and examples for the viewer's convenience. Their inclusion is not intended as an endorsement by the Regional Educational Laboratory Southeast or its funding source, the Institute of Education Sciences (Contract ED-IES-12-C-0011). In addition, the instructional practices and assessments discussed or shown in these presentations are not intended to mandate, direct, or control a State's, local educational agency's, or school's specific instructional content, academic achievement system and assessments, curriculum, or program of instruction. State and local programs may use any instructional content, achievement system and assessments, curriculum, or program of instruction they wish.
Webinar 1: Outline
- Brief review of lessons learned – Drs. Barbara Foorman & Russell Gersten
- Best practices in design and planning – Dr. Sharon Koon
- Logic models – Dr. Barbara Foorman
- Designing evaluation questions – Dr. Adrea Truckenmiller
- Question & answer session
LESSONS LEARNED – Drs. Barbara Foorman & Russell Gersten
What are the key components of professional development evaluation?
- Teacher buy-in
- Ongoing support
- Enhancing existing practices
- Emphasis on coaching, for sustainability
- Fidelity
- Feedback that is supportive, not evaluative
- Institutional commitment
How did you work with schools to identify a feasible comparison group?
- Define the comparison condition
- Use a meaningful curriculum
- Use a waitlist control
- Measure what is happening in the comparison group
What types of outcome measures do you use?
- Multiple aspects of the construct (e.g., oral & written vocabulary)
- Nationally normed measures
- Student outcomes targeted by the PD
- Outcomes important to stakeholders
- Observation
- Surveys to further improve PD & instruction
What else would you highlight from your projects involving professional development?
- Relate teacher knowledge to classroom practice
  - Use teacher knowledge as a moderator
  - Can inform future PD & instruction
- Lesson design
- Role play/practice
- Relate classroom activity to student gains
- Measure fidelity
- Choose reliable measures
- Carefully consider the elements of the PD you choose

"It's a critical field for ambitious people to begin to tackle." – Russell Gersten on professional development evaluation
BEST PRACTICES IN DESIGN AND PLANNING – Dr. Sharon Koon
Guiding framework in this webinar series: The What Works Clearinghouse (WWC)
- The WWC assesses the quality of effectiveness research.
- High-quality effectiveness research rules out other causes of effects.
- The WWC evidence standards:
  - Were developed by panels of national experts
  - Focus on causal validity of the study design
  - Are applied to each study by certified reviewers
Source: http://ies.ed.gov/ncee/wwc/multimedia.aspx?sid=18
Distinction between WWC evidence standards and additional qualities of strong studies
WWC design considerations for assessing effectiveness research:
- Two distinct groups: a treatment group (T) and a comparison group (C).
- For randomized controlled trials (RCTs), low attrition for both the T and C groups.
- For quasi-experimental designs (QEDs), baseline equivalence between T and C groups.
- Contrast between T and C groups measures the impact of the treatment.
- Valid and reliable outcome data used to measure the impact of the treatment.
- No known confounding factors.
- Outcome(s) not overaligned with the treatment.
- Same data collection process (same instruments, same time/year) for the T and C groups.
Source: http://www.dir-online.com/wp-content/uploads/2015/11/Designing-and-Conducting-Strong-Quasi-Experiments-in-Education-Version-2.pdf
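The attrition and baseline-equivalence checks above lend themselves to simple calculations. The sketch below is illustrative only: the sample counts and scores are hypothetical, and the WWC applies its own published attrition boundaries and a Hedges' g adjustment rather than the plain pooled-SD difference shown here.

```python
from statistics import mean, stdev

def attrition(randomized_t, analyzed_t, randomized_c, analyzed_c):
    """Overall and differential attrition rates, as proportions."""
    overall = 1 - (analyzed_t + analyzed_c) / (randomized_t + randomized_c)
    differential = abs((1 - analyzed_t / randomized_t) -
                       (1 - analyzed_c / randomized_c))
    return overall, differential

def baseline_difference(t_scores, c_scores):
    """Baseline T-C difference in pooled standard deviation units
    (a simple Cohen's d; the WWC uses a Hedges' g variant)."""
    nt, nc = len(t_scores), len(c_scores)
    pooled_sd = (((nt - 1) * stdev(t_scores) ** 2 +
                  (nc - 1) * stdev(c_scores) ** 2) / (nt + nc - 2)) ** 0.5
    return (mean(t_scores) - mean(c_scores)) / pooled_sd

# Hypothetical study: 100 teachers randomized per arm, some lost to follow-up.
overall, diff = attrition(100, 88, 100, 94)
print(f"overall attrition: {overall:.1%}, differential: {diff:.1%}")
```

Both quantities matter: a study can have low overall attrition but still be biased if one group loses many more participants than the other.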
Distinction between WWC evidence standards and additional qualities of strong studies (cont.)
Additional qualities of strong studies:
- Pre-specified and clear primary and secondary research questions.
- Generalizability of the study results.
- Clear criteria for research sample eligibility and matching methods.
- Sample size large enough to detect meaningful and statistically significant differences between the T and C groups, overall and for specific subgroups of interest.
- Analysis methods that reflect the research questions, design, and sample selection procedures.
- A clear plan to document the implementation experiences of the T and C conditions.
Source: http://www.dir-online.com/wp-content/uploads/2015/11/Designing-and-Conducting-Strong-Quasi-Experiments-in-Education-Version-2.pdf
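The sample-size bullet above can be made concrete with a standard power calculation. This is a textbook normal-approximation sketch for a two-arm comparison of means, not a WWC requirement; a real PD evaluation would also need to account for the clustering of students within teachers or schools, which this simple formula ignores.

```python
import math
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate sample size per group needed to detect a standardized
    mean difference `effect_size` with a two-sided test (normal approximation)."""
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)  # ~1.96 for alpha = 0.05
    z_beta = z(power)           # ~0.84 for 80% power
    return math.ceil(2 * (z_alpha + z_beta) ** 2 / effect_size ** 2)

# Smaller expected effects require much larger samples per group:
for d in (0.80, 0.50, 0.20):
    print(f"d = {d}: n = {n_per_group(d)} per group")
```

The steep growth as the expected effect shrinks is why "large enough to detect meaningful differences" has to be settled at the planning stage, not after the data are in.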
Our purpose
Integrate WWC standards and other best-practice recommendations into this webinar series so that participants have the resources necessary to design an evaluation that is aligned with best practices and supports causal conclusions. Links to sources are provided at the bottom of each slide where applicable.
Types of WWC group design studies
- Randomized controlled trials (RCTs)
  - At baseline, the study sample is randomly divided into groups, such as a T group and a C group (e.g., "business as usual").
  - The study groups are offered access to an intervention or not (or can be offered the intervention at a later time in a "wait list control" design).
- Quasi-experimental designs (QEDs)
  - Study groups are not created by randomly dividing the study sample into the groups.
  - The researcher constructs groups that are similar at baseline and are offered (or were already offered) an intervention or not.
- Regression discontinuity designs (RDDs)
  - Applicable when a continuous "scoring" rule is used to assign the intervention to study units (e.g., districts, schools, or teachers).
  - Units with scores below a pre-set cutoff value are assigned to the T group and units with scores above the cutoff value are assigned to the C group, or vice versa.
Sources: http://ies.ed.gov/ncee/wwc/multimedia.aspx?sid=18, http://ies.ed.gov/ncee/wwc/documentsum.aspx?sid=231
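The designs differ chiefly in how units end up in the T and C groups. As a rough illustration (the school names, scores, and cutoff below are all hypothetical), random assignment for an RCT and cutoff-based assignment for an RDD can be sketched as:

```python
import random

schools = [f"school_{i:02d}" for i in range(1, 11)]  # hypothetical study units

# RCT: the study sample is randomly divided into T and C at baseline.
rng = random.Random(42)  # fixed seed so the split is reproducible
shuffled = rng.sample(schools, k=len(schools))
rct_t, rct_c = shuffled[:len(schools) // 2], shuffled[len(schools) // 2:]

# RDD: a continuous score and a pre-set cutoff assign units deterministically,
# e.g. schools scoring below the cutoff receive the intervention.
scores = {s: rng.uniform(0, 100) for s in schools}  # hypothetical scores
CUTOFF = 50.0
rdd_t = [s for s in schools if scores[s] < CUTOFF]
rdd_c = [s for s in schools if scores[s] >= CUTOFF]

print(len(rct_t), len(rct_c))  # equal-sized arms: 5 5
```

A QED sits between the two: neither chance nor a scoring rule forms the groups, so the researcher must demonstrate baseline equivalence after the fact.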
Key RCT features examined and WWC ratings
Sources: http://ies.ed.gov/ncee/wwc/multimedia.aspx?sid=18, http://ies.ed.gov/ncee/wwc/documentsum.aspx?sid=19
Key QED (and high-attrition RCT) features examined and WWC ratings
Sources: http://ies.ed.gov/ncee/wwc/multimedia.aspx?sid=18, http://ies.ed.gov/ncee/wwc/documentsum.aspx?sid=19
Other issues for both RCTs and QEDs
- Avoiding confounds
  - A confound is a component completely aligned with only one study condition (e.g., in a school-level RCT, only one school is assigned to each of the study groups).
  - It is impossible to separate the effect of the intervention from the effect of the confounding factor, so the impact cannot be attributed solely to the intervention.
  - Studies with confounds are rated Does Not Meet Standards.
- Using eligible outcomes
Source: http://ies.ed.gov/ncee/wwc/multimedia.aspx?sid=18
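The "one school per condition" confound described above can even be checked mechanically. The sketch below uses hypothetical data and flags any condition whose entire group is a single cluster; real reviews of course consider other confounds (different test dates or different personnel across conditions, for example) that no simple check can catch.

```python
from collections import defaultdict

def one_cluster_confounds(assignments):
    """Flag conditions whose entire group is a single cluster (e.g., school),
    so the cluster effect cannot be separated from the treatment effect."""
    clusters_by_condition = defaultdict(set)
    for cluster, condition in assignments:
        clusters_by_condition[condition].add(cluster)
    return [cond for cond, clusters in clusters_by_condition.items()
            if len(clusters) == 1]

# Hypothetical school-level study: one school per condition -> confounded.
study = [("school_A", "T"), ("school_A", "T"), ("school_B", "C")]
print(one_cluster_confounds(study))  # ['T', 'C']
```

With two or more clusters per condition the function returns an empty list, and the cluster effect is at least partially separable from the treatment effect.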
WWC Reporting Guide for Study Authors: Study characteristics
Describe the study's:
- Intervention condition
- Comparison condition
- Setting
- Participants
Source: http://ies.ed.gov/ncee/wwc/pdf/reference_resources/wwc_gsa_v1.pdf
WWC Reporting Guide for Study Authors: Study design and analysis
Describe the study's:
- Sample formation
- Outcome measures
- Analytic approach
- Statistical adjustments
- Method for addressing missing data
Source: http://ies.ed.gov/ncee/wwc/pdf/reference_resources/wwc_gsa_v1.pdf
WWC Reporting Guide for Study Authors: Study data
Provide pre-intervention data on the baseline and analytic samples, and post-intervention data on the analytic sample.
Source: http://ies.ed.gov/ncee/wwc/pdf/reference_resources/wwc_gsa_v1.pdf
GOLD STANDARD EXAMPLES – Dr. Adrea Truckenmiller
Example study #1 (Vaughn et al., 2009)
- Goal of PD: implement a reading comprehension practice in grade 8 social studies classrooms
- PD: 2-day workshop; in-class support; problem solving
- Evaluation hypothesis: students would improve on social studies units, with negligible effects on general social studies & reading comprehension outcomes
- Design: RCT with 5 teachers in 2 schools; each teacher had treatment & control sections
- Outcomes: implementation checklist; proximal measures of social studies & reading; distal measures of reading comprehension
- Fidelity: observation checklist & plan
Example study #2 (Kim, Olson, Scarcella, Kramer, Pearson, van Dyk, Collins, & Land, 2011)
- Goal of PD: learn to integrate cognitive strategies & process writing – the Pathway Project
- PD: 6 full-day sessions; 5 after-school sessions; coaching
- Evaluation hypothesis: teachers' participation in the Pathway Project for 1 full year would improve student performance
- Design: teacher RCT specifically targeted to mainstreamed EL students (score of 4 or 5 on the CELDT & mid-basic on the state test)
- Outcomes: implementation checklist; intermediate measures of on-demand writing; state standards test of ELA
- Fidelity: observation checklist & plan
Example study #2: Logic model
Participants:
(1) Teachers: English teachers in grades 6 to 12
(2) Students: Latino ELLs scoring at or above intermediate on the CELDT
Pathway PD:
(1) Teachers learn to use the Reader's and Writer's Tool Kit
(2) Teachers learn to use pretest, on-demand writing results and Pathway materials to teach a cognitive strategies approach to text-based analytical writing
(3) Coaches help teachers integrate analytical writing strategies into the ELA curriculum
Outcomes:
- Teachers use cognitive strategies in reading and writing activities in their classroom lessons
- Student performance on the on-demand writing assessment improves
- Student performance on the California Standards Test in English language arts improves
- Students pass the CAHSEE, graduate from high school, and pursue postsecondary education
LOGIC MODELS – Dr. Barbara Foorman
Using a Logic Model for Professional Development Planning
A logic model exercise completed by the planning team can:
- Lead to consensus on final program outcome(s)
- Determine inputs (resources) to consider in the process
- Determine outputs (activities to complete and expected products)
- Determine short-, medium-, and long-term outcomes
  - In PD projects, these are typically teacher & student outcomes
- Consider assumptions and external factors that may impact planning and delivery of the PD effort
Link to introductory videos and resources for the REL Pacific Education Logic Model Application: http://relpacific.mcrel.org/resources/elm-app
REL Southeast Summer Reading Camp Logic Model video (16:04–1:14:50): http://interact.mysdhc.org/p5vz1eumv53/
Using a Logic Model for Professional Development Planning
Possible members of the planning team:
- School and district administrators leading the PD effort
- One or more of the PD trainers
- One or more of those scheduled to provide follow-up support (coaches, lead teachers, etc.)
- One or more experienced teacher participants
- One or two less experienced teacher participants
- Research team member(s) charged with evaluating the PD effort
Sample Logic Model
- Resources: research-based guidance on reading strategies; curriculum coordinators; elementary school teachers
- Activities: develop and provide teaching guides and sample lessons; conduct teacher workshops
- Outputs: number and type of guides and sample lessons for each grade level; number of participants per workshop and total hours each participant attended the workshop
- Short-term outcomes: increased teacher knowledge of reading content; increased teacher knowledge of multiple instructional strategies to teach reading
- Mid-term outcomes: increased teacher use of alternative strategies for presenting reading content; increased student understanding of reading content; increased positive student attitudes toward learning
- Long-term outcomes: increased student reading test scores
Source: http://files.eric.ed.gov/fulltext/ED544779.pdf
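Teams that want to keep a logic model under version control, validate it, or render it in reports can also capture it as plain data. A minimal sketch of the sample model above (the structure, not the exact wording, is the point here, and the key names are our own):

```python
# The sample logic model, captured as a simple dictionary so the planning
# team can version it, check it for gaps, or render it in reports.
logic_model = {
    "resources": ["research-based guidance on reading strategies",
                  "curriculum coordinators", "elementary school teachers"],
    "activities": ["develop and provide teaching guides and sample lessons",
                   "conduct teacher workshops"],
    "outputs": ["number and type of guides and sample lessons per grade level",
                "number of participants and hours attended per workshop"],
    "short_term_outcomes": ["increased teacher knowledge of reading content"],
    "mid_term_outcomes": ["increased teacher use of alternative strategies"],
    "long_term_outcomes": ["increased student reading test scores"],
}

# Quick completeness check: every component should have at least one entry.
missing = [name for name, items in logic_model.items() if not items]
print("complete" if not missing else f"missing components: {missing}")
```

A check like this is a convenience, not a substitute for the planning-team exercise; the consensus-building steps on the previous slides are where the model's content comes from.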
Using a Logic Model for Professional Development Planning
Blank logic model template:
- http://rel-se.fcrr.org/_/documents/pd_webinar/blank_logic_model_template.pdf
- http://rel-se.fcrr.org/_/documents/pd_webinar/blank_logic_model_template.docx
DESIGNING EVALUATION QUESTIONS – Dr. Adrea Truckenmiller
Content of evaluation questions
- What is the intended/stated goal of the PD?
- What is the goal of the professional development for the participants?
- What are the direct effects of the PD on the intended outcomes?
Example evaluation questions
- Did the PD impact participants' efficacy with and knowledge of the topic?
- After sufficient PD, was there a related change in the participants' teaching practices?
- Did the PD have an indirect impact on students' performance in the targeted area?
- Did the PD have an indirect impact on students' performance in the generalized skill?
Refining evaluation questions
- Use words like 'impact', 'association', 'relation', and 'descriptive'
- Make questions SMART: Specific, Measurable, Achievable, Relevant, Timely
Questions & Answers
Homework:
- Meet with your PD team
- Complete the logic model
- Bring questions to sessions 2–5
Developing an Evaluation of Professional Development
- Webinar 2: Going Deeper into Planning the Design – 1/14/2016, 2:00 pm
- Webinar 3: Going Deeper into Identifying & Measuring Target Outcomes – 1/15/2016, 2:00 pm
- Webinar 4: Going Deeper into Analyzing Results – 1/19/2016, 2:00 pm
- Webinar 5: Going Deeper into Interpreting Results & Presenting Findings – 1/21/2016, 2:00 pm