Wednesday, October 28, 2:30 – 3:30 PM

Presentation transcript:

Valid and Reliable Instruments for Educator Preparation Programs (VARI-EPP) Student Teaching Form (VE-ST Form): Implementation Team OCTEO Update
Wednesday, October 28, 2:30 – 3:30 PM
VARI-EPP Coordination Team: Erica Brownstein, Kristall Day, and Carolyn Kaplan
ERICA

VE-ST Form: The ATV (All-Terrain Vehicle) of Instruments
ERICA
Image sources: http://media.dma.mil, http://www.mykemptvillenow.com, https://www.expedia.com

History of the Valid and Reliable Instruments Project: Inviting people to join our adventure…

The VE-ST Form Process: VARI-EPP Partners' role in the process
- Inputs that enable the VE-ST Form: OSTP, professional standards (InTASC), and research (Ball, Marzano, etc.)
- The VE-ST Form measures student teacher (ST) performance
- The data are used for:
  - Formative assessment for student teachers
  - Program improvement: availability and program use of comparison mean scores (aggregated from all users)
ERICA
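The talk does not describe how the aggregated comparison means are computed or delivered to programs. Purely as an illustration of the idea, per-rubric-row means pooled across all users could be set against one program's own means; the column names, institutions, and 1-4 scale below are assumptions, not the project's actual pipeline.

```python
import pandas as pd

# Hypothetical pooled ratings from all partner institutions; the real
# VARI-EPP aggregation is not described in the talk, so these column
# names and scores are illustrative only.
ratings = pd.DataFrame({
    "institution": ["A", "B", "C", "A", "B", "C"],
    "rubric_row":  ["Lesson Planning"] * 3 + ["Assessment"] * 3,
    "score":       [3, 2, 4, 3, 3, 4],
})

# Comparison mean per rubric row, aggregated from all users...
all_users = ratings.groupby("rubric_row")["score"].mean()

# ...set against one program's own means, for program-improvement review.
institution_a = (ratings[ratings["institution"] == "A"]
                 .groupby("rubric_row")["score"].mean())
print(pd.DataFrame({"all_users": all_users, "institution_A": institution_a}))
```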

What data are collected?
All IHEs:
- VE-ST Consensus Score
- edTPA scores
- OAE scores
- Demographics
IRR IHEs (the above, plus):
- Three observations
- IRR Supervisor final evaluation
ERICA

At least 10 institutions are collecting data this fall. Each should have received a data collection spreadsheet and instructions.
ERICA
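The layout of that spreadsheet is not shown in the transcript. As a hypothetical sketch only, a partner institution could sanity-check its fall submission with a short script like the one below; the column names and file name are assumptions, not the actual VARI-EPP template.

```python
import pandas as pd

# Hypothetical column names; the actual VARI-EPP spreadsheet template
# distributed to partners may differ.
REQUIRED_COLUMNS = [
    "student_id", "ve_st_consensus_score",
    "edtpa_score", "oae_score", "demographics",
]

def check_submission(path: str) -> None:
    """Flag missing columns and blank cells before submitting fall data."""
    df = pd.read_excel(path)  # requires openpyxl for .xlsx files
    missing = [col for col in REQUIRED_COLUMNS if col not in df.columns]
    if missing:
        print(f"Missing columns: {missing}")
    for col in [c for c in REQUIRED_COLUMNS if c in df.columns]:
        blanks = int(df[col].isna().sum())
        if blanks:
            print(f"{col}: {blanks} blank cell(s)")

check_submission("fall_ve_st_data.xlsx")  # hypothetical file name
```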

Project Timeline, 2015–2016
- Instrument Development: Summer 2014
- Pilot Implementation: Spring 2015
- Data Analysis (Implementation, Feedback, Content Validity): June 2015
- Instrument Revision: July 2015
- Training Module 2.0 Development (for supervisor use of forms): Summer 2015
- Implementation of VE-ST Form Version Two (Implementation Team): August 2015 to May 2016
- Data Analysis: January 2016
CAROLYN

What to expect for the remainder of the year
- Second review by content experts; request for IRR demographics: November 2015
- Release of training version 3.0: early December 2015
- Selection of IRR IHEs: late December 2015
- Fall data due to Ohio State for analysis: January 4, 2016
- Training webinar for IRR IHEs: early January 2016
- Submit form to CAEP for advanced rubric evaluation: winter 2016
- Updated data collection form and webinar: February 2016
- Spring data due to Ohio State for analysis; IRR funds dispensed to IRR supervisors after data submission: June 2016
- Release of training version 4.0: summer 2016
CAROLYN

Subset of IHEs: Inter-Rater Reliability (IRR) Study
- Different from (and in addition to) the statistical analyses described earlier
- Spring term only
- Two supervisors will observe the same student teacher:
  - Supervisor: the university-assigned supervisor, who completes duties as assigned and completes the VARI-EPP form as formative and summative
  - IRR Supervisor: completes a minimum of 3 observations and the VARI-EPP form as summative
- The IRR Supervisor receives $200
- If chosen to be part of the IRR study, the institution will receive:
  - A spreadsheet with designated columns for data collection
  - Detailed instructions and a webinar for how to complete the study

KRISTALL (speaker notes, repeated from the earlier webinar):

The second big component of the project is the inter-rater reliability study. This is different from the analyses discussed on the previous slide, and there are several additional steps for this part of the project. Not all institutions will participate: the funding available to support this part of the study is limited, so we may not be able to include everyone who is interested. We hope to include a variety of institutions and programs so that our data are more representative of the total population.

The general procedure has the following steps. First, you will identify a group who is willing to participate. This means the student teacher, the assigned supervisor, and the inter-rater supervisor have all agreed to be in the study and have signed the appropriate consents. You will also need approval from the mentor teacher and the school: a conversation with the mentor teacher explaining the study should occur, and you should obtain a letter of support from the building principal. Once you have IRB approval, we will provide the consent documents and templates for the letters of support.

Once all of the appropriate permissions have been obtained, the University Supervisor (the supervisor officially assigned to the student teacher) will follow the procedures typical of the institution. For example, if your institution requires the supervisor to observe 6 times, the supervisor will observe 6 times; if your institution requires a narrative feedback form at each visit, the university supervisor will use that form. The only difference is that the university supervisor will complete the VARI-EPP form as a formative (midterm) and summative (final) assessment. It is critical that we have a record of the university supervisor's independent ratings in addition to the consensus scores.

The IRR Supervisor, or inter-rater supervisor, will observe the student teacher a minimum of 3 times. They can complete any forms the program uses as well, but at minimum they are required to complete the VARI-EPP form as a summative evaluation based on their 3 observations. Again, it is critical that we have the supervisors' independent ratings so that they can be compared. The IRR Supervisor is not required to debrief with the student teacher and mentor teacher, but may if the program chooses. The IRR Supervisor should not observe at the same time as the University Supervisor, and the two supervisors should not discuss the student teacher's ratings and progress. The IRR Supervisor will not be part of the three-way conferences for the midterm and final, and will not assign any grades for the student teacher.

The IRR Supervisor receives the $200 incentive for participating in the study; the university-assigned supervisor does not receive an incentive because they are not doing additional work. Once you have IRB approval and have been contacted to participate in the inter-rater study, we will provide data collection tools for this portion of the study as well. Before turning back to Erica to discuss the form, are there any questions about the research?
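The slides do not say which agreement statistic the project will compute from the paired independent ratings. As one plausible sketch, the two supervisors' summative scores could be compared with a weighted Cohen's kappa; the rating scale, values, and choice of statistic below are illustrative assumptions, not the project's stated analysis plan.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical paired summative ratings for one student teacher, one value
# per rubric row, on an assumed 1-4 scale; the VARI-EPP form's actual scale
# and analysis plan are not stated in the talk.
university_supervisor = [3, 2, 4, 3, 3, 2, 4, 1, 3, 2]
irr_supervisor        = [3, 3, 4, 2, 3, 2, 4, 2, 3, 3]

# Quadratic weights penalize large disagreements more than near-misses,
# which suits ordinal rubric scores.
kappa = cohen_kappa_score(university_supervisor, irr_supervisor,
                          weights="quadratic")
print(f"Weighted kappa: {kappa:.2f}")
```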

IRB Update
- Many IHEs have been approved to begin
- Non-FWA IHEs will hear from the OSU IRB soon
- NIH training is required
- If unsure, please contact Kristall Day (day.368@osu.edu)
KRISTALL

ERICA: The form has been linked to CAEP, InTASC, and OSTP. We have moved away from the term "alignment" because we are not trying to say the form has the breadth and depth of the standards; it does, however, have linkages to them.

Tools Available to VARI-EPP Partners
- Two-page project description for interested stakeholders, administrators, etc.
- Supervisor Checklist
- The Form
- The 'Look Fors' Document
- Consensus Form for the conference
ERICA (solicit partners to speak about the tools they have been using)

Thank you for your patience with the rollout of Training 2.0…
- Feedback from supervisors informed revisions
- Version 3.0 will be available in early December (or earlier)
- We may need to enlist the assistance of institutions that use other LMSs; we are investigating how to simplify the use of Training Module 3.0 in other systems
CAROLYN

FAQ
- What do I do if my students have multiple placements?
- What do I use for ongoing observation? Isn't the VE-ST Form an observation tool?
- Are institutions developing, or planning to develop, their own cut scores?
ERICA
Image source: https://openclipart.org/detail/213531/icon-faq

Discussion Questions
In small groups, please discuss:
- Why participate? What are the obstacles?
- What would you like to share? What is working? Not working?
RECONVENE for group reporting: questions and comments on the project, the process, and the future
ERICA

Current Partners: We need your help!
- Development of assessment questions for Training Version 3.0: questions encouraging trainees to examine the form and the 'Look Fors' document
- Due by November 23
- For example:
- For more information, and to volunteer, please contact Carolyn (kaplan.169@osu.edu)
CAROLYN

It's great to be in the same room!! If you have any questions AT ANY TIME, feel free to contact:

Name | Email | Phone | Topic
Erica Brownstein | Brownstein.2@osu.edu | (614) 292-1414 | "Big Picture" project questions, rubric questions
Kristall Day | Day.368@osu.edu | (614) 292-5044 | IRB, data collection, timeline
Carolyn Kaplan | Kaplan.169@osu.edu | (614) 292-2581 | online training module, data collection, timeline
James Yao | Yao.298@osu.edu | | data collection

ERICA

And onward we go! ERICA