Developing a Quality Assurance Plan with the Teacher Work Sample as the Linchpin
Tony Kirchner (tony.kirchner@wku.edu)

Presentation transcript:

Developing a Quality Assurance Plan with the Teacher Work Sample as the Linchpin
Tony Kirchner (tony.kirchner@wku.edu), Tony Norman (tony.norman@wku.edu), WKU CAEP SSR Leads

Mindset shifts…
From continuous assessment to a quality assurance plan
From multiple assessments of varying quality to a few targeted and defensible assessments
For initial preparation, from program-specific to EPP-wide assessments

Continuous Assessment (NCATE)
Capture Everything: large amounts of data; collect it, we might need it
Meet Proficiency: all students can be moved to acceptable levels; must meet proficiency to move on
Candidates Look the Same: no variability in the data

Quality Assurance (CAEP)
Fewer “Key” Assessments: limited number of data points; must have validity/reliability across all programs (IP)
Aspirational: students should be scored “where they are”; may be below proficiency
Predictability: increases variability in the data

So what fits under the Quality Assurance System Plan (QASP)?

Standard 5: Provider Quality, Continuous Improvement, and Capacity

Quality and Strategic Evaluation
5.1 The provider’s quality assurance system is comprised of multiple measures that can monitor candidate progress, completer achievements, and provider operational effectiveness. Evidence demonstrates that the provider satisfies all CAEP standards.
5.2 The provider’s quality assurance system relies on relevant, verifiable, representative, cumulative and actionable measures, and produces empirical evidence that interpretations of data are valid and consistent.

Continuous Improvement
5.3 (REQUIRED COMPONENT) The provider regularly and systematically assesses performance against its goals and relevant standards, tracks results over time, tests innovations and the effects of selection criteria on subsequent progress and completion, and uses results to improve program elements and processes.
5.4 (REQUIRED COMPONENT) Measures of completer impact, including available outcome data on P-12 student growth, are summarized, externally benchmarked, analyzed, shared widely, and acted upon in decision-making related to programs, resource allocation, and future direction.
5.5 The provider assures that appropriate stakeholders, including alumni, employers, practitioners, school and community partners, and others defined by the provider, are involved in program evaluation, improvement, and identification of models of excellence.

www.wku.edu/cebs/caep/documents/wku_quality_assurance_system.pdf

Avoiding getting lost in the assessment validity and reliability “weeds”

Develop a systematic (and system-wide) approach to V/R
Avoid assessment-level processes to determine V/R
Document the process as part of the QAS
Look for assessments with previous V/R
Focus on “high stakes” assessments first
Determine the process “status” of each assessment
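The steps above amount to maintaining an inventory of key assessments and working through them in priority order. A minimal sketch of such an inventory is below; the assessment names, fields, and status values are hypothetical illustrations, not taken from the WKU system.

```python
from dataclasses import dataclass

@dataclass
class Assessment:
    name: str
    high_stakes: bool        # focus "high stakes" assessments first
    prior_vr_evidence: bool  # look for assessments with previous V/R work
    status: str              # process "status", e.g. "not started", "piloting", "documented"

# Hypothetical EPP assessment inventory:
inventory = [
    Assessment("Teacher Work Sample", high_stakes=True, prior_vr_evidence=True, status="documented"),
    Assessment("Clinical Practice Evaluation", high_stakes=True, prior_vr_evidence=False, status="piloting"),
    Assessment("Dispositions Survey", high_stakes=False, prior_vr_evidence=False, status="not started"),
]

# Prioritize high-stakes assessments that still lack documented V/R evidence:
todo = [a.name for a in inventory if a.high_stakes and a.status != "documented"]
print(todo)
```

Documenting the inventory itself, rather than running ad hoc assessment-level studies, is what makes the process part of the QAS.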

Building on your best assessments

TWS DEVELOPMENT
Originally developed by the Renaissance Partnership, as part of a six-year Title II Improving Teacher Quality Program grant, to assess the growth of teacher candidates as well as their ability to affect student learning.
During TWS development, university education and education-related content faculty and P-12 partners (teachers and administrators) from 11 higher education institutions worked together to create the teaching standards, prompts, and rubrics.
Semi-annual meetings occurred over six years to develop, pilot, score, and continually refine the TWS.

TWS VALIDITY
As described in Denner, Norman, Salzman, Pankratz, and Evans (2004), TWS validity was established using a panel-of-expert-raters process (Crocker, 1997) for judging content representativeness on four criteria: (1) frequency of TWS teaching behaviors in actual teaching, (2) criticality (importance) of TWS tasks to actual teaching, (3) authenticity (realism) of TWS tasks to actual teaching, and (4) representativeness of TWS tasks to target standards.
Other studies have measured the concurrent and predictive relationships between TWS performance and other measures of quality teaching.
Furthermore, Denner, Norman, and Linn (2008) delineate research at two institutions using the TWS (WKU and Idaho State University) that provides evidence that the TWS is adequately free from bias (consequential validity and disparate impact analysis).
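The slides cite the expert-panel process but give no formulas. One common way to summarize such panel judgments (not necessarily the one used in the cited studies) is the content validity index: the proportion of panelists who rate an item as relevant, for example 3 or 4 on a 4-point scale. The ratings and criterion names below are hypothetical.

```python
def item_cvi(ratings, relevant_threshold=3):
    """Item-level CVI: proportion of raters scoring the item at or above the threshold."""
    relevant = sum(1 for r in ratings if r >= relevant_threshold)
    return relevant / len(ratings)

def scale_cvi(items):
    """Scale-level CVI (averaging method): mean of the item-level CVIs."""
    cvis = [item_cvi(r) for r in items.values()]
    return sum(cvis) / len(cvis)

# Hypothetical ratings from a five-member panel on two review criteria:
panel = {
    "criticality": [4, 3, 4, 4, 4],  # all 5 raters judge the task relevant
    "authenticity": [4, 4, 3, 4, 2],  # 4 of 5 raters judge the task relevant
}
print(item_cvi(panel["authenticity"]))  # 0.8
print(scale_cvi(panel))                 # 0.9
```

An EPP could compute this per prompt and rubric row to flag items the panel did not endorse.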

TWS RELIABILITY
Since as early as 2005, special attention has been given to assuring and reporting that the TWS is scored fairly, accurately, and consistently (see Denner et al., 2008; Denner et al., 2004; Kirchner, Evans, & Norman, 2010; Norman, Evans, & Pankratz, 2011; Stobaugh, Tassell, & Norman, 2010).
In fall 2009, a unit-wide TWS taskforce was formed to revisit all aspects of the Teacher Work Sample and address faculty concerns in order to improve the instrument.
The current version of the TWS reflects changes last made to the instrument in 2011.
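Scoring consistency of this kind is typically checked by double-scoring a sample of work and computing inter-rater agreement. A minimal sketch using Cohen's kappa (a standard chance-corrected agreement statistic; the cited studies may have used other indices) on hypothetical rubric scores:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters' categorical scores."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    # Expected chance agreement from each rater's marginal score frequencies:
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical scores on eight TWS rubric rows
# (1 = Beginning, 2 = Developing, 3 = Proficient):
a = [3, 2, 3, 1, 2, 3, 2, 2]
b = [3, 2, 3, 2, 2, 3, 1, 2]
print(round(cohens_kappa(a, b), 2))  # 0.58
```

Reporting kappa (or a similar index) alongside percent agreement is what lets an EPP claim scores are "fair, accurate, and consistent" rather than merely collected.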

Our rationale for building on our best
Portions of the culminating assessment are now “pre-assessments” in earlier courses
Provides exposure to required exit skills earlier in each program
Leverages previous V/R work and evidence
Combines future V/R efforts
Leverages inherent predictive relationships between earlier “pre-assessments” and the culminating assessment
Buys time!

So YOU “get it”… now getting all other EPP members on board

Shake the foundation
Outside consultant
Avoid top-down directives (as much as possible)
Find allies among faculty peers to share the message and lead the work
Leverage state requirements (e.g., the Kentucky Program Review Process)
Use CAEP rubrics and tools
Encourage others to seek training as CAEP evaluators

QUESTIONS?