The CAEP Accreditation Review Process:

The CAEP Accreditation Review Process: A Workshop for EPPs Working Toward CAEP Accreditation, Fall 2017
Gary Railsback, PhD, MBA, CAEP Volunteer Lead Site Visitor, Azusa Pacific University, grailsback@apu.edu
Vince O'Neill, EdD, Accreditation Director, Counselors and Accreditation Decisions, vince.oneill@caepnet.org

CAEPCon Fall 2017 Post-Conference Workshops
Tuesday, September 26, 2017, 8:30 am – 12:00 pm
CAEP Accreditation Review Process, Standards 1-4
8:30 am – 10:15 am: Writing the SSR; Writing to Standard 1; Writing to Standard 2
10:15 am – 10:30 am: Break
After the break: Writing to Standard 3; Writing to Standard 4

TURNING SANDBOX INTO A PDF

Standard 1/A.1 Content and Pedagogical Knowledge

CAEP Standard 1/A.1 Content Knowledge and Pedagogical Knowledge
Candidate Knowledge, Skills, and Professional Dispositions: 1.1 / A.1.1
Provider Responsibilities: 1.2-1.5 / A.1.2

CAEP Standard 1 Initial Programs Content Knowledge and Pedagogical Knowledge The provider ensures that candidates develop a deep understanding of the critical concepts and principles of their discipline [components 1.1, 1.3] and, by completion, can use discipline-specific practices flexibly to advance the learning of all students toward attainment of college- and career-readiness standards [component 1.4].

CAEP Standard 1/A.1 Advanced Programs Content Knowledge and Pedagogical Knowledge The provider ensures that candidates for professional specialties develop a deep understanding of the critical concepts and principles of their field of preparation [component A.1.1] and, by completion, can use professional specialty practices flexibly to advance the learning of all P- 12 students toward attainment of college- and career-readiness standards [component A.1.2].

Step 1. Review Rules for Standard 1/A.1
General for all Standards:
-All components addressed
-EPP-created assessments at the CAEP level of sufficiency
-At least 3 cycles of data
-Cycles of data are sequential and are the latest available
-Disaggregated data on candidates for main/branch campuses

Step 1. Review Rules for Standard 1/A.1
Special for Standard 1/A.1:
-No required components
-All data disaggregated by specialty licensure area
-Evidence from Standard 1/A.1 cited in support of continuous improvement, as part of the overall review system (from The Accreditation Handbook, p. 15)

CAEP Standard 1/A.1 Content Knowledge and Pedagogical Knowledge
Notice the character limit (at 2,450 characters per page, this is 12 pages)

Uploading your evidence to AIMS
NOTE: THERE IS AN INSTRUCTIONAL VIDEO
TO UPLOAD EVIDENCE, CLICK "ADD" AT THE TOP LEFT
CHOOSE EITHER FILE OR FOLDER
NAME YOUR FILE

Disaggregating Data by Regional Campuses

Component 1.1 – Key Language Candidates demonstrate an understanding of the 10 InTASC standards at the appropriate progression level(s) in the following categories: the learner and learning; content; instructional practice; and professional responsibility. Reflect on: What evidence do I have that would demonstrate developing an understanding over time in these four categories?

Standard 1, Component 1.1 – Suggested Evidence
-Clinical Experience/Observation Instruments
-Lesson/Unit Plan (rubrics & data results)
-Portfolios (rubrics & data results)
-Teacher Work Samples (rubrics & data results)
-GPA, courses specific to the P-12 learner (see CAEP "Grade Policy")
-Dispositional assessments (rubrics & data results)

Standard 1, Component 1.1 – Suggested Evidence (continued)
-As an example: the Major Field Test (ETS) in Math, comparing Math Education and non-education majors at completion of the program
-End-of-course/program assessments
-Pre-service measures of candidate impact
-Capstone/Thesis
+Proprietary assessments/measures
+State assessments/measures

EPP-Created Assessments, Initial Standards
Resource: CAEP Evaluation Framework for EPP-Created Assessments

Exercise 1: Supporting Evidence for Standard 1 Open: CAEP Standards 1-pager (Initial) (Sent by email) CAEP Evaluation Framework for EPP-Created Assessments (Sent by email) Handout: Exercise 1, Supporting Evidence for Standard 1 (sent by email)

Exercise 1: Supporting Evidence for Standard 1

Exercise 1: Supporting Evidence for Standard 1
Instructions:
-Consider the full range of evidence that could potentially be used for the four InTASC categories.
-In the second column, insert the name of the EPP-created assessment instrument.
-Use the handout to evaluate a piece of evidence more deeply.
-Which sources could provide the strongest suite of cumulative evidence? Where are there gaps?

Exercise Resource- When you get back...

Evidence Sufficiency Criteria, 1.1
CANDIDATES DEMONSTRATE UNDERSTANDING OF 10 InTASC STANDARDS
-All four of the InTASC categories are addressed, with multiple indicators across the four categories
-Indicators/measures specific to application of content knowledge in clinical settings are identified

Evidence Sufficiency Criteria, 1.1
CANDIDATES DEMONSTRATE UNDERSTANDING OF 10 InTASC STANDARDS
-Data/evidence are analyzed, including identification of trends/patterns, comparisons, and/or differences
-Averages are at/above acceptable levels on the EPP's scoring indicators for the InTASC standards (categories)
-If applicable, demonstration that candidates' performance is comparable to non-candidates' performance in the same courses or majors
-Performances indicate competency and are benchmarked against the average licensure-area performance of other providers
-Interpretations and conclusions are supported by data/evidence

MOVING TO ADVANCED STANDARDS

Component A.1.1 – Key Language Candidates for advanced preparation demonstrate their proficiencies to understand and apply knowledge and skills appropriate to their professional field of specialization so that learning and development opportunities for all P-12 students are enhanced… Reflect on: What evidence do I have that would demonstrate proficiencies in the content and skills referenced in Component A.1.1 for a specialization?

Standard A.1, Component A.1.1 – Suggested Evidence
-Action Research
-Capstones/Portfolios/Thesis
-Dispositional/Professional Responsibility Data
-Problem-based projects within coursework/group projects
-Problem-based projects with a school/district

Standard A.1, Component A.1.1 – Suggested Evidence (continued)
-Pre- and post-data, and reflections on interpretations and use of data
-End-of-key-course tests (does each meet the CAEP sufficiency level?)
-Grades, by program field (see CAEP policy on using grades)
-Survey data from completers/employers
+State assessments/surveys
+Other proficiency measures

EPP-Created Assessments, Standard A.1, Component A.1.1, Advanced Standards
Resource: CAEP Evaluation Framework for EPP-Created Assessments

Exercise A1: Supporting Evidence for Standard A.1 Open: CAEP Standards 1-pager (Advanced) Evidence Sufficiency Criteria for Advanced Programs Handout: Exercise A1, Supporting Evidence for Component A.1.1

Exercise A1: Supporting Evidence for Standard A.1
Instructions:
-Examine the list you generated when asked what evidence you have that would demonstrate proficiencies in the content and skills referenced in Component A.1.1 for a specialization.
-Choose one EPP-created assessment; practice describing evidence quality: relevance, representativeness, actionability, verifiability.

Exercise A1: Supporting Evidence for Standard A.1

EVIDENCE SUFFICIENCY CRITERIA, A.1.1 CANDIDATES DEMONSTRATE UNDERSTANDING OF PROFESSIONAL SKILLS Demonstrates that most candidates pass state/nationally- benchmarked content/licensure exams

EVIDENCE SUFFICIENCY CRITERIA, A.1.1
CANDIDATES DEMONSTRATE UNDERSTANDING OF PROFESSIONAL SKILLS
-Addresses all of the professional skills listed in the component
-Documents proficiency for at least three of the skills for each specialty field
-Utilizes multiple measures to assess each proficiency
-Utilizes measures that meet criteria in the CAEP Evaluation Framework for EPP-Created Assessments

EVIDENCE SUFFICIENCY CRITERIA, A.1.1
CANDIDATES DEMONSTRATE UNDERSTANDING OF PROFESSIONAL SKILLS
Phase-in Plans for Component A.1.1 meet the criteria in the CAEP Guidelines for Plans and are consistent with the Phase-in Schedule.

Component 1.2 – Key Language Providers ensure that candidates use research and evidence to develop an understanding of the teaching profession and use both to measure their P-12 students’ progress and their own professional practice. Reflect on: What evidence do I have that would demonstrate using research and assessment (evidence) for student and professional learning?

Evidence Sufficiency Criteria, 1.2
CANDIDATES USE RESEARCH/EVIDENCE TOWARD THE TEACHING PROFESSION
-Data/evidence document effective candidate use of research/evidence for planning, implementing, and evaluating student progress
-Data to reflect on teaching effectiveness and their own practice
-Data to assess P-12 student progress and then modify instruction based on student data

Component A.1.2 – Key Language Providers ensure that advanced program completers have opportunities to learn and apply specialized content and discipline knowledge contained in approved state and/or national discipline-specific standards.

Component A.1.2 – Key Language These specialized standards include, but are not limited to, Specialized Professional Association (SPA) standards, individual state standards, standards of the National Board for Professional Teaching Standards, and standards of other accrediting bodies [e.g., Council for Accreditation of Counseling and Related Educational Programs (CACREP)] Reflect on: What evidence do I have that would demonstrate candidates application of advanced professional knowledge, at professional standard levels?

EVIDENCE SUFFICIENCY CRITERIA, A.1.2
CANDIDATES APPLY ADVANCED PREPARATION KNOWLEDGE
-Documents that the majority of programs meet the standards of the selected program review option(s)
-A majority of programs submitted for SPA review achieved National Recognition
-State review reports document how well individual programs perform in relation to the state's selected standards and that the majority meet the standards
-Program Review with Feedback results show that the state-selected state or national standards are met for the majority of programs

EVIDENCE SUFFICIENCY CRITERIA, A.1.2
CANDIDATES APPLY ADVANCED PREPARATION KNOWLEDGE
-Includes a discussion of performance trends and compares across specialty areas
-Component A.1.2 is not eligible for Phase-in Plan submission

Component 1.3 – Key Language Providers ensure that candidates apply content and pedagogical knowledge as reflected in outcome assessments in response to standards of Specialized Professional Associations (SPA), the National Board for Professional Teaching Standards (NBPTS), states, or other accrediting bodies (e.g., National Association of Schools of Music – NASM). Reflect on: What evidence do I have that would demonstrate the application of CK and PK in response to other professional standards?

WHAT IS A SPA?
Specialized Professional Associations (SPAs); example: NCTM
Some states require SPA review; in others it is optional.

Program Review Weblink: http://caepnet.org/accreditation/caep-accreditation/program-review-options

Program Review Options Available program review options for EPPs in states with agreements: SPA review with National Recognition (3 years prior to site visit) CAEP program review with feedback (part of self-study report) State review of programs (determined by state)

Program Review Options Available program review options for EPPs in states without agreements: SPA review with National Recognition (3 years prior to site visit) CAEP program review with feedback (part of self-study report) QUESTIONS ON CAEP Program Review - contact Banhi Bhattacharya: Banhi.Bhattacharya@caepnet.org

Component 1.4 – Key Language Providers ensure that candidates demonstrate skills and commitment that afford all P-12 students access to rigorous college- and career-ready standards (e.g., Next Generation Science Standards, National Career Readiness Certificate, Common Core State Standards). Reflect on: What evidence do I have that would demonstrate skills and commitment to access for all students?

Component 1.4 Suggested Evidence
Evidence specific to college- and career-readiness. Plans, assignments, and observational data demonstrate candidates' skills for:
-Deep content knowledge
-Eliciting P-12 students' application of their knowledge to solve problems and think critically
-Cross-discipline teaching
-Differentiated instruction
-Ability to identify and interpret assessments to match P-12 college- and career-readiness goals/objectives

Evidence Sufficiency Criteria, 1.4
CANDIDATES DEMONSTRATE SKILLS FOR COLLEGE- AND CAREER-READY STANDARDS
Multiple indicators/measures specific to evaluating candidates' proficiencies to:
-Provide effective instruction for all students (differentiation of instruction)
-Have students apply knowledge to solve problems and think critically
-Include cross-discipline learning experiences and teach for transfer of skills
-Design and implement learning experiences that require collaboration and communication skills

Component 1.5 – Key Language Providers ensure that candidates model and apply technology standards as they design, implement and assess learning experiences to engage students and improve learning; and enrich professional practice. Reflect on: What evidence do I have that would demonstrate modeling and application of technology skills to enhance learning for students and self?

Component 1.5, Technology…
-Design: analysis of learning and teaching
-Facilitate: planning for integration of instructional technology
-Evaluate: post-instruction evaluation and review

Evidence Sufficiency Criteria, 1.5
CANDIDATES MODEL AND APPLY TECHNOLOGY
Candidates demonstrate:
-Knowledge and skill proficiencies, including accessing databases, digital media, and/or electronic sources
-The ability to design and facilitate digital learning
-The ability to track and share student performance data digitally

There is a place at the end of the SSR for the cross-cutting themes of diversity and technology; you can add your technology data there.

In Summary - The Case for Standard 1/A.1 Information is provided from several sources and provides evidence of candidate knowledge, skills, and dispositions. Grades, scores, pass rates, and other data are analyzed. Differences and similarities across licensure/field areas, comparisons over time, and demographic data are examined.

In Summary - The Case for Standard 1/A.1 Information is provided from several sources and provides evidence of candidate knowledge, skills, and dispositions. Appropriate interpretations and conclusions are reached. Trends or patterns are identified that suggest need for preparation modification. Based on the analysis of data, there are planned or completed actions for change that are described.

Standard 2/A.2 Clinical Partnerships and Practice

CAEP Standard 2/A.2 Clinical Partnerships and Practice
Partnerships for Clinical Preparation: 2.1 / A.2.1
Clinical Educators: 2.2
Clinical Experiences: 2.3 / A.2.2

CAEP Standard 2 Clinical Partnerships and Practice The provider ensures that effective partnerships [components 2.1 and 2.2] and high-quality clinical practice [component 2.3] are central to preparation so that candidates develop the knowledge, skills, and professional dispositions necessary to demonstrate positive impact on all P-12 students’ learning and development.

SCREENSHOT OF SANDBOX ST 2 - INITIAL

CAEP Standard A.2 Clinical Partnerships and Practice The provider ensures that effective partnerships [component A.2.1] and high-quality clinical practice [component A.2.2] are central to preparation so that candidates develop the knowledge, skills, and professional dispositions appropriate for their professional specialty field.

SCREENSHOT OF SANDBOX ST 2 - ADVANCED

Step 1. Review Rules for Standard 2/A.2
General for all Standards:
-All components addressed
-EPP-created assessments at the CAEP level of sufficiency
-At least 3 cycles of data
-Cycles of data are sequential and are the latest available
-Disaggregated data on candidates for main/branch campuses
Special for Standard 2/A.2:
-No required components (CAEP Accreditation Handbook 2016, p. 25)

Component 2.1 – Key Language Partners co-construct mutually beneficial P-12 school and community arrangements, including technology-based collaborations, for clinical preparation and share responsibility for continuous improvement of candidate preparation.

Component 2.1 – Key Language Partnerships for clinical preparation can follow a range of forms, participants, and functions. They establish mutually agreeable expectations for candidate entry, preparation, and exit; ensure that theory and practice are linked; maintain coherence across clinical and academic components of preparation; and share accountability for candidate outcomes. Reflect on: What evidence do I have that would demonstrate mutually beneficial and accountable partnerships in which decision-making is shared?

If you are an EPP attempting to provide evidence of co-construction of clinical practice experiences, which one of the following would be the best example of co-construction?
https://www.polleverywhere.com/multiple_choice_polls/5gmeolAKPlH6ogk

Component 2.1 – Co-construction
Question: If you are an EPP attempting to provide evidence of co-construction of clinical practice experiences, which one of the following would be the best example of co-construction?
-EPP sending an MOU to a school principal outlining what needs to take place during clinical practice
-EPP with a small enrollment documenting in the Self-Study Report that conversations happen with principals and administrators in the local area

Component 2.1 – Co-construction (continued)
-Video file (.mov or .mp3) of a conference call in which the EPP fieldwork coordinator requests places for 12 student teachers in the upcoming semester
-Minutes of an EPP Advisory Council documenting anticipated placements for the upcoming semester, reviewing evaluations from the previous semester, and discussing clinical practice evaluation forms
-School district sending a letter approving 12 placements for student teachers in the upcoming semester

Co-Construction of Clinical Experiences
Partners co-construct the opportunities, challenges, and responsibilities, along with the support and guidance of clinical educators and designated faculty. Co-constructed opportunities allow candidates to apply the knowledge, dispositions, and skills developed in general education and professional courses.

Co-Construction of Clinical Experiences
Candidates should continue learning to adapt to the various conditions of classrooms in co-constructed opportunities.
Application, Introduction, Participation, Culmination, Roles/Responsibilities, Evaluation…

Evidence Sufficiency Criteria, 2.1
EVIDENCE THAT A COLLABORATIVE PROCESS IS IN PLACE AND REVIEWED
Documentation provided for a shared-responsibility model that includes elements of:
-Co-construction of instruments and evaluations
-Co-construction of criteria for selection of mentor teachers
-Involvement in ongoing decision-making
-Input into curriculum development
-EPP and P-12 educators provide descriptive feedback to candidates
-Opportunities for candidates to observe and implement effective teaching strategies linked to coursework

Component 2.2 – Key Language Partners co-select, prepare, evaluate, support, and retain high-quality clinical educators, both provider- and school-based, who demonstrate a positive impact on candidates’ development and P-12 student learning and development.

Component 2.2 – Key Language In collaboration with their partners, providers use multiple indicators and appropriate technology-based applications to establish, maintain, and refine criteria for selection, professional development, performance evaluation, continuous improvement, and retention of clinical educators in all clinical placement settings. Reflect on: What evidence do I have that would demonstrate the depth of partnership around highly effective clinical educators?

Clinical Educator Development/Responsibilities
A process of collaboration with partners that further demonstrates partnership in field experiences:
-Developed: criteria, reflective teaching and learning, mutual engagement, …
-Monitored: facilitate learning and development
-Evaluated: opportunities for partners to…

Evidence Sufficiency Criteria, 2.2
EVIDENCE THAT EPP AND P-12 CLINICAL EDUCATORS/ADMINISTRATORS CO-CONSTRUCT CRITERIA FOR CO-SELECTION
-Clinical educators receive professional development, resources, and support
-Clinical educators are involved in the creation of professional development opportunities, the use of evaluation instruments, professional disposition evaluation of candidates, specific goals/objectives of the clinical experience, and providing feedback
-Data collected are used by EPPs and P-12 clinical educators for modification of selection criteria, future assignments of candidates, and changes in clinical experiences

Component 2.3 – Key Language The provider works with partners to design clinical experiences of sufficient depth, breadth, diversity, coherence, and duration to ensure that candidates demonstrate their developing effectiveness and positive impact on all students’ learning and development.

Component 2.3 – Key Language Clinical experiences, including technology-enhanced learning opportunities, are structured to have multiple performance-based assessments at key points within the program to demonstrate candidates’ development of the knowledge, skills, and professional dispositions, as delineated in Standard 1, that are associated with a positive impact on the learning and development of all P-12 students. Reflect on: What evidence do I have that clinical experiences develop candidates’ Knowledge, Skills, and Dispositions to have a positive impact on P-12 learning?

Clinical Experience Table Course Sample

Clinical Experience Table Program Sample

Evidence Sufficiency Criteria, 2.3
EVIDENCE THAT ALL CANDIDATES HAVE CLINICAL EXPERIENCES IN DIVERSE SETTINGS
-Attributes (depth, breadth, diversity, coherence, and duration) are linked to student outcomes and candidate/completer performance documented in Standards 1 and 4
-Evidence documents a sequence of clinical experiences that are focused, purposeful, and varied, with specific goals
-Clinical experiences include focused teaching experiences where specific strategies are practiced
-Clinical experiences are assessed using performance-based assessments

CAEP Standard 2.3/A.2.1 & A.2.2 Clinical Partnerships and Practice The provider ensures that effective partnerships [components 2.1 and 2.2] and high-quality clinical practice [component 2.3] are central to preparation so that candidates develop the knowledge, skills, and professional dispositions necessary to demonstrate positive impact on all P-12 students’ learning and development.

CAEP Standard 2.3/A.2.1 & A.2.2 Clinical Partnerships and Practice The provider ensures that effective partnerships [component A.2.1] and high-quality clinical practice [component A.2.2] are central to preparation so that candidates develop the knowledge, skills, and professional dispositions appropriate for their professional specialty field.

Component A.2.1: Key language Partners co-construct mutually beneficial P-12 school and community arrangements, including technology-based collaborations, for clinical preparation and shared responsibility for continuous improvement of candidate preparation.

Component A.2.1: Key language Partnerships for clinical preparation can follow a range of forms, participants, and functions. They establish mutually agreeable expectations for candidate entry, preparation, and exit; ensure that theory and practice are linked; maintain coherence across clinical and academic components of preparation; and share accountability for candidate outcomes.

SUGGESTED EVIDENCE: PARTNERSHIPS FOR CLINICAL PREPARATION Documents illustrating co-construction of a collaborative relationship Documents outlining provider and partner responsibilities for examining and improving clinical preparation

SUGGESTED EVIDENCE: PARTNERSHIPS FOR CLINICAL PREPARATION Evidence that assessments and performance standards are mutually acceptable to providers and partners Documentation of shared perspective on appropriate uses of technology for the candidate’s future role

Evidence Sufficiency Criteria, A.2.1
PARTNERSHIPS FOR CLINICAL PREPARATION
-Illustrates specific benefits to the provider and P-12 partners
-Outlines the collaborative nature of the relationship
-Documents that effectiveness of the partnership is reviewed at least annually

Evidence Sufficiency Criteria, A.2.1
PARTNERSHIPS FOR CLINICAL PREPARATION
-Shows that the EPP seeks input from partners to refine criteria for entry/exit to clinical experiences
-Documents partner participation in development and review activities (e.g., for clinical instruments, clinical curriculum, EPP curriculum)
-Phase-in Plans meet CAEP guidelines and schedule
-Instruments for evaluating the partnership (if any) meet CAEP's assessment sufficiency criteria

Component A.2.2: Key language The provider works with partners to design varied and developmental clinical settings that allow opportunities for candidates to practice applications of content knowledge and skills emphasized by the courses and other experiences of the advanced preparation program.

Component A.2.2: Key language The opportunities lead to appropriate culminating experiences in which candidates demonstrate their proficiencies, through problem-based tasks or research (e.g., qualitative, quantitative, mixed methods, action) that are characteristic of their professional specialization as detailed in component A.1.1.

SUGGESTED EVIDENCE: CLINICAL EXPERIENCES
Charts illustrating the breadth, depth, duration, and coherence of the opportunities to practice applying content knowledge and skills to practical challenges in the specialty area

SUGGESTED EVIDENCE: CLINICAL EXPERIENCES
Evidence mapping the developmental trajectory of specific practical knowledge and skills as candidates progress through courses and the clinical experiences embedded within or external to those courses
Candidate evaluations of the connection between coursework and fieldwork

Clinical Experience Table (sample)
EDU 2100 (supervised practicum, elementary settings): provides candidates with practical experiences in workplace settings and scenarios to evaluate the connections between coursework and fieldwork. Programs: M.Ed., Ed.D. Hours: 45 (observation and/or implementation). Measures: dispositional/professional responsibility data; problem-based projects, coursework. Note: internship must be approved by PDS/D during the semester of application prior to…
EDU 2900 (clinical internship, elementary education): designed for candidates to appropriately and effectively apply research-based instructional learning theory/strategies for their fields of specialization in P-12 settings. Hours: 60 (observation and implementation). Measures: problem-based projects, school/district; action research; capstones/portfolios/thesis.

Clinical Experience Table (sample)
M.Ed., Secondary Mathematics Education: EDUM 552, EDUM 553, EDUM 554, EDUM 555, EDUM 556 (Practicum), 200 hours of observation; EDU-M 699, 500 hours of participation and implementation of coursework and fieldwork. Total: 700 hours.
M.Ed., English as a Second Language (TESL): TESL 500 (Practicum), 250 hours of observation and participation; EDU-TESL 699, 500 hours of participation and implementation of research-based instructional learning strategies. Total: 750 hours.
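Hour totals like those in the table above can be tallied programmatically when an EPP maintains its clinical-experience data electronically. A minimal sketch (program labels and the observation/implementation split are taken from the sample table; the data structure itself is hypothetical):

```python
# Tally total clinical hours per program, mirroring the sample table above.
programs = {
    "M.Ed., Secondary Mathematics Education": {"observation": 200, "implementation": 500},
    "M.Ed., TESL": {"observation": 250, "implementation": 500},
}

for name, hours in programs.items():
    total = sum(hours.values())  # observation + implementation hours
    print(f"{name}: {total} total clinical hours")
```

The computed totals (700 and 750 hours) match the table's rightmost column, which is a useful consistency check when the table is assembled by hand.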

Evidence Sufficiency Criteria, A.2.2: CLINICAL EXPERIENCES
Documents that all candidates have practical experiences in workplace settings
Illustrates that candidates observe and implement appropriate and effective strategies for their fields of specialization
Documents the attributes of clinical/practical experiences
Illustrates that they are varied and developmentally progressive
Illustrates that they relate to coursework

Evidence Sufficiency Criteria, A.2.2: CLINICAL EXPERIENCES
Demonstrates a relationship between clinical/practical experiences and candidate outcomes reported in Standard A.1
Phase-in plans meet CAEP guidelines and schedule

In Summary - The Case for Standard 2/A.2 Information from several sources provides evidence of shared decision-making, collaboration among clinical faculty, schools/districts, and continuous functioning. Data are analyzed: differences and similarities across licensure/field areas, comparisons over time, and demographic data are examined in relation to clinical experiences, as appropriate.

In Summary - The Case for Standard 2/A.2 Appropriate interpretations and conclusions are reached. Trends or patterns that suggest a need for preparation modification are identified. Based on the analysis of data, planned or completed actions for change are described.

training.questions@caepnet.org

Standard 3/A.3 Candidate Quality, Recruitment, and Selectivity

CAEP Standard 3/A.3 Candidate Quality, Recruitment, and Selectivity
Recruitment/Admission of Diverse Candidates Who Meet Employment Needs: 3.1 / A.3.1
Admission Standards Indicate That Candidates Have High Academic Achievement and Ability: 3.2 / A.3.2
Additional Selectivity Factors (non-academic): 3.3
Selectivity During Preparation (performance standards): 3.4 / A.3.3
Selection at Completion (ready, not just finished): 3.5-3.6 / A.3.4

CAEP Standard 3/A.3 Candidate Quality, Recruitment, and Selectivity The provider demonstrates that the quality of candidates is a continuing and purposeful part of its responsibility from recruitment [component 3.1], at admission [component 3.2], through the progression of courses and clinical experiences [components 3.3 and 3.4], and to decisions that completers are prepared to teach effectively and are recommended for certification [components 3.5 and 3.6].

CAEP Standard 3/A.3 Candidate Quality, Recruitment, and Selectivity The provider demonstrates that development of candidate quality is the goal of educator preparation in all phases of the program. This process is ultimately determined by a program’s meeting of Standard 4. The provider demonstrates that the quality of advanced program candidates [components A.3.1 and A.3.2] is a continuing and purposeful part of its responsibility [component A.3.3] so that completers are prepared to perform effectively and can be recommended for certification where applicable [component A.3.4].

Rules for Standard 3/A.3 (general for all standards)
All components addressed
EPP-created assessments at CAEP level of sufficiency
At least 3 cycles of data
Cycles of data are sequential and the latest available
Disaggregated data on candidates for main/branch campuses

Rules for Standard 3/A.3 (special for Standard 3/A.3) Component 3.2/A.3.2 is required in order to meet Standard 3 (CAEP Accreditation Handbook 2016, p. 36)

Component 3.1 – Key Language The provider presents plans and goals to recruit and support completion of high-quality candidates from a broad range of backgrounds and diverse populations to accomplish their mission. The admitted pool of candidates reflects the diversity of America’s P-12 students.

Component 3.1 – Key Language The provider demonstrates efforts to know and address community, state, national, regional, or local needs for hard-to-staff schools and shortage fields, currently, STEM, English-language learning, and students with disabilities. Reflect on: What recruitment evidence (plans and goals) do I have that demonstrates attracting diverse candidates to meet identified needs?

An example of a Recruitment Plan’s common elements…
Introduction and Planning
Organization, college, department, etc.
Background of college/department
College/department self-assessment
Recruitment of Candidates
Develop EPP’s “message”
Develop “how to” recruit
Develop, schedule, and conduct orientations

An example of a Recruitment Plan’s common elements…
Retention of Candidates
Assign support/supervisor
Provide learning opportunities in foundations, methods, and clinical experiences
Evaluate content and pedagogical development
Provide academic/non-academic resources
Transition of Candidates to Completers
Communicate with completers regularly via surveys, polls, questionnaires, census
Recognize professional support, supervisor(s), and resources

An example of a Recruitment Plan’s common elements…
Managing and Evaluating
Design the evaluation
Collect, organize, and analyze data
Report results, conclusions reached, and recommendations
Resources

Evidence Sufficiency Criteria, 3.1: PLAN/GOALS TO RECRUIT/SUPPORT HIGH-QUALITY CANDIDATES
Recruitment plan with base points and goals, including academic ability, diversity, and employment needs
Data on applicants, admitted, and enrolled candidates are disaggregated by relevant demographics
Evidence that results are recorded, monitored, and used in planning and modification of recruitment strategies
Plan demonstrates knowledge of and addresses employment opportunities in schools, districts, and/or regions
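Disaggregating applicant/admitted/enrolled counts by demographics, as the criteria above call for, is easy to automate once candidate records carry a stage and a demographic field. A minimal sketch with hypothetical records and group labels (nothing here is CAEP-prescribed):

```python
# Sketch: count candidates at each pipeline stage by demographic group,
# producing the kind of disaggregated view component 3.1 asks for.
# All records and group labels below are hypothetical.
from collections import Counter

records = [
    {"stage": "applied",  "group": "Group A"},
    {"stage": "applied",  "group": "Group B"},
    {"stage": "applied",  "group": "Group B"},
    {"stage": "admitted", "group": "Group A"},
    {"stage": "admitted", "group": "Group B"},
    {"stage": "enrolled", "group": "Group B"},
]

counts = Counter((r["stage"], r["group"]) for r in records)
for (stage, group), n in sorted(counts.items()):
    print(f"{stage:9s} {group}: {n}")
```

Comparing successive years of these counts against the recruitment plan's base points is one way to document that results are "recorded, monitored, and used."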

Component 3.2 – Key Language The provider meets CAEP minimum criteria or the state’s minimum criteria for academic achievement, whichever are higher, and presents disaggregated data on the enrolled candidates whose preparation begins during an academic year.

Academic Selection Samples: CAEP minimum criteria, measured (1) at admissions or (2) prior to program completion. “Starting in academic year 2016-2017, the CAEP minimum criteria apply to the group average of enrolled candidates whose preparation begins during an academic year. The provider determines whether the CAEP minimum criteria will be measured (1) at admissions, OR (2) at some other time prior to candidate completion.”

Let’s compare initial and advanced programs. Initial: “The CAEP minimum criteria are a grade point average of 3.0 and a group average performance on nationally normed assessments or substantially equivalent state-normed assessments of mathematical, reading, and writing achievement in the top 50 percent of those assessed.” Advanced: “The CAEP minimum criteria are a college grade point average of 3.0 or a group average performance on nationally normed assessments, or substantially equivalent state-normed or EPP-administered assessments, of mathematical, reading, and writing achievement in the top 50 percent of those assessed.”

Evidence Sufficiency Criteria, 3.2: CANDIDATES DEMONSTRATE ACADEMIC ACHIEVEMENT
Average scores for the group of candidates enrolled during an academic year meet the CAEP minimum GPA of 3.0, AND performance on nationally normed, substantially equivalent state-normed, or EPP-administered assessments is in the top 50% of all test takers of the selected assessment
Assessments examine candidate performance in mathematical and reading achievement (beginning in 2021, in writing achievement)
Group average: the GPA and standardized test scores are averaged for all members of a cohort or class of admitted candidates. Averaging does not require that every candidate meet the specified score; thus, there may be a range of candidates’ grades and scores on standardized tests.
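The group-average rule above lends itself to a simple check: average across the cohort rather than gating each candidate individually. A minimal sketch with hypothetical candidate scores (the 3.0 GPA and top-50% thresholds come from the criteria above; the reading of "top 50%" as a group-average percentile of 50 or higher is an assumption for illustration):

```python
# Hypothetical cohort; note candidate B falls below a 3.0 GPA individually,
# which CAEP's group-average rule permits as long as the cohort average passes.
cohort = [
    {"id": "A", "gpa": 3.4, "test_percentile": 62},
    {"id": "B", "gpa": 2.8, "test_percentile": 48},
    {"id": "C", "gpa": 3.1, "test_percentile": 55},
]

avg_gpa = sum(c["gpa"] for c in cohort) / len(cohort)
avg_pct = sum(c["test_percentile"] for c in cohort) / len(cohort)

# Assumed reading: "top 50% of those assessed" == group-average percentile >= 50.
meets_minimums = avg_gpa >= 3.0 and avg_pct >= 50
print(f"group GPA {avg_gpa:.2f}, group percentile {avg_pct:.1f}, meets: {meets_minimums}")
```

Here the cohort passes (group GPA 3.10, group percentile 55.0) even though one candidate is below 3.0, illustrating the distinction the "Group average" note draws.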

Component 3.3 – Key Language Educator preparation providers establish and monitor attributes and dispositions beyond academic ability that candidates must demonstrate at admissions and during the program.

Component 3.3 – Key Language The provider selects criteria, describes the measures used and evidence of the reliability and validity of those measures, and reports data that show how the academic and non-academic factors predict candidate performance in the program and effective teaching. Reflect on: What data can I present to demonstrate the other things (besides GPA and test scores) we look for at admissions that result in selecting high-quality candidates?

Non-Academic Samples
Admission to Teacher Education
Admission to Clinical Experience

Evidence Sufficiency Criteria, 3.3: PROVIDER ESTABLISHES/MONITORS CANDIDATE ATTRIBUTES/DISPOSITIONS BEYOND ACADEMICS
Rationale for established non-academic criteria makes an evidence-based case for their selection and implementation
Evidence that the EPP monitors candidate progress on established non-academic criteria at multiple points and takes appropriate actions based on results
Evidence of association/correlation of non-academic criteria with candidate and completer performance

Component 3.4 – Key Language The provider creates criteria for program progression and monitors candidates’ advancement from admissions through completion. All candidates demonstrate the ability to teach to college- and career-ready standards.

Component 3.4 – Key Language Providers present multiple forms of evidence to indicate candidates’ developing content knowledge, pedagogical content knowledge, pedagogical skills, and the integration of technology in all of these domains. Reflect on: What data can I present to demonstrate that my EPP continues to be selective of candidates throughout our programs?

Monitoring Table of Candidates

Evidence Sufficiency Criteria, 3.4: PROVIDER CRITERIA FOR PROGRAM PROGRESSION/MONITORING OF CANDIDATES
Evidence of candidates developing proficiencies at two or more gateways of progression, including:
Ability to teach to college- and career-ready standards
Pedagogical/content knowledge
Integration and use of technology

Evidence Sufficiency Criteria, 3.4: PROVIDER CRITERIA FOR PROGRAM PROGRESSION/MONITORING OF CANDIDATES
Results and stated candidate progression criteria align with evidence of actions taken, such as:
Changes in curriculum or clinical experiences
Providing interventions/counseling out

Component 3.5 – Key Language Before the provider recommends any completing candidate for licensure or certification, it documents that the candidate has reached a high standard for content knowledge in the fields where certification is sought and can teach effectively with positive impacts on P-12 student learning and development. Reflect on: What data can I present to demonstrate that exit criteria are rigorous?

Evidence Sufficiency Criteria, 3.5: PROVIDER DEMONSTRATES CANDIDATES HAVE CONTENT KNOWLEDGE IN CERTIFICATION FIELD
Evidence is the same as that for 1.1
Evidence of effective teaching, including positive impacts on P-12 student learning and development, for all candidates as noted in Standard 1

Component 3.6 – Key Language Before the provider recommends any completing candidate for licensure or certification, it documents that the candidate understands the expectations of the profession, including codes of ethics, professional standards of practice, and relevant laws and policies. CAEP monitors the development of measures that assess candidates’ success and revises standards in light of new results. Reflect on: What data can I present to document that our candidates understand the professional dos and don'ts of teaching?

Evidence Sufficiency Criteria, 3.6: PROVIDER DEMONSTRATES CANDIDATES UNDERSTAND EXPECTATIONS OF THE PROFESSION
Candidates’ understanding of codes of ethics and professional standards of practice
Evidence that candidates have knowledge of relevant laws and policies (504 disability provisions, education regulations, bullying, etc.)

Sandbox Screenshot of Standard 3 Initial-Level


Component A.3.1 – Key Language The provider sets goals and monitors progress for admission and support of high-quality advanced program candidates from a broad range of backgrounds and diverse populations to accomplish their mission. The admitted pool of candidates reflects the diversity of America’s teacher pool and, over time, should reflect the diversity of P-12 students.

Component A.3.1 – Key Language The provider demonstrates efforts to know and address community, state, national, regional, or local needs for school and district staff prepared in advanced fields. Reflect on: What recruitment evidence (plans and goals) do I have that demonstrates base points and annual monitoring?

Screenshot of Standard 3 Advanced-Level

Evidence Sufficiency Criteria, A.3.1: ADMISSION OF DIVERSE CANDIDATES WHO MEET EMPLOYMENT NEEDS
Recruitment plan with base points and annual monitoring, including academic ability, diversity, and employment needs
Data on applicants, admitted, and enrolled candidates are disaggregated by relevant demographics
Evidence that results are recorded, monitored, and used in planning and modification of recruitment strategies
Plan demonstrates knowledge of and addresses employment opportunities in schools, districts, and/or regions

Component A.3.2 – Key Language Required Component: The provider sets admissions requirements for academic achievement, including CAEP minimum criteria, the state’s minimum criteria, or graduate school minimum criteria, whichever is highest, and gathers data to monitor candidates from admission to completion.

Component A.3.2 – Key Language The provider determines additional criteria intended to ensure that candidates have, or develop, abilities to complete the program successfully and arranges appropriate support and counseling for candidates whose progress falls behind.

Academic Selection Samples: CAEP Minimum Criteria, measured (1) at admissions or (2) prior to program completion
3.0 GPA
Initial-Level or Advanced-Level Standards
State licensure test scores
Relevant surveys or assessments of completers

Academic Selection Samples: Other Proficiency Measures
Action Research
Capstones/Portfolios/Thesis
Dispositional/Professional Responsibility Data
Problem-based projects with coursework/group projects
Problem-based projects with school/district
Pre- and post-data and reflections on interpretations and use of data
End of key-course tests
Grades, by program field
Survey Data from Completers/Employers

Component A.3.3 – Key Language Before the provider recommends any advanced program candidate for completion, it documents that the candidate has reached a high standard for content knowledge in the field of specialization, data literacy and research-driven decision making, effective use of collaborative skills, applications of technology, and applications of dispositions, laws, codes of ethics and professional standards appropriate for the field of specialization. Reflect on: What documentation can I present to demonstrate that candidates (beyond GPA and test scores) have developed and progressed through the program?

EPP-Created Assessments: Academic Selection Samples
Action Research
Capstones/Portfolios/Thesis
Dispositional/Professional Responsibility Data
Problem-based projects with coursework/group projects
Problem-based projects with school/district

EPP-Created Assessments: Academic Selection Samples
Pre- and post-data and reflections on interpretations and use of data
End of key-course tests
Grades, by program field
Survey Data from Completers/Employers
+ State Assessments/Surveys
+ Other Proficiency Measures

EPP-Created Assessments (Advanced Standards) Resource: CAEP Evaluation Framework for EPP-Created Assessments

Evidence Sufficiency Criteria, A.3.3: SELECTIVITY DURING PREPARATION
Evidence of candidates developing proficiencies at two or more gateways of progression
Proficiencies to understand and apply knowledge and skills appropriate to the program’s fields of specialization (see generic skills in component A.1.1)

Evidence Sufficiency Criteria, A.3.3: SELECTIVITY DURING PREPARATION
Results and stated candidate progression criteria align with evidence of actions taken, such as:
Changes in curriculum or clinical experiences
Providing interventions/counseling out

Component A.3.4 – Key Language The provider creates criteria for program progression and uses disaggregated data to monitor candidates’ advancement from admissions through completion. Reflect on: What data can I present to demonstrate at exit the proficiencies of completing candidates?

Monitoring Table of Candidates

Evidence Sufficiency Criteria, A.3.4: SELECTION AT COMPLETION
Evidence is the same as that for A.1.1
Evidence of effective teaching, including positive impacts on P-12 student learning and development, for all candidates as noted in Standard A.1

In Summary - The Case for Standard 3/A.3 Information from several sources provides evidence of shared decision-making, collaboration among clinical faculty, and continuous functioning. Data are analyzed: differences and similarities across licensure/field areas, comparisons over time, and demographic data are examined in relation to components 3.1 and 3.2 (recruitment and admissions), as appropriate.

In Summary - The Case for Standard 3/A.3 Appropriate interpretations and conclusions are reached. Trends or patterns that suggest a need for preparation modification are identified. Based on the analysis of data, planned or completed actions for change are described.

Standard 4/A.4 Program Impact

CAEP Standard 4/A.4 Program Impact
Impact on P-12 Student Learning and Development: 4.1
Indicators of Teaching Effectiveness: 4.2
Satisfaction of Employers: 4.3 / A.4.1
Satisfaction of Completers: 4.4 / A.4.2

CAEP Standard 4 Program Impact WARNING: only initial-level programs report program impact data at this time!

New Guidance on Standard 4 (see CAEP Accreditation Weekly Update, September 22, 2017) EPPs have followed the CAEP Standards and Handbook procedures, and the CAEP review procedures have been in place with site teams and the Accreditation Council for decisions in October 2016 and April 2017. Now, for the first time, CAEP is able to describe what EPPs are including in their self-study reports as evidence and to provide actual examples.

Update on Standard 4 “The EPP self-study report evidence can be categorized by type. Drawing on the 17 cases decided in April 2017, we find that about one third of the evidence was for State or district measures of P-12 student learning or growth, and another fifth were case studies to create student learning data. One EPP provided student survey results, and another offered a teacher evaluation tool as evidence.” (p. 2)

CAEP Standard 4 Program Impact The provider demonstrates the impact of its completers on P-12 student learning and development [component 4.1], classroom instruction [component 4.2] and schools [component 4.3], and the satisfaction of its completers [component 4.4] with the relevance and effectiveness of their preparation.

Initial Component 4.1 – Key Language REQUIRED COMPONENT: The provider documents, using multiple measures, that program completers contribute to an expected level of student-learning growth. Multiple measures shall include all available growth measures (including value-added measures, student-growth percentiles, and student learning and development objectives) required by the state for its teachers and available to educator preparation providers, other state-supported P-12 impact measures, and any other measures employed by the provider.

Initial Component 4.1 – Key Language Reflect on: What evidence do you have that would demonstrate graduates’ impact on P-12 student learning? What research methodologies could you feasibly employ to gain such information?

EPPs that have access to data from states about completer impact: Demonstrate that they are familiar with the sources of the P-12 student learning impact data and the state’s model for preparing the data that are attributed to the EPP’s preparation program. Document the EPP’s analysis and evaluation of information provided on P-12 student learning.

EPPs that have access to data from states about completer impact: Interpret the data. Judge the implications of the data and analyses for the preparation program. If the data are judged to be invalid, use other valid evidence.

EPPs that do not have access to data from states about completer impact: The EPP creates data similar to state data in conjunction with student assessments and teacher evaluations conducted in school districts where some portion of its completers are employed. This type of EPP study could be phased in.

EPPs that do not have access to data from states about completer impact:
By 2016, all EPPs should at least have a design in place and pilot data collection under way
One year of data is needed for calendar year 2018
EPP collaborations are encouraged
This evidence is also presented by EPPs that are supplementing state or district data with data on subjects or grades not covered

4 Examples: Self-study report evidence that “early adopter” EPPs have submitted
Example 1: STATE University
Example 2: PRIVATE University
Example 3: PRIVATE College
Example 4: PUBLIC University
The next set of slides are excerpts with key points from each example. (Based on time constraints, we may not be able to go over these slides in detail. The examples are here for participants to access after the workshop.)

Example 1: STATE University P-12 academic achievement comparison using available data, with confirmation from correlated measures
Enrollment around 23,000; EPP enrollment around 2,300
Use data available to you (these are student growth measures)
Develop case study designs similar to the pre-service teacher work sample
Cites findings from research associating teaching strategies with an impact on student learning
FIRST, build a graduate tracking system!

Example 2: PRIVATE University P-12 student growth complemented by planned teacher action research
Enrollment around 3,100; EPP enrollment around 80
Student growth percentiles were available from the state, but the information was highly summarized, so it was of little use to the EPP
Pilot a teacher action research project using volunteering completers, constructed as an annually recurring activity
The design permits links with pre-service data for the same completers
Candidate tasks are similar to those in a teacher work sample assessment, including pre- and post-measures of student learning associated with teaching a comprehensive unit of instruction

Example 3: PRIVATE College Evidence from preservice and indirectly through employers (an example of what NOT to use)
The state shares no data with EPPs
An early adopter EPP attempts to respond to Standard 4
Almost all the evidence provided was taken from pre-service preparation, so it is not responsive to 4.1
The Self-Study Report included a richly descriptive principal survey with questions on assessment (although not on learning), but this is not a substitute for measuring student learning

Example 4: PUBLIC University (problematic) P-12 student value-added data as part of state teacher evaluation, complemented by planned teacher action research
State value-added data were available but aggregated, so of little use to the EPP
The data were part of a state teacher evaluation (50% student learning / 50% other factors), so evidence for 4.1 and 4.2 is linked
The EPP shows options it has considered to complement the state data and designates one as the path forward (a plan, but not yet data, on this element)
The provider will work with a school partner to gather and evaluate classroom data from novice teachers

Summary of Key Points, Component 4.1 Use available data, but learn their strengths and weaknesses State data on P-12 learning are highly variable from state to state, and early adopters generally found them insufficient as feedback on their own performance

Summary of Key Points, Component 4.1
Case studies were the most common approach for early adopters
Some are fashioned so that preservice Teacher Work Sample (TWS) assessments could serve as a point of comparison for new in-service teacher data
One approach took the form of teacher action research
Collaborating with partner school districts was one strategy
One important lesson: build a tracking system
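For the case-study and teacher-action-research designs described above, the core computation is typically a pre/post comparison of student learning around a unit of instruction. A minimal sketch of a gain-score summary (all scores and the class size are invented for illustration; real designs would also address attrition, matching, and reliability):

```python
# Hypothetical pre/post unit-test scores for one completer's class, the
# kind of evidence a teacher-action-research design might report for 4.1.
from statistics import mean

pre = [42, 55, 61, 48]
post = [58, 70, 75, 66]

gains = [after - before for before, after in zip(pre, post)]
print(f"mean gain: {mean(gains):.2f} points across {len(gains)} students")
```

Linking such summaries to the same completers' preservice Teacher Work Sample results is one way to build the comparison point the early adopters used.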

Step 1 Review - Rules for Standard 4/A.4 (general for all standards)
All components addressed
EPP-created assessments at CAEP level of sufficiency
At least 3 cycles of data
Cycles of data are sequential
Disaggregated data on candidates for main/branch campuses

Step 1 Review - Rules for Standard 4/A.4 (special for Standard 4)
All components for Standard 4 are required
All components must be met for the standard to be considered met
All phase-in requirements are met

Step 2. Inventory Evidence toward…
Candidate performance
Completer performance
Other CAEP requirements

Step 3. Gather Information, Categorize, and Prepare Gather evidence toward:
EPP overview
Standards, components
Cross-Cutting Themes
Areas For Improvement

Step 4. Take Stock with Stakeholders (faculty; clinical faculty; P-12 districts and/or schools; candidates)
Evidence for standards
Evidence criteria
Evidence quality
Review and seek feedback on what was learned from steps 1–3

Exercise 4: Supporting Evidence for Standard 4
Open: CAEP Standards 1-pager (Initial); Evidence Sufficiency Criteria for Initial Programs
Handout: Exercise 4, Supporting Evidence for Component 4.1
We will look at each of these resources and practice applying them to prospective evidence. Building a case that Standard 4 is met can begin with a review of the quality and sufficiency of each measure currently in use and of how the measures can combine into a well-balanced set that speaks to the evidence sufficiency criteria, followed by an exploration of how best to fill any gaps. Alternatively, it can begin with a review of the evidence sufficiency criteria, followed by an inventory of available evidence that meets the sufficiency criteria for assessments, and then an exploration of how best to fill any gaps. Since EPPs conduct assessments more for operational purposes than for accreditation purposes, we will begin by looking at the evidence sufficiency criteria to see where existing sources can serve dual purposes. The Evaluation Framework for EPP-Created Assessments is a general tool for thinking about the quality of individual instruments. The Evidence Evaluation Exercise is more directly tied to the evidence quality factors discussed in Component 5.2 and is more tailored to evaluating evidence for particular standards or components, whether EPP-created or not. This tool can be applied to individual measures or to sets of evidence, and provides a way to document that the whole is more than the sum of its parts, or what gaps remain even after the strengths of multiple sources are combined. This allows for a much more focused approach to selecting additional evidence.

Exercise 4: Supporting Evidence for Standard 4 Instructions: Select one of the measures that you listed for Component 4.1. Which category or categories in the evidence evaluation are the greatest source of concern for this assessment or evidence suite? In the Weaknesses space, list the concerns and the types of resources that would help you address them.

Exercise 4: Supporting Evidence for Standard 4

SUGGESTED EVIDENCE: IMPACT ON LEARNING Direct measures of student learning and development Addresses diverse subjects and grades P-12 impact or growth data from state teacher evaluations (if available) If state data are not available: Teacher-linked student assessments from districts Classroom-based research (e.g., action research, case studies)

EVIDENCE SUFFICIENCY CRITERIA, 4.1 SUFFICIENT EVIDENCE Presents multiple measures showing positive impact on student learning One or more state-provided or two or more EPP-generated From a representative or purposive sample of graduates 1-3 years post-exit EPP-generated data utilizes research-based methodology (e.g., case study, action research)

EVIDENCE SUFFICIENCY CRITERIA, 4.1 SUFFICIENT EVIDENCE Describes the measures and context Describes representativeness of sample/data Analyzes data and interprets results appropriately Conclusions are supported by results

Component 4.2 – Key Language REQUIRED COMPONENT: The provider demonstrates, through structured and validated observation instruments and/or student surveys, that completers effectively apply the professional knowledge, skills, and dispositions that the preparation experiences were designed to achieve. Reflect on: What evidence do I have (beyond measures of P-12 student learning) that would demonstrate in-service graduates are effective teachers?

SUGGESTED EVIDENCE: CLASSROOM INSTRUCTION Teaching Observations Aligned to the 4 InTASC categories Aligned to state standards for teachers / local teacher evaluation framework P-12 Student Surveys Aligned to the InTASC categories Corroboration for observation/evaluation data The 4 InTASC categories addressed in Standard 1 are: Learner and Learning, Content, Instructional Practice, and Professional Responsibility.

SUGGESTED EVIDENCE: CLASSROOM INSTRUCTION Employer Surveys Aligned to the InTASC Corroboration for observation/evaluation data The 4 InTASC categories addressed in Standard 1 are: Learner and Learning, Content, Instructional Practice, and Professional Responsibility.

EVIDENCE SUFFICIENCY CRITERIA, 4.2 SUFFICIENT EVIDENCE Measures classroom-based demonstration of professional knowledge, skills, and dispositions (e.g., InTASC, state/district teacher performance standards) Utilizing structured and validated teaching observation tools and/or P-12 student surveys Utilizing a representative sample that covers most licensure areas Obtaining survey return rates of 20% or higher

EVIDENCE SUFFICIENCY CRITERIA, 4.2 SUFFICIENT EVIDENCE Analyzes data and interprets results appropriately Conclusions are supported by results

Component 4.3: Key Language REQUIRED COMPONENT: The provider demonstrates, using measures that result in valid and reliable data and including employment milestones such as promotion and retention, that employers are satisfied with the completers’ preparation for their assigned responsibilities in working with P-12 students. Reflect on: What evidence do we have that would demonstrate that employers are satisfied with the professional knowledge, skills, and dispositions of your program graduates who are working at their location?

SUGGESTED EVIDENCE: SATISFACTION Completer Surveys Aligned to the InTASC categories Aligned to state standards for teachers / local teacher evaluation framework Can triangulate with observation/evaluation, survey, and impact data Employer Surveys Corroboration for observation/evaluation data

EVIDENCE SUFFICIENCY CRITERIA, 4.3 SUFFICIENT EVIDENCE Shows that employers perceive completers’ preparation was sufficient for their job responsibilities and attainment of employment milestones (e.g., retention) Utilizing valid and reliable measures Obtaining response rates of 20% or higher
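The 20% response-rate threshold recurs throughout the sufficiency criteria for Components 4.2 through 4.4. As a minimal illustration of how an EPP might check its survey data against that threshold (the function names and figures below are hypothetical, not taken from CAEP materials):

```python
# Hypothetical sketch: checking a survey response rate against the
# 20% sufficiency threshold cited in the CAEP evidence criteria.

def response_rate(responses: int, surveyed: int) -> float:
    """Return the survey response rate as a percentage."""
    if surveyed <= 0:
        raise ValueError("surveyed must be positive")
    return 100.0 * responses / surveyed

def meets_caep_threshold(responses: int, surveyed: int,
                         threshold: float = 20.0) -> bool:
    """True if the response rate meets or exceeds the threshold percentage."""
    return response_rate(responses, surveyed) >= threshold

# Illustrative figures: 34 employer surveys returned out of 150 sent.
rate = response_rate(34, 150)
print(f"{rate:.1f}%", meets_caep_threshold(34, 150))  # prints: 22.7% True
```

Note that the 20% figure is a floor, not a target; a higher rate strengthens the representativeness claim the criteria also ask for.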

EVIDENCE SUFFICIENCY CRITERIA, 4.3 SUFFICIENT EVIDENCE Describes representativeness of sample/data for licensure areas Discusses satisfaction patterns with respect to employment contexts (e.g., shortage fields, hard-to-staff schools, schooling level, school demographics) Data analysis is appropriate and conclusions are supported by data

Component 4.4: Key Language REQUIRED COMPONENT: The provider demonstrates, using measures that result in valid and reliable data, that program completers perceive their preparation as relevant to the responsibilities they confront on the job, and that the preparation was effective. Reflect on: What evidence do we have that would demonstrate your program graduates are satisfied with how well the program prepared them for their job?

SUGGESTED EVIDENCE: SATISFACTION Completer Surveys Aligned to the InTASC categories Aligned to state standards for teachers / local teacher evaluation framework Can triangulate with observation/evaluation, survey, and impact data Employer Surveys Corroboration for observation/evaluation data

EVIDENCE SUFFICIENCY CRITERIA, 4.4 SUFFICIENT EVIDENCE Shows that completers perceive their preparation was sufficient for their job responsibilities and was effective Utilizing valid and reliable measures Obtaining response rates of 20% or higher

EVIDENCE SUFFICIENCY CRITERIA, 4.4 SUFFICIENT EVIDENCE Describes representativeness of sample/data for licensure areas Discusses satisfaction patterns with respect to employment contexts (e.g., shortage fields, hard-to-staff schools, schooling level, school demographics) Data analysis is appropriate and conclusions are supported by data

Component A.4.1 – Key Language REQUIRED COMPONENT: The provider demonstrates that employers are satisfied with completers’ preparation and that completers reach employment milestones such as promotion and retention. Reflect on: What evidence do we have that would demonstrate employers are satisfied with our completers’ preparation and that completers reach employment milestones?

Step 2. Inventory Evidence toward… Candidate performance Completer performance Other CAEP requirements

Step 3. Gather Information, Categorize, and Prepare Gather evidence toward... EPP overview Standards, components Cross-Cutting Themes Areas for Improvement

Step 4. Take Stock with Stakeholders: Faculty, Clinical Faculty, P-12 Districts and/or Schools, and Candidates Evidence for Standards Evidence Criteria Evidence Quality Review and seek feedback on what was learned from steps 1–3

Exercise A4: Supporting Evidence for Standard A.4 Open: CAEP Standards 1-pager (Advanced) Evidence Sufficiency Criteria for Advanced Programs Handout: Exercise A4, Supporting Evidence for Component A.4.1

Exercise A4: Supporting Evidence for Standard A.4 Instructions: Select one of the measures that you listed for Component A.4.1. Which category or categories in the evidence evaluation are the greatest source of concern for this assessment or evidence suite? In the Weaknesses space, list the concerns and the types of resources that would help you address them.

Exercise A4: Supporting Evidence for Standard A.4

EVIDENCE SUFFICIENCY CRITERIA, A.4.1 SATISFACTION OF EMPLOYERS Provider includes appropriate analysis and interpretation of results Describes a system for analysis, evaluation, and interpretation of data Utilizing valid and reliable measures Obtaining response rates of 20% or higher

EVIDENCE SUFFICIENCY CRITERIA, A.4.1 SATISFACTION OF EMPLOYERS Conclusions supported by data Provides documentation of employment milestones

Component A.4.2 – Key Language REQUIRED COMPONENT: The provider demonstrates that advanced program completers perceive their preparation as relevant to the responsibilities they confront on the job, and that the preparation was effective. Reflect on: What evidence do we have that would demonstrate our advanced program completers perceive their preparation as relevant and effective for their job responsibilities?

EVIDENCE SUFFICIENCY CRITERIA, A.4.2 SATISFACTION OF COMPLETERS Provider includes appropriate analysis and interpretation of results Describes a system for analysis, evaluation, and interpretation of data Utilizing valid and reliable measures Obtaining response rates of 20% or higher

EVIDENCE SUFFICIENCY CRITERIA, A.4.2 SATISFACTION OF COMPLETERS Evidence of an adequate and representative sample Analysis and interpretation of data aligned to standard/component Conclusions supported by data

In Summary - The Case for Standard 4/A.4 Information is provided from several sources and provides evidence of program impact on graduates (in-service). Data are analyzed for completer effectiveness, completer satisfaction, and employer satisfaction. Differences and similarities across licensure/field areas and demographic categories are examined.

In Summary - The Case for Standard 4/A.4 Appropriate interpretations and conclusions are reached. Trends or patterns are identified that suggest a need to modify preparation or to “stay the course.” Based on the analysis of data, planned or completed actions for change are described.

Brain Break

Cross-Cutting Themes: Diversity and Technology

Cross-Cutting Themes Embedded in Every Aspect of Educator Preparation: Diversity and Technology run through coursework, fieldwork, and interpersonal interactions.

Cross-Cutting Themes: Diversity Standard 1 Candidates must demonstrate skills and commitment that provide all P-12 students access to rigorous college and career ready standards. Standard 2 Clinical experiences prepare candidates to work with all students. Standard 3 Providers are committed to outreach efforts to recruit a more able and diverse candidate pool.

Cross-Cutting Themes: Diversity

Cross-Cutting Themes: Technology Standard 1 Endorses InTASC teacher standards. Providers are to “…ensure that candidates model and apply technology standards as they design, implement, and assess learning experiences to engage students, improve learning, and enrich professional practice.”

Cross-Cutting Themes: Technology Standard 2 Technology-enhanced learning opportunities Appropriate technology-based applications Technology-based collaborations Standard 3 Candidates integrate technology into all learning domains.

Cross-Cutting Themes: Technology

Areas for Improvement (AFIs)

AFIs An EPP must address AFIs in its Annual Report. During the next accreditation review, the EPP must demonstrate that the AFIs have been corrected. If they have not, a stipulation may be cited in the same area.

Review: Seven Steps to Preparing the Self-Study Report (SSR) Review the CAEP standards Inventory available evidence Gather information, categorize, and prepare evidence to upload, and draft the table to be completed Take stock Analyze and discuss the evidence and the draft of the Self-Study Report Formulate summary/narrative statements Draft the Self-Study Report This iterative process is used throughout the writing of the SSR and will be repeated throughout the presentation.

Step 7. Draft Self-Study Report Compile a complete draft of the report Including evidence tagged to the appropriate standard(s), component(s), cross-cutting themes, and data quality documentation Summary and analysis statements Review the draft with stakeholders Revise as needed Upload the final report into the Accreditation Information Management System (AIMS)

What remaining questions do you have about writing your Self-Study Report? https://www.polleverywhere.com/discourses/ycjHsqSeumdHhwT

Thank You

Contact Information Vince O’Neill, EdD Accreditation Director Counselors and Accreditation Decisions vince.oneill@caepnet.org Gary Railsback, PhD, MBA CAEP Volunteer Lead Site Visitor Azusa Pacific University grailsback@apu.edu

Resources Login for Sandbox: Initial Only: ID 29535 PW boe1 Advanced Only: ID 29319 PW boe0 Advanced and Initial: ID 29536 PW boe2 Example Narrative: NC State’s Self-Study Narrative: go.ncsu.edu/create There are three separate test environments, one for each type of EPP. You can access any or all of these “sandboxes” at any time.