1
FALL 2019 AND BEYOND!!! Preparing and Writing the Self-Study Report
Tatiana Rivadeneyra, Ed.D. Accreditation Director, Site Visitor Development and EPP Accreditation
2
Overview
Formatting, the submission process, and tips for writing a Self-Study Report toward CAEP Accreditation. Common problems are highlighted, with tips for making the case that CAEP standards are met and for increasing the manageability of the process.
3
Preparing the Self-Study Report - STEPS
1. Review what must be demonstrated for the standards
2. Inventory available evidence
3. Gather information; categorize and prepare evidence to upload; draft the tables to be completed
4. Take stock
5. Analyze and discuss the evidence and the draft of the Selected Improvement Plan
6. Formulate summary/narrative statements
7. Draft and submit the Self-Study Report
4
Step 1. Review for Initial Standards
- Evidence Sufficiency Criteria: Evaluation Criteria for Self-Study Evidence – All Standards
- CAEP Guidelines for Plans, for phase-in plan content: SSRs can present a plan with progress data; site visits in Fall 2018 and beyond are not eligible for phase-in
- Assessment Sufficiency Criteria: CAEP Evaluation Framework for EPP-Created Assessments
5
Step 1. Review for Advanced Standards
- Evidence Sufficiency Criteria: Evaluation Criteria for Self-Study Evidence – All Standards
- CAEP Guidelines for Plans, for phase-in plan content: SSRs submitted through academic year 2018/2019 can include plans for the Standards for Advanced-Level Programs and designated components; SSRs can present a plan with progress data for those standards and components; site visits in Fall 2022 and beyond are not eligible for phase-in
- Assessment Sufficiency Criteria: CAEP Evaluation Framework for EPP-Created Assessments

The process of building a case that a standard is met can begin with a review of the quality and sufficiency of each measure currently in use and how the measures can combine into a well-balanced set that speaks to the evidence sufficiency criteria, followed by an exploration of how best to fill any gaps. Alternatively, the case can begin with a review of the evidence sufficiency criteria, followed by an inventory of available evidence that meets the sufficiency criteria for assessments, and then an exploration of how best to fill any gaps. Since EPPs conduct assessments more for operational purposes than for accreditation purposes, it may help to begin with the evidence sufficiency criteria to see where existing sources can serve dual purposes.

The Evaluation Framework for EPP-Created Assessments is a general tool for thinking about the quality of individual instruments. The Evidence Evaluation Exercise is more directly tied to the evidence quality factors discussed in component 5.2/A and is more tailored to evaluating evidence for particular standards or components, whether EPP-created or not. It can be applied to individual measures or to sets of evidence, and provides a way to document that the whole is more than the sum of its parts, or what gaps remain even after the strengths of multiple sources are combined. This allows a much more focused approach to selecting additional evidence.
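To make the inventory idea concrete, here is a minimal sketch in Python of the mapping described above: each measure in use is tagged with the standards/components it supports, and components with thin support are flagged as gaps. The measure names, component tags, and the two-measure threshold are hypothetical illustrations, not CAEP requirements.

```python
# A minimal sketch (hypothetical measure names, component tags, and
# threshold) of an evidence inventory: map each measure in use to the
# standards/components it supports, then flag thinly supported components.

inventory = {
    "Clinical observation instrument": ["1.1", "2.3"],
    "Lesson-plan rubric": ["1.1", "1.4"],
    "Employer survey": ["4.3"],
    "Completer survey": ["4.4"],
}

components = ["1.1", "1.2", "1.3", "1.4", "1.5",
              "2.1", "2.2", "2.3", "4.1", "4.2", "4.3", "4.4"]

# Invert the inventory: component -> list of supporting measures.
support = {c: [] for c in components}
for measure, tags in inventory.items():
    for tag in tags:
        support.setdefault(tag, []).append(measure)

for component in components:
    measures = support[component]
    status = "OK" if len(measures) >= 2 else "GAP"
    print(f"{component}: {status} ({len(measures)} measure(s)) {measures}")
```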
6
Step 2. Inventory Evidence toward…
- Candidate performance
- Completer performance
- Other CAEP requirements
7
Step 3. Gather Information, Categorize, and Prepare
Gather evidence toward...
- EPP overview
- Standards, components
- Cross-Cutting Themes
- Areas For Improvement (legacies)
8
Step 4. Take Stock with Stakeholders: Faculty, Clinical Faculty, P-12 Districts and/or Schools, Candidates
Review and seek feedback on what was learned from steps 1–3:
- Evidence for standards
- Evidence criteria
- Evidence quality
9
Step 5. Analyze and Interpret
Analyze and interpret the evidence and assessment results, and the Educator Preparation Provider's program management and operations (e.g., systems, processes, and practices) related to meeting the CAEP standards. Develop the plan for action.
10
Step 6. Formulate Summative and Analytical Statements
- Frame the argument to be made for the standard: what points will be offered, and how they support the argument
- Describe the data sources and their representativeness relevant to supporting the standard: why are the data credible for this standard?
- Present the results in a way that aligns with the standard
- Draw a conclusion about how the data support the standard
- Where appropriate, address triangulation and convergence of different forms of evidence; this compensates for the limitations of any one data source
- Discuss the implications of the findings for subsequent action by the provider
11
Summative Statements vs. Analytic Statements
Why are summary statements not enough? Summaries identify the information provided, but they:
- Do not provide analysis and interpretation of the quality of the evidence and data
- Do not identify trends/patterns, comparisons, gaps, and/or differences
- Do not demonstrate that the provider conducted its own (required) analysis and evaluation
12
A Summative Statement
DC State College has provided data on the accepted applicant cohort's GPAs and SAT scores required for admission. Many of the EPP's candidates have a 3.0 GPA average and are in the top 30th percentile.
13
An Analytical Statement
The Educator Preparation Provider (EPP) requires applicants to have a 2.8 GPA and to be in the top 50th percentile on the SAT or ACT (Exhibit 32, Catalog Admission Requirements). The data provided indicate that the average GPA for the cohort of 22 admitted candidates was 3.3, and that applicants were in the top 30th percentile on the SAT and the top 40th percentile on the ACT. These data are also consistent with the scores of earlier accepted cohorts.
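The figures cited in an analytical statement like the one above (cohort size, average GPA, average test percentile) are simple aggregates. A minimal sketch, using invented admissions records and hypothetical field names:

```python
# A minimal sketch of the aggregates cited in an analytical statement:
# cohort size, average GPA, and average test percentile. The records and
# field names below are invented for illustration.

from statistics import mean

admitted = [
    {"gpa": 3.4, "sat_percentile": 72},
    {"gpa": 3.1, "sat_percentile": 68},
    {"gpa": 3.5, "sat_percentile": 75},
    # ...one record per admitted candidate in the cohort
]

cohort_gpa = mean(c["gpa"] for c in admitted)
cohort_pct = mean(c["sat_percentile"] for c in admitted)

print(f"Admitted cohort: n={len(admitted)}")
print(f"Average GPA: {cohort_gpa:.2f} (EPP requirement: 2.8)")
print(f"Average SAT percentile: {cohort_pct:.1f} (EPP requirement: top 50%)")
```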
14
Always Consider… Remember
The EPP's analysis and interpretations:
- Quality
- Appropriateness
- Sufficiency
- Trends
- Gaps
15
Standard 1/A.1 Content and Pedagogical Knowledge
16
CAEP Standard 1/A.1 Content Knowledge and Pedagogical Knowledge
- Candidate Knowledge, Skills, and Professional Dispositions: 1.1 / A.1.1
- Provider Responsibilities: A.1.2
17
CAEP Standard 1/A.1 Content Knowledge and Pedagogical Knowledge
Initial: The provider ensures that candidates develop a deep understanding of the critical concepts and principles of their discipline [components 1.1, 1.3] and, by completion, can use discipline-specific practices flexibly to advance the learning of all students toward attainment of college- and career-readiness standards [component 1.4].
Advanced: The provider ensures that candidates for professional specialties develop a deep understanding of the critical concepts and principles of their field of preparation [component A.1.1] and, by completion, can use professional specialty practices flexibly to advance the learning of all P-12 students toward attainment of college- and career-readiness standards [component A.1.2].
18
Standard 1/A.1’s Holistic Case
The development of content knowledge and pedagogical knowledge comes together in candidates' understanding of content, discipline-specific practices, and the ability to address college- and career-readiness standards for all students.
…CAEP Standards for Initial/Advanced Programs, Evidence Sufficiency Criteria: the whole is more than merely the sum of its parts.
19
Step 1. Rules for Standard 1/A.1
General for all Standards:
- Key concepts in the standard and components are addressed
- EPP-created assessments are at the CAEP level of sufficiency
- At least 3 cycles of data
- Cycles of data are sequential and the latest available
- Disaggregated data on candidates, for main/branch campuses (if applicable)

Special for Standard 1/A.1:
- No required components
- All data disaggregated by specialty licensure area
- Evidence from Standard 1/A.1 cited in support of continuous improvement, as part of the overall review system
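The disaggregation rule above can be checked mechanically. A minimal sketch, assuming invented records with hypothetical licensure areas and rubric scale, that splits three sequential cycles of assessment data by specialty licensure area:

```python
# A minimal sketch of disaggregating three sequential cycles of assessment
# data by specialty licensure area. Records, areas, and the rubric scale
# are invented for illustration.

from collections import defaultdict
from statistics import mean

records = [
    # (cycle, licensure_area, rubric_score)
    ("2016-17", "Elementary", 3.2), ("2016-17", "Secondary Math", 2.9),
    ("2017-18", "Elementary", 3.4), ("2017-18", "Secondary Math", 3.0),
    ("2018-19", "Elementary", 3.3), ("2018-19", "Secondary Math", 3.1),
]

by_area_cycle = defaultdict(list)
for cycle, area, score in records:
    by_area_cycle[(area, cycle)].append(score)

for (area, cycle), scores in sorted(by_area_cycle.items()):
    print(f"{area:15s} {cycle}: mean={mean(scores):.2f} (n={len(scores)})")
```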
20
Step 2. Inventory Evidence toward…
- Candidate performance
- Completer performance
- Other CAEP requirements
21
Step 3. Gather Information, Categorize, and Prepare
Gather evidence toward...
- EPP overview
- Standards, components
- Cross-Cutting Themes
- Areas For Improvement
- Selected Improvement Plan
22
Component 1.1 – Key Language
Candidates demonstrate an understanding of the 10 InTASC standards at the appropriate progression level(s) in the following categories: the learner and learning; content; instructional practice; and professional responsibility. Reflect on: What evidence do I have that would demonstrate developing an understanding over time in these four categories?
23
Evidence Sufficiency Criteria, 1.1
Candidates demonstrate understanding of the 10 InTASC standards:
- All four InTASC categories are addressed, with multiple indicators across the four categories
- Indicators/measures specific to the application of content knowledge in clinical settings are identified
- Data/evidence are analyzed, including identification of trends/patterns, comparisons, and/or differences
- Averages are at/above acceptable levels on the EPP's scoring indicators for the InTASC standards (categories)
- If applicable, candidates' performance is shown to be comparable to non-candidates' performance in the same courses or majors
- Performances indicate competency and are benchmarked against the average licensure-area performance of other providers
- Interpretations and conclusions are supported by data/evidence
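One way to verify the first criterion, that all four InTASC categories are addressed with multiple indicators, is a simple coverage check over the EPP's indicator-to-category map. The indicator names below are hypothetical examples, not a CAEP-provided list:

```python
# A minimal sketch of a coverage check: every InTASC category should be
# addressed by more than one indicator. The indicator-to-category mapping
# below is a hypothetical example.

INTASC_CATEGORIES = [
    "The Learner and Learning",
    "Content",
    "Instructional Practice",
    "Professional Responsibility",
]

indicators = {
    "Observation item 4: differentiation": "The Learner and Learning",
    "Observation item 7: content accuracy": "Content",
    "Lesson plan rubric row 2: standards alignment": "Content",
    "Portfolio task 3: assessment use": "Instructional Practice",
    "Disposition survey item 9: ethics": "Professional Responsibility",
}

for category in INTASC_CATEGORIES:
    hits = [name for name, cat in indicators.items() if cat == category]
    flag = "multiple indicators" if len(hits) > 1 else "NEEDS MORE EVIDENCE"
    print(f"{category}: {len(hits)} indicator(s) -> {flag}")
```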
24
COMPONENT A.1.1: Key Language
Candidates for advanced preparation demonstrate their proficiencies to understand and apply knowledge and skills appropriate to their professional field of specialization so that learning and development opportunities for P-12 students are enhanced, through:
- Application of data literacy;
- Use of research and understanding of qualitative, quantitative, and/or mixed-methods research methodologies;
- Use of data analysis and evidence to develop supportive school environments;
- Leading and/or participating in collaborative activities with others such as peers, colleagues, teachers, administrators, community organizations, and parents;
- Application of appropriate technology for their field of specialization; and
- Application of professional dispositions, laws and policies, codes of ethics, and professional standards appropriate to their field of specialization.
Evidence of candidate content knowledge appropriate for the professional specialty will be documented by state licensure test scores or other proficiency measures.
25
EVIDENCE SUFFICIENCY CRITERIA, A.1.1
Candidates demonstrate understanding of professional skills:
- Demonstrates that most candidates pass state/nationally-benchmarked content/licensure exams
- Addresses all of the professional skills listed in the component
- Documents proficiency for at least three of the skills for each specialty field
- Utilizes multiple measures to assess each proficiency
- Utilizes measures that meet the criteria in the CAEP Evaluation Framework for EPP-Created Assessments
- Phase-in plans for component A.1.1 meet the criteria of the CAEP Guidelines for Plans and are consistent with the phase-in schedule
26
Step 4. Take Stock with Stakeholders: Faculty, Clinical Faculty, P-12 Districts and/or Schools, Candidates
Review and seek feedback on what was learned from steps 1–3:
- Evidence for standards
- Evidence criteria
- Evidence quality
27
Evaluating EPP-Created Assessments
28
Expectations for Evidence: Standard 5
Standard 5 relates to Quality Assurance and Continuous Improvement. The intent of the standard is to focus attention on the need for higher-quality data and the regular review and use of those data by programs. The standard states: "The provider maintains a quality assurance system comprised of valid data from multiple measures, including evidence of candidates' and completers' positive impact on P-12 student learning and development."
29
Component 5.2/A: Key language
The provider’s quality assurance system relies on relevant, verifiable, representative, cumulative and actionable measures, and produces empirical evidence that interpretations of data are valid and consistent. Reflect on: What evidence do I have that would demonstrate the quality of assessment measures?
30
EPP-Created Assessments
EPP-Created Assessments, Initial-Level Standards (Standard 1, Component 1.1):
- Clinical experience/observation instruments
- Lesson/unit plans (rubrics)
- Portfolios
- Teacher work samples
- GPA, specific courses
- P-12 learner dispositional data
- Comparisons of education and other IHE attendees on provider end-of-major projects
- End-of-course/program assessments
- Pre-service measures of candidate impact
- Capstone/thesis
Resource: CAEP Evaluation Framework for EPP-Created Assessments
31
EPP-Created Assessments
EPP-Created Assessments, Advanced-Level Standards (Standard A.1, Component A.1.1):
- Action research
- Capstones/portfolios/thesis
- Dispositional/professional responsibility data
- Problem-based projects with coursework/group projects
- Problem-based projects with a school/district
- Pre- and post-data and reflections on interpretations and use of data
- End-of-key-course tests
- Grades, by program field
- Survey data from completers/employers
Resource: CAEP Evaluation Framework for EPP-Created Assessments
32
Evidence Quality: Relevant
Evidence that the measures provided are applicable to CAEP standards/components (relates to validity)
33
Evidence Quality: Representative
Evidence that data samples are free of bias and are typical of completed assessments. If not, the EPP clearly delineates the limits of generalizability (relates to validity)
34
Evidence Quality: Actionable
Evidence is accessible and in a form that can guide the EPP in evaluating outcomes, making decisions, and modeling, implementing, and evaluating innovations (relates to methodology, reliability, and validity)
35
Evidence Quality: Verifiable
Data records are accurate and analyses can be replicated by a third party (relates to reliability)
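Verifiability is easier to demonstrate when the reliability analysis itself is a small, replicable computation. A minimal sketch of one common approach, percent agreement and Cohen's kappa for two raters scoring the same candidate work; the scores are invented, and CAEP does not prescribe this particular statistic:

```python
# A minimal sketch of one replicable reliability analysis: percent
# agreement and Cohen's kappa for two raters scoring the same candidate
# work on a 4-point rubric. The scores are invented for illustration.

from collections import Counter

rater_a = [3, 2, 3, 4, 2, 3, 3, 4, 2, 3]
rater_b = [3, 2, 3, 3, 2, 3, 4, 4, 2, 3]

n = len(rater_a)
observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

# Chance agreement, estimated from each rater's marginal score frequencies.
count_a, count_b = Counter(rater_a), Counter(rater_b)
expected = sum(count_a[s] * count_b[s]
               for s in set(rater_a) | set(rater_b)) / n ** 2

kappa = (observed - expected) / (1 - expected)
print(f"Percent agreement: {observed:.2f}")
print(f"Cohen's kappa:     {kappa:.2f}")
```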
36
Evidence Quality: Cumulative
Data sets are based on multiple concordant measures for each standard and ≥ 3 administrations of the assessments
37
Component 1.2 – Key Language
Providers ensure that candidates use research and evidence to develop an understanding of the teaching profession and use both to measure their P-12 students’ progress and their own professional practice. Reflect on: What evidence do I have that would demonstrate using research and assessment (evidence) for student and professional learning?
38
Evidence Sufficiency Criteria, 1.2
Candidates use research/evidence toward the teaching profession. Data/evidence document effective candidate use of:
- Research/evidence for planning, implementing, and evaluating student progress
- Data to reflect on teaching effectiveness and their own practice
- Data to assess P-12 student progress and then modify instruction based on student data
39
Component 1.3 – Key Language
Providers ensure that candidates apply content and pedagogical knowledge as reflected in outcome assessments in response to standards of Specialized Professional Associations (SPA), the National Board for Professional Teaching Standards (NBPTS), states, or other accrediting bodies (e.g., National Association of Schools of Music – NASM). Reflect on: What evidence do I have that would demonstrate the application of CK and PK in response to other professional standards?
40
Evidence Sufficiency Criteria, 1.3
Candidates apply content/pedagogical knowledge in response to SPAs:
- Licensure-area questions are completed and supported by analysis and accurate interpretations of specialty licensure area data

Note: The specialty licensure area questions are:
- How have the results of specialty licensure area or SPA evidence been used to inform decision making and improve instruction and candidate learning outcomes?
- What has been learned about different specialty licensure areas as a result of the review of the disaggregated data?
- How do the specialty licensure area data provide evidence for meeting the state-selected standards?
- How is specialty licensure area evidence aligned with the identified state standards?
41
Component 1.4 – Key Language
Providers ensure that candidates demonstrate skills and commitment that afford all P-12 students access to rigorous college- and career-ready standards (e.g., Next Generation Science Standards, National Career Readiness Certificate, Common Core State Standards). Reflect on: What evidence do I have that would demonstrate skills and commitment to access for all students?
42
Evidence Sufficiency Criteria, 1.4
Candidates demonstrate proficiency toward college- and career-ready standards. Multiple indicators/measures specific to evaluating candidates' ability to:
- Provide effective instruction for all students (differentiation of instruction)
- Have students apply knowledge to solve problems and think critically
- Include cross-discipline learning experiences and teach for transfer of skills
- Design and implement learning experiences that require collaboration and communication skills
43
Component 1.4 Suggested Evidence
Evidence specific to college- and career-readiness: plans, assignments, and observational data demonstrate candidates' skills for:
- Deep content knowledge
- Eliciting P-12 students' application of their knowledge to solve problems and think critically
- Cross-discipline teaching
- Differentiating instruction
- Identifying and interpreting assessments that match P-12 college- and career-readiness goals/objectives
44
Component 1.5 – Key Language
Providers ensure that candidates model and apply technology standards as they design, implement and assess learning experiences to engage students and improve learning; and enrich professional practice. Reflect on: What evidence do I have that would demonstrate modeling and application of technology skills to enhance learning for students and self?
45
Evidence Sufficiency Criteria, 1.5
Candidates model and apply technology. Candidates demonstrate:
- Knowledge and skill proficiencies, including accessing databases, digital media, and/or electronic sources
- The ability to design and facilitate digital learning
- The ability to track and share student performance data digitally
46
Component 1.5, Technology…
- Design: analysis of learning and teaching
- Facilitate: planning for integration of instructional technology
- Evaluate: post-instruction evaluation and review
47
Component A.1.2 – Key Language
Providers ensure that advanced program completers have opportunities to learn and apply specialized content and discipline knowledge contained in approved state and/or national discipline-specific standards. These specialized standards include, but are not limited to, Specialized Professional Association (SPA) standards, individual state standards, standards of the National Board for Professional Teaching Standards, and standards of other accrediting bodies [e.g., Council for Accreditation of Counseling and Related Educational Programs (CACREP)]. Reflect on: What evidence do I have that would demonstrate candidates' application of advanced professional knowledge at professional-standard levels?
48
EVIDENCE SUFFICIENCY CRITERIA, A.1.2
Candidates apply advanced preparation knowledge:
- Documents that the majority of programs meet the standards of the selected program review option(s)
- A majority submitted for SPA review achieved National Recognition
- State review reports document how well individual programs perform in relation to the state's selected standards and that the majority meet the standards
- Program Review with Feedback results show that the state-selected state or national standards are met for the majority of programs
- Includes a discussion of performance trends and comparisons across specialty areas
Note: Component A.1.2 is not eligible for phase-in plan submission.
49
Step 5. Analyze and Interpret
Analyze and interpret the evidence and assessment results, and the Educator Preparation Provider's program management and operations (e.g., systems, processes, and practices) related to meeting the CAEP standards. Develop the plan for action.
50
In Summary - The Case for Standard 1/A.1
Information is provided from several sources and provides evidence of candidate knowledge, skills, and dispositions. Grades, scores, pass rates, and other data are analyzed. Differences and similarities across licensure/field areas, comparisons over time, and demographic data are examined. Appropriate interpretations and conclusions are reached. Trends or patterns that suggest a need for preparation modifications are identified. Based on the analysis of data, planned or completed actions for change are described.
51
CAEP Tips… Standard 1
52
Caution: EPP-Created Assessments
Yes: multiple assessments, each with multiple indicators.
No: a single assessment with multiple indicators mapped to multiple components to demonstrate a standard.
53
Program Review Options
Available program review options for EPPs in states with agreements:
- SPA review with National Recognition (3 years prior to site visit)
- CAEP program review with feedback (part of the self-study report)
- State review of programs (determined by state)
Available program review options for EPPs in states without agreements:
- CAEP program review with feedback (part of the self-study report)
54
Standard 2/A.2 Clinical Partnerships and Practice
55
CAEP Standard 2/A.2 Clinical Partnerships and Practice
- Partnerships for Clinical Preparation: 2.1 / A.2.1
- Clinical Educators: 2.2
- Clinical Experiences: 2.3 / A.2.2
56
CAEP Standard 2/A.2 Clinical Partnerships and Practice
Initial: The provider ensures that effective partnerships [components 2.1 and 2.2] and high-quality clinical practice [component 2.3] are central to preparation so that candidates develop the knowledge, skills, and professional dispositions necessary to demonstrate positive impact on all P-12 students' learning and development.
Advanced: The provider ensures that effective partnerships [component A.2.1] and high-quality clinical practice [component A.2.2] are central to preparation so that candidates develop the knowledge, skills, and professional dispositions appropriate for their professional specialty field.
57
Standard 2/A.2’s Holistic Case
Collaborative clinical preparation is only as strong as the P-12 partnerships, the clinical educators (initial), and the clinical experiences.
…CAEP Standards for Initial/Advanced Programs, Evidence Sufficiency Criteria: the whole is more than merely the sum of its parts.
58
Step 1. Rules for Standard 2/A.2
General for all Standards:
- Key concepts in the standard and components are addressed
- EPP-created assessments are at the CAEP level of sufficiency
- At least 3 cycles of data
- Cycles of data are sequential and the latest available
- Disaggregated data on candidates, for main/branch campuses (if applicable)

Special for Standard 2:
- No required components
59
Step 2. Inventory Evidence toward…
- Candidate performance
- Completer performance
- Other CAEP requirements
60
Step 3. Gather Information, Categorize, and Prepare
Gather evidence toward...
- EPP overview
- Standards, components
- Cross-Cutting Themes
- Areas For Improvement
61
Step 4. Take Stock with Stakeholders: Faculty, Clinical Faculty, P-12 Districts and/or Schools, Candidates
Review and seek feedback on what was learned from steps 1–3:
- Evidence for standards
- Evidence criteria
- Evidence quality
62
Component 2.1 – Key Language
Partners co-construct mutually beneficial P-12 school and community arrangements, including technology-based collaborations, for clinical preparation and share responsibility for continuous improvement of candidate preparation. Partnerships for clinical preparation can follow a range of forms, participants, and functions. They establish mutually agreeable expectations for candidate entry, preparation, and exit; ensure that theory and practice are linked; maintain coherence across clinical and academic components of preparation; and share accountability for candidate outcomes. Reflect on: What evidence do I have that would demonstrate mutually beneficial and accountable partnerships in which decision-making is shared?
63
Evidence Sufficiency Criteria, 2.1
Evidence that a collaborative process is in place and reviewed. Documentation is provided for a shared responsibility model that includes elements of:
- Input into curriculum development
- Co-construction of instruments and evaluations
- Co-construction of criteria for the selection of mentor teachers
- EPP and P-12 educators providing descriptive feedback to candidates
- Opportunities for candidates to observe and implement effective teaching strategies linked to coursework
- Involvement in ongoing decision-making
64
Co-Construction of Clinical Experiences
Partners co-construct the opportunities, challenges, and responsibilities, with the support and guidance of clinical educators and designated faculty. Co-constructed opportunities allow candidates to apply the knowledge, dispositions, and skills developed in general education and professional courses. Candidates should continue learning to adapt to the varied conditions of classrooms in co-constructed opportunities. Application, introduction, participation, culmination, roles/responsibilities, evaluation…
65
Component 2.2 – Key Language
Partners co-select, prepare, evaluate, support, and retain high-quality clinical educators, both provider- and school-based, who demonstrate a positive impact on candidates’ development and P-12 student learning and development. In collaboration with their partners, providers use multiple indicators and appropriate technology-based applications to establish, maintain, and refine criteria for selection, professional development, performance evaluation, continuous improvement, and retention of clinical educators in all clinical placement settings. Reflect on: What evidence do I have that would demonstrate the depth of partnership around highly effective clinical educators?
66
Evidence Sufficiency Criteria, 2.2
Evidence that the EPP and P-12 clinical educators/administrators co-construct criteria for co-selection. Clinical educators:
- Receive professional development, resources, and support
- Are involved in creating professional development opportunities, using evaluation instruments, evaluating candidates' professional dispositions, setting specific goals/objectives of the clinical experience, and providing feedback
Data collected are used by EPPs and P-12 clinical educators to modify selection criteria, future assignments of candidates, and clinical experiences.
67
Clinical Educator Development/Responsibilities
A process of collaboration with partners; further demonstrates partnerships in field experiences:
- Developed: criteria, reflective teaching and learning, mutual engagement, …
- Monitored: facilitate learning and development
- Evaluated: opportunities for partners to…
68
Component 2.3 – Key Language
The provider works with partners to design clinical experiences of sufficient depth, breadth, diversity, coherence, and duration to ensure that candidates demonstrate their developing effectiveness and positive impact on all students’ learning and development. Clinical experiences, including technology-enhanced learning opportunities, are structured to have multiple performance-based assessments at key points within the program to demonstrate candidates’ development of the knowledge, skills, and professional dispositions, as delineated in Standard 1, that are associated with a positive impact on the learning and development of all P-12 students. Reflect on: What evidence do I have that clinical experiences develop candidates’ Knowledge, Skills, and Dispositions to have a positive impact on P-12 learning?
69
Evidence Sufficiency Criteria, 2.3
Evidence that all candidates have clinical experiences in diverse settings:
- Attributes (depth, breadth, diversity, coherence, and duration) are linked to student outcomes and the candidate/completer performance documented in Standards 1 and 4
- Evidence documents a sequence of clinical experiences that are focused, purposeful, and varied, with specific goals
- Clinical experiences include focused teaching experiences where specific strategies are practiced
- Clinical experiences are assessed using performance-based protocols
70
Component A.2.1: Key language
Partners co-construct mutually beneficial P-12 school and community arrangements, including technology-based collaborations, for clinical preparation and share responsibility for continuous improvement of candidate preparation. Partnerships for clinical preparation can follow a range of forms, participants, and functions. They establish mutually agreeable expectations for candidate entry, preparation, and exit; ensure that theory and practice are linked; maintain coherence across clinical and academic components of preparation; and share accountability for candidate outcomes.
71
Evidence Sufficiency Criteria, A.2.1
Partnerships for clinical preparation:
- Illustrates specific benefits to the provider and P-12 partners
- Outlines the collaborative nature of the relationship
- Documents that the effectiveness of the partnership is reviewed at least annually
- Shows that the EPP seeks input from partners to refine criteria for entry to/exit from clinical experiences
- Documents partner participation in development and review activities (e.g., for clinical instruments, the clinical curriculum, the EPP curriculum)
- Phase-in plans meet CAEP guidelines and schedule
- Instruments for evaluating the partnership (if any) meet CAEP's assessment sufficiency criteria
72
SUGGESTED EVIDENCE: PARTNERSHIPS FOR CLINICAL PREPARATION
- Documents illustrating co-construction of a collaborative relationship
- Documents outlining provider and partner responsibilities for examining and improving clinical preparation
- Evidence that assessments and performance standards are mutually acceptable to providers and partners
- Documentation of a shared perspective on appropriate uses of technology for the candidate's future role
73
Component A.2.2: Key language
The provider works with partners to design varied and developmental clinical settings which allow opportunities for candidates to practice applications of content knowledge and skills emphasized by the courses and other experiences of the advanced preparation program. The opportunities lead to appropriate culminating experiences in which candidates demonstrate their proficiencies, through problem-based tasks or research (e.g., qualitative, quantitative, mixed methods, action) that are characteristic of their professional specialization, as detailed in component A.1.1.
74
Evidence Sufficiency Criteria, A.2.2
Clinical experiences:
- Documents that all candidates have practical experiences in workplace settings
- Illustrates that candidates observe and implement appropriate and effective strategies for their fields of specialization
- Documents the attributes of clinical/practical experiences
- Illustrates that they are varied and developmentally progressive
- Illustrates that they relate to coursework
- Demonstrates a relationship between clinical/practical experiences and the candidate outcomes reported in Standard A.1
- Phase-in plans meet CAEP guidelines and schedule
75
SUGGESTED EVIDENCE: CLINICAL EXPERIENCES
- Charts illustrating the breadth, depth, duration, and coherence of the opportunities to practice applying content knowledge and skills to practical challenges in the specialty area
- Evidence mapping the developmental trajectory of specific practical knowledge and skills as candidates progress through courses and the clinical experiences embedded within or external to the courses
- Candidate evaluations of the connection between coursework and fieldwork
76
Step 5. Analyze and Interpret
Analyze and interpret the evidence and assessment results, and the Educator Preparation Provider's program management and operations (e.g., systems, processes, and practices) related to meeting the CAEP standards. Develop the plan for action.
77
In Summary - The Case for Standard 2/A.2
Information is provided from several sources and provides evidence of shared decision-making, collaboration among clinical faculty and schools/districts, and continuous functioning. Data are analyzed. Differences and similarities across licensure/field areas, comparisons over time, and demographic data are examined in relation to clinical experiences, as appropriate. Appropriate interpretations and conclusions are reached. Trends or patterns that suggest a need for preparation modifications are identified. Based on the analysis of data, planned or completed actions for change are described.
78
Tips… Standard 2
79
Clinical (Field) Experiences Guidelines
Evidence that clinical experiences have "…sufficient depth, breadth, diversity, coherence, and duration":
- Description of clinical experience goals and operational design
- Documentation that clinical experiences are implemented as described
- A scope-and-sequence matrix is provided that charts the depth, breadth, and diversity of clinical experiences
- Experiences are deliberate, purposeful, sequential, and assessed using performance-based protocols
80
Clinical Experience Table: Course Sample
81
Clinical Experience Table: Program Sample
82
Clinical Experience Table: Course Sample

| Clinical Internship & Description (Observation and/or Implementation) | Program Fields | Hours | Measures | Schools/Districts |
|---|---|---|---|---|
| EDU 2100: This supervised practicum in elementary settings provides candidates with practical experiences in workplace settings and scenarios to evaluate the connections between coursework and fieldwork | M.Ed., Ed.D. | 45 hours of observation and/or implementation | Dispositional/professional responsibility data; problem-based projects, coursework | Internship must be approved by PDS/D during the semester of application prior to… |
| EDU 2900: This clinical internship in elementary education is designed for candidates to appropriately and effectively apply research-based instructional learning theory/strategies for their fields of specialization in P-12 | | 60 hours of observation and implementation | Problem-based projects, school/district; action research; capstones/portfolios/thesis | |
83
Clinical Experience Table: Program Sample

| Field | Field Experiences & Associated Hours (Observation) | Clinical Internships & Associated Hours (Implementation) | Total Hours |
|---|---|---|---|
| M.Ed., Secondary Mathematics Education | EDUM 552, EDUM 553, EDUM 554, EDUM 555, EDUM 556 (Practicum): 200 hours of observation | EDU-M 699: 500 hours of participation and implementation of coursework and fieldwork | 700 |
| M.Ed., English as a Second Language (TESL) | TESL 500 (Practicum): 250 hours of observation and participation | EDU-TESL 699: 500 hours of participation and implementation of research-based instructional learning strategies | 750 |
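Tables like the program sample above invite a quick arithmetic check: the reported total should equal observation plus implementation hours (200 + 500 = 700; 250 + 500 = 750). A minimal sketch of that validation:

```python
# A minimal sketch validating the hour totals in the program sample above:
# the reported total should equal observation plus implementation hours.

programs = {
    "M.Ed., Secondary Mathematics Education":
        {"observation": 200, "implementation": 500, "reported_total": 700},
    "M.Ed., English as a Second Language (TESL)":
        {"observation": 250, "implementation": 500, "reported_total": 750},
}

for name, hours in programs.items():
    computed = hours["observation"] + hours["implementation"]
    ok = "OK" if computed == hours["reported_total"] else "MISMATCH"
    print(f"{name}: {computed} hours ({ok})")
```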
84
Other Guidelines
Other guidelines for evidence include the following:
- Only the parts of instruments, surveys, handbooks, minutes, meeting notes, or other documents specific to the standard or component should be submitted as evidence.
- Complete handbooks, catalogs, advising guides, and other documents should not be submitted in their entirety. Only the sections specific to a standard(s) or component(s) should be tagged and identified as evidence.
85
Standard 3/A.3 Candidate Quality, Recruitment, and Selectivity
86
CAEP Standard 3/A.3 Candidate Quality, Recruitment, and Selectivity
- Recruitment/Admission of Diverse Candidates who Meet Employment Needs: 3.1 / A.3.1
- Admission Standards Indicate That Candidates Have High Academic Achievement and Ability: 3.2 / A.3.2
- Additional Selectivity Factors (non-academic): 3.3
- Selectivity During Preparation (performance standards): 3.4 / A.3.3
- Selection at Completion (ready, not just finished): A.3.4
87
CAEP Standard 3/A.3 Candidate Quality, Recruitment, and Selectivity
Initial: The provider demonstrates that the quality of candidates is a continuing and purposeful part of its responsibility from recruitment [component 3.1], at admission [component 3.2], through the progression of courses and clinical experiences [components 3.3 and 3.4], and to decisions that completers are prepared to teach effectively and are recommended for certification [components 3.5 and 3.6]. The provider demonstrates that development of candidate quality is the goal of educator preparation in all phases of the program. This process is ultimately determined by a program's meeting of Standard 4.
Advanced: The provider demonstrates that the quality of advanced program candidates [components A.3.1 and A.3.2] is a continuing and purposeful part of its responsibility [component A.3.3] so that completers are prepared to perform effectively and can be recommended for certification where applicable [component A.3.4].
88
Standard 3/A.3’s Holistic Case
Providers continuously and purposefully recruit, admit, monitor, and recommend candidates for licensure from quality educator preparation programs.
…CAEP Standards for Initial/Advanced Programs, Evidence Sufficiency Criteria: the whole is more than merely the sum of its parts.
89
Step 1. Rules for Standard 3/A.3
General for all Standards:
- Key concepts in the standard and components are addressed
- EPP-created assessments are at the CAEP level of sufficiency
- At least 3 cycles of data
- Cycles of data are sequential and the latest available
- Disaggregated data on candidates, for main/branch campuses (if applicable)

Special for Standard 3/A.3:
- Components 3.2/A.3.2 are required in order to meet Standard 3
- Emphasis on evidence; must demonstrate the component is met
- Evidence must be provided for component 3.2 for the standard to be met
90
Step 2. Inventory Evidence toward…
- Candidate performance
- Completer performance
- Other CAEP requirements
91
Step 3. Gather Information, Categorize, and Prepare
Gather evidence toward...
- EPP overview
- Standards, components
- Cross-Cutting Themes
- Areas For Improvement
92
Step 4. Take Stock with Stakeholders: Faculty, Clinical Faculty, P-12 Districts and/or Schools, Candidates
Review and seek feedback on what was learned from steps 1–3:
- Evidence for standards
- Evidence criteria
- Evidence quality
93
Component 3.1 – Key Language
The provider presents plans and goals to recruit and support completion of high-quality candidates from a broad range of backgrounds and diverse populations to accomplish their mission. The admitted pool of candidates reflects the diversity of America's P-12 students. The provider demonstrates efforts to know and address community, state, national, regional, or local needs for hard-to-staff schools and shortage fields, currently STEM, English-language learning, and students with disabilities. Reflect on: What recruitment evidence (plans and goals) do I have that demonstrates attracting diverse candidates to meet identified needs?
94
Evidence Sufficiency Criteria, 3.1
Plan/goals to recruit/support high-quality candidates:
- A recruitment plan with base points and goals, including academic ability, diversity, and employment needs
- Data on applicants, admitted, and enrolled candidates are disaggregated by relevant demographics
- Evidence that results are recorded, monitored, and used in planning and modifying recruitment strategies
- The plan demonstrates knowledge of, and addresses, employment opportunities in schools, districts, and/or regions
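Monitoring recruitment results against base points and goals reduces to a small funnel report. A minimal sketch, with invented counts, demographic groups, and goals standing in for an EPP's actual recruitment plan data:

```python
# A minimal sketch of a recruitment-funnel report: applicant, admitted,
# and enrolled counts per demographic group, compared with enrollment
# goals from the recruitment plan. All names and numbers are invented.

funnel = {
    # group: (applicants, admitted, enrolled)
    "Group A": (120, 80, 60),
    "Group B": (40, 22, 15),
}
enrollment_goals = {"Group A": 55, "Group B": 20}

for group, (applied, admitted, enrolled) in funnel.items():
    yield_rate = enrolled / admitted
    goal = enrollment_goals[group]
    status = "met" if enrolled >= goal else "below goal"
    print(f"{group}: {applied} applied, {admitted} admitted, "
          f"{enrolled} enrolled (yield {yield_rate:.0%}); goal {goal}: {status}")
```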
95
Recruitment Plan: Common elements…
Introduction and Planning
- Organization, college, department, etc.
- Background of the college/department
- College/department self-assessment
Recruitment of Candidates
- Develop the EPP's "message"
- Develop "how to" recruit
- Develop, schedule, and conduct orientations
Retention of Candidates
- Assign support/supervisor
- Provide learning opportunities in foundations, methods, and clinical experiences
- Evaluate content and pedagogical development
- Provide academic/non-academic resources
Transition of Candidates to Completers
- Communicate with completers regularly via surveys, polls, questionnaires, census
- Recognize professional support, supervisor(s), and resources
Managing and Evaluating
- Design the evaluation
- Collect, organize, and analyze data
- Report results, conclusions reached, and recommendations
- Resources
96
Component 3.2 – Key Language
The provider meets CAEP minimum criteria or the state’s minimum criteria for academic achievement, whichever are higher, and presents disaggregated data on the enrolled candidates whose preparation begins during an academic year.
97
Evidence Sufficiency Criteria, 3.2
Candidates demonstrate academic achievement:
- Average scores for the group of candidates admitted during an academic year meet the CAEP minimum GPA of 3.0, AND performance on nationally-normed, substantially equivalent state-normed, or EPP-administered assessments is in the top 50% of all test takers of the selected assessment
- Assessments examine candidate performance in mathematics and reading achievement; beginning in 2021, also in writing achievement

Group average: The GPA and standardized test scores are averaged for all members of a cohort or class of admitted candidates. Averaging does not require that every candidate meet the specified score; thus, there may be a range of candidates' grades and scores on standardized tests.
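Because the 3.0 GPA and top-50% rules apply to the group average rather than to each candidate, the check is a cohort-level computation. A minimal sketch with an invented five-candidate cohort, one of whom falls below 3.0 individually while the group still meets the criteria:

```python
# A minimal sketch of the component 3.2 group-average check. The invented
# cohort includes one candidate below a 3.0 GPA and one below the 50th
# percentile, yet the group averages still meet the CAEP minima.

from statistics import mean

cohort = [
    # (gpa, test_percentile) per admitted candidate
    (3.4, 72), (2.7, 55), (3.6, 81), (3.1, 48), (3.3, 66),
]

avg_gpa = mean(gpa for gpa, _ in cohort)
avg_pct = mean(pct for _, pct in cohort)

print(f"Cohort average GPA: {avg_gpa:.2f} "
      f"({'meets' if avg_gpa >= 3.0 else 'below'} the 3.0 minimum)")
print(f"Cohort average percentile: {avg_pct:.1f} "
      f"({'in' if avg_pct >= 50 else 'below'} the top 50%)")
```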
98
Academic Selection Samples
| Assessment | Test or Section | 3.2 Domain (proficiency must be met for each domain) | Group average performance requirement (cohorts whose preparation began during the academic year or earlier) |
|---|---|---|---|
| ACT | "Reading" | Reading | 21.25 |
| ACT | "Math" | Math | |
| ACT | "Writing" | Writing | 6.60 |
| SAT | "Evidence-Based Reading and Writing" | Reading, Writing | 543.33 / 532.50 |
| SAT | "Essay – Writing dimension" | Writing | 5.30 |
| Praxis Core | | | 168.06 |
| Praxis Core | "Mathematics" | Math | 162.14 / 165 |
| OGET | "Oklahoma General Education Test (OGET)" | Reading, Math, and Writing | 258** |

CAEP minimum criteria (Initial Standards), applied 1) at admissions or 2) prior to program completion: a 3.0 GPA, and performance on a nationally-normed, substantially equivalent state-normed, or EPP-administered assessment in the top 50% of all test takers of the selected assessment.
99
Component 3.3 – Key Language
Educator preparation providers establish and monitor attributes and dispositions beyond academic ability that candidates must demonstrate at admissions and during the program. The provider selects criteria, describes the measures used and evidence of the reliability and validity of those measures, and reports data that show how the academic and non-academic factors predict candidate performance in the program and effective teaching. Reflect on: What data can I present to demonstrate the other things (besides GPA and test scores) we look for at admissions that result in selecting high-quality candidates?
100
Evidence Sufficiency Criteria, 3.3
Provider establishes/monitors candidate attributes/dispositions beyond academics:
- A rationale for the established non-academic criteria
- An evidence-based case for their selection and implementation
- Evidence that the EPP monitors candidate progress on the established non-academic criteria at multiple points, and takes appropriate actions based on the results
- Evidence of the association/correlation of non-academic criteria with candidate and completer performance
101
Non-Academic Samples: Admission to Teacher Education; Admission to Clinical Experience
102
Component 3.4 – Key Language
The provider creates criteria for program progression and monitors candidates' advancement from admissions through completion. All candidates demonstrate the ability to teach to college- and career-ready standards. Providers present multiple forms of evidence to indicate candidates' developing content knowledge, pedagogical content knowledge, pedagogical skills, and the integration of technology in all of these domains. Reflect on: What data can I present to demonstrate that my EPP continues to be selective of candidates throughout our programs?
103
Evidence Sufficiency Criteria, 3.4
Provider criteria for program progression/monitoring of candidates:
- Evidence of candidates' developing proficiencies at 2 or more progression gateways:
  - Ability to teach to college- and career-ready standards
  - Pedagogical/content knowledge
  - Integration and use of technology
- Results and the stated candidate progression criteria align with evidence of actions taken, such as:
  - Changes in curriculum or clinical experiences
  - Providing interventions / counseling out
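A minimal sketch of gateway monitoring consistent with the criteria above: each candidate's score at each progression gateway is compared with a minimum, and shortfalls are flagged for intervention or counseling out. The gateway names, minimum scores, and candidate records are all hypothetical:

```python
# A minimal sketch of gateway monitoring: compare each candidate's score
# at each progression gateway with a minimum and flag shortfalls. The
# gateways, minimum scores, and candidate records are hypothetical.

gateways = {"Gateway 1": 2.5, "Gateway 2": 3.0}  # minimum rubric score

candidates = {
    "Candidate 01": {"Gateway 1": 2.8, "Gateway 2": 3.2},
    "Candidate 02": {"Gateway 1": 2.4, "Gateway 2": None},  # not yet reached
}

for name, scores in candidates.items():
    for gateway, minimum in gateways.items():
        score = scores.get(gateway)
        if score is None:
            continue  # gateway not yet attempted
        action = "on track" if score >= minimum else "intervention/counsel out"
        print(f"{name}, {gateway}: {score} (min {minimum}) -> {action}")
```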
104
Monitoring Table of Candidates
105
Component 3.5 – Key Language
Before the provider recommends any completing candidate for licensure or certification, it documents that the candidate has reached a high standard for content knowledge in the fields where certification is sought and can teach effectively with positive impacts on P-12 student learning and development. Reflect on: What data can I present to demonstrate that exit criteria are rigorous?
106
Evidence Sufficiency Criteria, 3.5
Provider demonstrates that candidates have content knowledge in the certification field:
- Evidence is the same as that for 1.1
- Evidence of effective teaching, including positive impacts on P-12 student learning and development, for all candidates, as noted in Standard 1
107
EPP-Created Assessments
EPP-Created Assessments, Initial Standards (Standard 1, Component 1.1):
- Clinical experience/observation instruments
- Lesson/unit plans
- Portfolios
- Teacher work samples
- GPA, specific courses
- P-12 learner dispositional data
- Comparisons of education and other IHE attendees on provider end-of-major projects
- End-of-course/program assessments
- Pre-service measures of candidate impact
- Capstone/thesis
Plus: proprietary assessments/measures; state assessments/measures.
Resource: CAEP Evaluation Framework for EPP-Created Assessments
108
Component 3.6 – Key Language
Before the provider recommends any completing candidate for licensure or certification, it documents that the candidate understands the expectations of the profession, including codes of ethics, professional standards of practice, and relevant laws and policies. CAEP monitors the development of measures that assess candidates' success and revises standards in light of new results. Reflect on: What data can I present to document that our candidates understand the professional dos and don'ts of teaching?
109
Evidence Sufficiency Criteria, 3.6
Provider demonstrates that candidates understand the expectations of the profession:
- Candidates' understanding of codes of ethics and professional standards of practice
- Evidence that candidates have knowledge of relevant laws and policies (Section 504 disability provisions, education regulations, bullying, etc.)
110
Component A.3.1 – Key Language
The provider sets goals and monitors progress for admission and support of high-quality advanced program candidates from a broad range of backgrounds and diverse populations to accomplish their mission. The admitted pool of candidates reflects the diversity of America’s teacher pool and, over time, should reflect the diversity of P-12 students. The provider demonstrates efforts to know and address community, state, national, regional, or local needs for school and district staff prepared in advanced fields. Reflect on: What evidence do you have that would demonstrate that there is an admission plan that is sensitive to candidate diversity, academic ability, and the employment landscape?
111
SUGGESTED EVIDENCE: ADMISSION OF DIVERSE CANDIDATES WHO MEET EMPLOYMENT NEEDS
- Proof that the EPP periodically examines the employment landscape in the community, state, regional, or national market for which it is preparing completers (e.g., shortage areas, job openings, job forecasts, and related information)
- An admission plan showing that labor market information is considered during goal setting
- Documentation from admission reviews showing that the EPP monitors annual progress toward admission goals (e.g., for high-need specialty areas, locality, gender, ethnicity, academic ability)
- Hiring and/or retention rates that show the majority of completers fulfill an employment need in a P-12 setting
112
Component A.3.2 – Key Language
Required Component: The provider sets admissions requirements for academic achievement, including CAEP minimum criteria, the state’s minimum criteria, or graduate school minimum criteria, whichever is highest, and gathers data to monitor candidates from admission to completion. The provider determines additional criteria intended to ensure that candidates have, or develop, abilities to complete the program successfully and arranges appropriate support and counseling for candidates whose progress falls behind.
113
Suggested evidence: candidates demonstrate academic achievement and ability to complete preparation successfully. Documentation on:
- Admission criteria for GPA and the results of GPA analysis
- Admission criteria for normed tests and the results of rank analyses
- EPP-created criteria for interviews or other admission procedures, together with results
- Performance on qualifying exams
- Assessments of writing ability
- Documentation illustrating that the EPP sets goals for candidate support and monitors progress toward those goals (e.g., provisions for targeted assistance, remediation, etc.)

Providers set admission requirements that include the CAEP minimum criteria described in component A.3.2, but also their own criteria "intended to ensure that candidates have, or develop, abilities to complete the program successfully." Evidence for components A.3.1 and A.3.2 might also include documentation from performance reviews, remediation efforts, and/or provisions illustrating that the EPP sets goals for candidate support and monitors progress toward providing sufficient support to facilitate successful program completion.
114
Academic Selection Samples
CAEP minimum criteria (Advanced Standards), applied 1) at admissions or 2) prior to program completion: a 3.0 GPA, or performance on a nationally-normed, substantially equivalent state-normed, or EPP-administered assessment in the top 50% of all test takers of the selected assessment. Also: licensure examinations; additional measures utilized toward compliance with other accreditors (e.g., for reporting requirements: WASC, NASC, HLC (aka NCA), SACS, MSA, NEASC); other.

| Assessment | Test or Section | 3.2 Domain (proficiency must be met for each domain) | Group average performance requirement (cohorts whose preparation began during the academic year or earlier) |
|---|---|---|---|
| GRE | "Verbal Reasoning" | Reading | 150.75** |
| GRE | "Quantitative Reasoning" | Math | 152.75** |
| GRE | "Analytical Writing" | Writing | 3.74** |
115
Component A.3.3 – Key Language
Before the provider recommends any advanced program candidate for completion, it documents that the candidate has reached a high standard for content knowledge in the field of specialization, data literacy and research-driven decision making, effective use of collaborative skills, applications of technology, and applications of dispositions, laws, codes of ethics, and professional standards appropriate for the field of specialization. Reflect on: What data can I present to document that candidates (beyond GPA and test scores) have developed and progressed through the program?
116
SUGGESTED EVIDENCE: SELECTIVITY DURING PREPARATION
- Evidence of candidates' developing proficiencies at 2 or more progression gateways: proficiencies to understand and apply knowledge and skills appropriate to the program's field of specialization (see the generic skills in component A.1.1)
- Results and the stated candidate progression criteria align with evidence of actions taken, such as changes in curriculum or clinical experiences and providing interventions / counseling out
- Assessments used at key points during the program (e.g., phases/stages, checkpoints)
- Content knowledge and dispositions assessments; these could be administered serially (in any order) or in parallel
- Demonstration of evolving technology integration into practice; this could be assessed repeatedly with the same tasks and criteria for competence, or with different tasks or criteria at different points in the program
117
EPP-Created Assessments
EPP-Created Assessments, Advanced Standards:
- Action research
- Capstones/portfolios/thesis
- Dispositional/professional responsibility data
- Problem-based projects with coursework/group projects
- Problem-based projects with a school/district
- Pre- and post-data and reflections on interpretations and use of data
- End-of-key-course tests
- Grades, by program field
- Survey data from completers/employers
Plus: state assessments/surveys; other proficiency measures.
Resource: CAEP Evaluation Framework for EPP-Created Assessments
118
Component A.3.4 – Key Language
The provider creates criteria for program progression and uses disaggregated data to monitor candidates' advancement from admissions through completion. Reflect on: What data can I present to demonstrate the proficiencies of completing candidates at exit?
119
Evidence Sufficiency Criteria, A.3.4
Selection at completion:
- A checklist of completion requirements that includes performance metrics and candidates' results (e.g., graduation requirements, licensure requirements, specific skills, types of authentic problem-based experiences)
- Curriculum and state measures of topic knowledge on special education laws, codes of ethics, and professional standards
- EPP-created dispositional/ethics assessments
120
Monitoring Table of Candidates
121
Step 5. Analyze and Interpret
Analyze and interpret the evidence and assessment results, and the Educator Preparation Provider's program management and operations (e.g., systems, processes, and practices) related to meeting the CAEP standards. Develop the plan for action.
122
In Summary - The Case for Standard 3/A.3
Information is provided from several sources and provides evidence of shared decision-making, collaboration among clinical faculty, and continuous functioning. Data are analyzed. Differences and similarities across licensure/field areas, comparisons over time, and demographic data are examined in relation to components 3.1 and 3.2 (recruitment and admissions), as appropriate. Appropriate interpretations and conclusions are reached. Trends or patterns that suggest a need for preparation modifications are identified. Based on the analysis of data, planned or completed actions for change are described.

The guiding questions may help focus the selection of evidence and the EPP's inquiry into its message:
- Strengths and weaknesses: What strengths and areas of challenge have you discovered as you analyzed and compared the results of your disaggregated data on candidate quality, recruitment/admissions, and quality monitoring by program and by demographics? What questions have emerged that need more investigation? How are you using this information for continuous improvement?
- Trends: What trends in candidate quality, recruitment and admissions practices, and monitoring of candidate progress have emerged as you compared program and demographic data across evidence sources and programs? What questions have emerged that need more investigation? How are you using this information for continuous improvement?
- Implications: What implications can you draw, or conclusions can you reach, across evidence sources about candidate quality, recruitment/admissions, and quality monitoring? What questions have emerged that need more investigation? How have data-driven decisions on changes been incorporated into preparation?
123
Tips… Standard 3
124
Initial Level Standards
Proprietary assessments/measures:

| Assessment | Test or Section | 3.2 Domain (proficiency must be met for each domain) | Group average performance requirement (cohorts whose preparation began during the academic year or earlier) |
|---|---|---|---|
| ACT | "Reading" | Reading | 21.25 |
| ACT | "Math" | Math | |
| ACT | "Writing" | Writing | 6.60 |
| SAT | "Evidence-Based Reading and Writing" | Reading, Writing | 543.33 / 532.50 |
| SAT | "Essay – Writing dimension" | Writing | 5.30 |
| Praxis Core | | | 168.06 |
| Praxis Core | "Mathematics" | Math | 162.14 / 165 |
| OGET | "Oklahoma General Education Test (OGET)" | Reading, Math, and Writing | 258** |

State assessments/measures:
- Relevant surveys
- Assessments (value-added) of completers
- Licensure examinations
- Additional measures utilized toward compliance with other accreditors (e.g., for reporting requirements: WASC, NASC, HLC (aka NCA), SACS, MSA, NEASC)
125
Advanced Level Standards
State assessments/surveys:
- Relevant surveys
- Assessments (value-added) of completers
- Licensure examinations
- Additional measures utilized toward compliance with other accreditors (e.g., for reporting requirements: WASC, NASC, HLC (aka NCA), SACS, MSA, NEASC)

Other proficiency measures:

| Assessment | Test or Section | 3.2 Domain (proficiency must be met for each domain) | Group average performance requirement (cohorts whose preparation began during the academic year or earlier) |
|---|---|---|---|
| GRE | "Verbal Reasoning" | Reading | 150.75** |
| GRE | "Quantitative Reasoning" | Math | 152.75** |
| GRE | "Analytical Writing" | Writing | 3.74** |
126
Standard 4/A.4 Program Impact
127
CAEP Standard 4/A.4 Program Impact
Impact on P-12 Student Learning and Development: 4.1
Indicators of Teaching Effectiveness: 4.2
Satisfaction of Employers: 4.3 / A.4.1
Satisfaction of Completers: 4.4 / A.4.2
128
CAEP Standard 4/A.4 Program Impact
Initial: The provider demonstrates the impact of its completers on P-12 student learning and development [component 4.1], classroom instruction [component 4.2], and schools [component 4.3], and the satisfaction of its completers [component 4.4] with the relevance and effectiveness of their preparation.

Advanced: The provider documents the satisfaction of its completers from advanced preparation programs [component A.4.2] and their employers [component A.4.1] with the relevance and effectiveness of their preparation.
129
Standard 4’s Holistic Case
The provider establishes the outcomes of preparation, indicating that completers from licensure programs/fields are impacting P-12 student learning and development. Per the CAEP Standards for Initial/Advanced Programs and the Evidence Sufficiency Criteria, the case shows that the whole is more than merely the sum of its parts.
130
Step 1. Rules for Standard 4/A.4
General (for all standards):
- Key concepts in the standard and components are addressed
- EPP-created assessments meet the CAEP level of sufficiency
- At least 3 cycles of data
- Cycles of data are sequential
- Data are disaggregated on candidates and for main/branch campuses (if applicable)

Specific to Standard 4:
- All components of Standard 4 are required
- Emphasis on evidence; evidence must be provided for all components, and must demonstrate that each component is met, for the standard to be met
- All phase-in requirements are met
131
Step 2. Inventory Evidence toward… Candidate performance
Completer performance Other CAEP requirements
132
Step 3. Information, Categorize, and Prepare
Gather evidence toward... EPP overview Standards, components Cross-Cutting Themes Areas For Improvement
133
Step 4. Take Stock with Stakeholders… Faculty; Clinical Faculty; P-12 Districts and/or Schools; Candidates
- Evidence for standards
- Evidence criteria
- Evidence quality
Review and seek feedback on what was learned from steps 1-3.
134
Component 4.1 – Key Language
REQUIRED COMPONENT: The provider documents, using multiple measures, that program completers contribute to an expected level of student-learning growth. Multiple measures shall include all available growth measures (including value-added measures, student-growth percentiles, and student learning and development objectives) required by the state for its teachers and available to educator preparation providers, other state-supported P-12 impact measures, and any other measures employed by the provider.

Reflect on: What evidence do you have that would demonstrate graduates' impact on P-12 student learning? What research methodologies could you feasibly employ to gain such information?
135
EVIDENCE SUFFICIENCY CRITERIA, 4.1
SUFFICIENT EVIDENCE
- Presents multiple measures showing positive impact on student learning: one or more state-provided measures, or two or more EPP-generated measures
- Draws from a representative or purposive sample of graduates 1-3 years post-exit
- EPP-generated data utilize a research-based methodology (e.g., case study, action research)
- Describes the measures and context
- Describes the representativeness of the sample/data
- Analyzes data and interprets results appropriately
- Conclusions are supported by results
136
SUGGESTED EVIDENCE: IMPACT ON LEARNING
Direct measures of student learning and development, addressing diverse subjects and grades:
- P-12 impact or growth data from state teacher evaluations (if available)
- If state data are not available: teacher-linked student assessments from districts; classroom-based research (e.g., action research, case studies)
137
Component 4.2 – Key Language
REQUIRED COMPONENT: The provider demonstrates, through structured and validated observation instruments and/or student surveys, that completers effectively apply the professional knowledge, skills, and dispositions that the preparation experiences were designed to achieve. Reflect on: What evidence do I have (beyond measures of P-12 student learning) that would demonstrate in-service graduates are effective teachers?
138
EVIDENCE SUFFICIENCY CRITERIA, 4.2
SUFFICIENT EVIDENCE
- Measures classroom-based demonstration of professional knowledge, skills, and dispositions (e.g., InTASC, state/district teacher performance standards)
- Uses structured and validated teaching observation tools and/or P-12 student surveys
- Uses a representative sample that covers most licensure areas
- Obtains survey return rates of 20% or higher (a minimal return-rate check is sketched below)
- Analyzes data and interprets results appropriately
- Conclusions are supported by results
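The 20% threshold is a simple proportion of usable responses to invitations. A minimal sketch of the check, with invented counts:

```python
# Invented counts for illustration.
invited = 240    # completers (or P-12 students) invited to respond
returned = 61    # usable survey responses received

response_rate = returned / invited
print(f"Response rate: {response_rate:.1%}")
print("Meets the 20% criterion" if response_rate >= 0.20 else "Below the 20% criterion")
```

The same check applies to the employer and completer surveys under components 4.3, 4.4, A.4.1, and A.4.2.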
139
SUGGESTED EVIDENCE: CLASSROOM INSTRUCTION
Teaching Observations
- Aligned to the 4 InTASC categories
- Aligned to state standards for teachers / local teacher evaluation framework
P-12 Student Surveys
- Aligned to the InTASC categories
- Corroboration for observation/evaluation data
Employer Surveys
- Aligned to the InTASC categories
The 4 InTASC categories addressed in Standard 1 are: Learner and Learning, Content, Instructional Practice, and Professional Responsibility.
140
Component 4.3: Key Language
REQUIRED COMPONENT: The provider demonstrates, using measures that result in valid and reliable data and including employment milestones such as promotion and retention, that employers are satisfied with the completers' preparation for their assigned responsibilities in working with P-12 students.

Reflect on: What evidence do we have that would demonstrate that employers are satisfied with the professional knowledge, skills, and dispositions of our program graduates who are working at their location?
141
EVIDENCE SUFFICIENCY CRITERIA, 4.3
SUFFICIENT EVIDENCE
- Shows that employers perceive completers' preparation was sufficient for their job responsibilities and attainment of employment milestones (e.g., retention)
- Uses valid and reliable measures
- Obtains response rates of 20% or higher
- Describes the representativeness of the sample/data for licensure areas
- Discusses satisfaction patterns with respect to employment contexts (e.g., shortage fields, hard-to-staff schools, schooling level, school demographics)
- Data analysis is appropriate and conclusions are supported by data
142
SUGGESTED EVIDENCE: SATISFACTION
Completer Surveys
- Aligned to the InTASC categories
- Aligned to state standards for teachers / local teacher evaluation framework
- Can triangulate with observation/evaluation, survey, and impact data
Employer Surveys
- Corroboration for observation/evaluation data
143
Component 4.4: Key language
REQUIRED COMPONENT: The provider demonstrates, using measures that result in valid and reliable data, that program completers perceive their preparation as relevant to the responsibilities they confront on the job, and that the preparation was effective.

Reflect on: What evidence do we have that would demonstrate our program graduates are satisfied with how well the program prepared them for their job?
144
EVIDENCE SUFFICIENCY CRITERIA, 4.4
SUFFICIENT EVIDENCE
- Shows that completers perceive their preparation was sufficient for their job responsibilities and was effective
- Uses valid and reliable measures
- Obtains response rates of 20% or higher
- Describes the representativeness of the sample/data for licensure areas
- Discusses satisfaction patterns with respect to employment contexts (e.g., shortage fields, hard-to-staff schools, schooling level, school demographics)
- Data analysis is appropriate and conclusions are supported by data
145
SUGGESTED EVIDENCE: SATISFACTION
Completer Surveys
- Aligned to the InTASC categories
- Aligned to state standards for teachers / local teacher evaluation framework
- Can triangulate with observation/evaluation, survey, and impact data
Employer Surveys
- Corroboration for observation/evaluation data
146
COMPONENT A.4.1: KEY LANGUAGE
The provider demonstrates that employers are satisfied with completers' preparation and that completers reach employment milestones such as promotion and retention.

Reflect on: What evidence do you have that would demonstrate that employers are satisfied with the preparation of program completers, that completers fulfill employment needs, and that they perform effectively enough to be retained or promoted?
147
EVIDENCE SUFFICIENCY CRITERIA, A.4.1
SUFFICIENT EVIDENCE
- Shows that a majority of responding employers report that completers were sufficiently prepared for their job responsibilities
- Provides information on employment setting (e.g., locality, public/private, shortage field)
- The sample is representative of the completer population, or purposive with a plan for expansion toward representativeness over time
- Response rate is 20% or higher for those invited
- Uses valid and reliable measures
- Analyzes data appropriately for the data type and quantity
- Examines the results for trends/patterns and differences between specialty areas and/or over time
- Does not over-generalize interpretations and conclusions to non-sampled groups of employers or completers
- Includes appropriate analysis and interpretation of results
- Describes a system for analysis, evaluation, and interpretation of data
- Conclusions are supported by data
- Provides documentation of employment milestones
148
Component A.4.2 – Key Language
REQUIRED COMPONENT: The provider demonstrates that advanced program completers perceive their preparation as relevant to the responsibilities they confront on the job, and that the preparation was effective.

Reflect on: What evidence do you have that would demonstrate that completers are satisfied with their preparation, that completers fulfill employment needs, and that they perform effectively enough to be retained or promoted?
149
EVIDENCE SUFFICIENCY CRITERIA, A.4.2
SATISFACTION OF COMPLETERS
- Results show that the majority of responding completers report that they were sufficiently prepared for their job responsibilities
- The sample is adequate and representative of the completer population, or purposive with a plan for expansion toward representativeness over time
- Response rate is 20% or higher for those invited
- Uses valid and reliable measures
- Provides analysis and interpretation of data aligned with the intent of the standard/component
- Examines the results for trends/patterns and differences between specialty areas and/or over time
- Does not over-generalize interpretations and conclusions to non-sampled groups of employers or completers
- Describes a system for analysis, evaluation, and interpretation of data
- Conclusions are supported by data
150
Step 5. Analyze and Interpret
The evidence and assessment results… including how the Educator Preparation Provider's program management and operations (e.g., systems, processes, and practices) relate to meeting the CAEP standards. Develop the plan for action.
151
In Summary - The Case for Standard 4/A.4
Information is provided from several sources and provides evidence of program impact through its in-service graduates. Data are analyzed for completer effectiveness, completer satisfaction, and employer satisfaction. Differences and similarities across licensure/field areas and demographic categories are examined. Appropriate interpretations and conclusions are reached. Trends or patterns are identified that suggest a need for preparation modification or for "staying the course". Based on the analysis of data, planned or completed actions for change are described.

The guiding questions may help focus the selection of evidence and the EPP's inquiry into its message:

STRENGTHS AND WEAKNESSES: What strengths and areas of challenge have you discovered about the impact of completers who are employed in the education professional positions for which they were prepared, as you analyzed and compared the results of your disaggregated data by program and by demographics? What questions have emerged that need more investigation? How are you using this information for continuous improvement?

TRENDS: What trends have emerged about completer performance and completer/employer satisfaction with preparation as you compared program and demographic data across evidence sources and programs? What questions have emerged that need more investigation? How are you using this information for continuous improvement?

IMPLICATIONS: What implications can you draw, or conclusions can you reach, across evidence sources about completer performance and completer/employer satisfaction with preparation? What questions have emerged that need more investigation? How are you using this information for continuous improvement? How have data-driven decisions on changes been incorporated into preparation?
152
tips… Standard 4
153
Indicators of Teaching Effectiveness
Employer/Completer satisfaction data: surveys, polls, questionnaires, censuses, case studies, focus groups, etc. Descriptive, narrative data from these approaches would be analyzed using a research-based method of qualitative analysis.
154
Qualitative Analysis Method – Case Study
- Focus: organization, entity, individual, or event
- Sample size: ---
- Data collection: interviews, documents, reports, observations

The value of the case study lies in explaining an organization, entity, company, or event. A case study builds a deep understanding through multiple types of data sources. Case studies can be exploratory, explanatory, or descriptive of an event.
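Once case-study material has been coded, even a simple tally across data sources can surface patterns worth reporting. The sketch below is illustrative only: the source names and theme codes are invented, and a real qualitative analysis would follow a documented, research-based coding method (e.g., thematic analysis).

```python
from collections import Counter

# Illustrative researcher-assigned codes from case-study data sources;
# a real analysis would follow a documented, research-based coding method.
coded_excerpts = [
    ("interview_1", "classroom_management"),
    ("interview_1", "assessment_literacy"),
    ("interview_2", "classroom_management"),
    ("district_report", "assessment_literacy"),
    ("observation_1", "technology_integration"),
]

theme_counts = Counter(code for _, code in coded_excerpts)
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count}")
```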
155
Standard 5/A.5 Provider Quality Assurance and Continuous Improvement
156
CAEP Standard 5/A.5 Provider Quality Assurance and Continuous Improvement
Quality and Strategic Evaluation: A.5.1-A.5.2
Continuous Improvement: A.5.3-A.5.5
157
CAEP Standard 5/A.5 Provider Quality Assurance and Continuous Improvement
The provider maintains a quality assurance system [component 5.1/A] comprised of valid data from multiple measures [component 5.2/A and outcome measures in 5.4/A], including evidence of candidates' and completers' positive impact on P-12 student learning and development [NOTE: This is a cross reference to preservice impact on P-12 student learning from component 3.5 and to in-service impact from Standard 4].

The provider supports continuous improvement that is sustained and evidence-based, and that evaluates the effectiveness of its completers [component 5.3/A and the evidence for Standard 4].

The provider uses the results of inquiry and data collection to establish priorities, enhance program elements and capacity, and test innovations to improve completers' impact on P-12 student learning and development [component 5.3/A].
158
Quality Assurance System (QAS) Indicators
Meeting Standard 5, particularly component 5.1, involves providing evidence of a functioning QAS: a set of indicators covering EPP program management and operations related to meeting the CAEP standards. The indicators refer to systems, processes, and practices that support meeting the current Evidence Sufficiency Criteria for the CAEP standards. MULTIPLE MEASURES USED TO INFORM, MODIFY, AND EVALUATE THE EPP.
159
Step 1. Rules for Standard 5/A.5
General (for all standards):
- Key concepts in the standard and components are addressed
- EPP-created assessments meet the CAEP level of sufficiency
- At least 3 cycles of data
- Cycles of data are sequential
- Data are disaggregated on candidates and for main/branch campuses (if applicable)

Specific to Standard 5/A.5:
- Components 5.3 and 5.4 are required
- Emphasis on evidence; evidence must be provided for components 5.3 and 5.4, and must demonstrate they are met, for the standard to be met
- All phase-in requirements are met
160
Step 2. Inventory Evidence toward… Candidate performance
Completer performance Other CAEP requirements
161
Step 3. Information, Categorize, and Prepare
Gather evidence toward... EPP overview Standards, components Cross-Cutting Themes Areas For Improvement
162
Step 4. Take Stock with Stakeholders… Faculty; Clinical Faculty; P-12 Districts and/or Schools; Candidates
- Evidence for standards
- Evidence criteria
- Evidence quality
Review and seek feedback on what was learned from steps 1-3.
163
Component 5.1/A.5.1: Key Language
The provider's quality assurance system is comprised of multiple measures that can monitor candidate progress, completer achievements, and provider operational effectiveness. Evidence demonstrates that the provider satisfies all CAEP standards.

Reflect on: What evidence do I have that would demonstrate a comprehensive Quality Assurance System (QAS)?
Reflect on: How do I know that the assessment system is adequate?
Reflect on: How do I know that the programs' structure, content, policies, and practices support achievement of CAEP standards?
164
Evidence Sufficiency Criteria, 5.1/A.5.1
MULTIPLE MEASURES USED TO INFORM, MODIFY, AND EVALUATE THE EPP
- Quality Assurance System (QAS): Evidence that the assessment system is designed and managed to collect information relevant to Standards 1, 3, and 4 on candidate progress and completer achievements. Evidence that the quality of partnerships is measured and monitored with respect to all components of Standard 2.
- Multiple measures: The QAS is designed, and functions, to collect a coherent set of information that balances the strengths and weaknesses of individual measures, as described in component 5.2 on evidence quality.
- Operational effectiveness: Evidence that data, feedback, etc. relevant to all CAEP standards are reviewed at least annually for completeness, accuracy, and implications.
165
Quality Assurance System (QAS) Indicators
Standard 1
- There is a functioning process in place for developing and revising assessments of candidate knowledge, skills, and dispositions.
- The candidate knowledge, skills, and dispositions that are assessed align with state and national or association standards for educators.
- There is a functioning data/record management system in place for recording, storing, and retrieving data on candidate knowledge, skills, and dispositions (a minimal record-store sketch follows this list).
- There is a system in place to collect, store, and review data on candidates' practical application of professional knowledge and skills in field settings.
- There is a functioning process in place for regularly reviewing and monitoring candidate performance.
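In practice, the record management indicator above implies some structured store of per-candidate assessment results. The sketch below is an illustrative, minimal data model; the field names are assumptions for illustration, not a CAEP-prescribed schema, and a production QAS would typically use a database rather than an in-memory list.

```python
from dataclasses import dataclass
from datetime import date

# An illustrative, minimal data model; the fields are assumptions,
# not a CAEP-prescribed schema.
@dataclass
class AssessmentRecord:
    candidate_id: str
    program: str
    instrument: str          # e.g., dispositions rubric, clinical observation
    standard_alignment: str  # e.g., "InTASC 3" or a state standard code
    score: float
    administered: date

records: list[AssessmentRecord] = []

def candidate_history(candidate_id: str) -> list[AssessmentRecord]:
    """Retrieve one candidate's records in date order for progress monitoring."""
    return sorted(
        (r for r in records if r.candidate_id == candidate_id),
        key=lambda r: r.administered,
    )

records.append(AssessmentRecord("C001", "Elementary Education",
                                "dispositions rubric", "InTASC 9",
                                3.5, date(2019, 10, 1)))
print(candidate_history("C001"))
```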
166
Quality Assurance System (QAS) Indicators
Standard 2
- There is a functioning mechanism in place whereby the EPP and clinical sites collaborate to determine the terms, structure, and content of field experiences hosted at the partner site.
- EPPs and their partners collaborate on candidate evaluation tools and processes.
- EPPs and clinical partners regularly discuss the terms, structure, and content of field experiences hosted at the partner site.
- Clinical partners have a mechanism for providing feedback to the EPP on patterns in candidate strengths and needs and for providing input on potential program enhancements.
- There is a functioning mechanism to ensure that clinical placements occur in diverse settings. Note: diversity is not limited to race/ethnicity.
- There is a functioning mechanism to manage the attributes of field experiences (e.g., breadth, depth, duration, and coherence) so that they provide practical experience relevant to Standards 1 and 4.
167
Quality Assurance System (QAS) Indicators
Standard 3
- There is a mechanism in place to manage recruitment initiatives to attract applicants from groups and in labor-market areas identified in component 3.1.
- There is a system in place to collect, store, analyze, and review data relevant to Standard 3 on applicants, enrollees, and exiting candidates.
168
Quality Assurance System (QAS) Indicators
Standard 4
- There are processes in place to collect and update contact information for alumni for 3 years post-exit.
- There is a functioning process in place for developing and revising measures of initial-level completers' instructional practices and impact on P-12 student learning.
- There is a functioning process in place for developing and revising measures of advanced-level completers' satisfaction with their preparation.
- There is a functioning process in place for developing and revising measures of employers' satisfaction with completers' preparation and performance.
- There is a system in place to collect, store, analyze, and review data on completers that is relevant to Standard 4.
169
Quality Assurance System (QAS) Indicators
Standard 5
- There is a functional process in place to protect curricular integrity.
- There is a functional process in place to ensure the hiring of qualified faculty and program staff (particularly staff involved with clinical placements).
- There is a process in place to minimize out-of-field teaching assignments and chronic or severe work overload (not simply course load).
- There is a working mechanism in place for training faculty to collaborate (in person or virtually, synchronously or asynchronously) to provide feedback and input on candidate learning, the assessment system, and program features, operations, and priorities.
- The data system collects and stores information relevant to CAEP's 8 annual outcome measures.
- There is a functioning process for publicly sharing outcomes and trends (updated annually) for the 8 annual measures.
- There is a functioning process for involving diverse stakeholders in decision-making, program evaluation, and the selection and implementation of improvement initiatives.
- Documentation of stakeholder inputs to specific decisions, evaluations, and/or improvement initiatives is stored and accessible.
170
Component 5.2/A.5.2: Key Language
The provider’s quality assurance system relies on relevant, verifiable, representative, cumulative and actionable measures, and produces empirical evidence that interpretations of data are valid and consistent. Reflect on: What evidence do I have that would demonstrate the quality of assessment measures?
171
Evidence Sufficiency Criteria, 5.2/A.5.2
EPP-CREATED ASSESSMENTS IN QAS AT LEVEL OF SUFFICIENCY
- Relevant: Evidence that the measures provided are applicable to CAEP standards. (Relates to validity.)
- Verifiable: Data records are accurate, and analyses can be replicated by a third party. (Relates to reliability.)
- Representative: Evidence that data samples are free of bias and are typical of completed assessments. If not, the EPP clearly delineates the limits of generalizability. (Relates to validity.)
- Cumulative: Data sets are based on multiple concordant measures for each standard and ≥ 3 administrations of the assessments (a minimal cycle-count check is sketched below).
- Actionable: Evidence is accessible and in a form that can guide EPP faculty in evaluating outcomes, making decisions, and modeling, implementing, and evaluating innovations.
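The cumulative criterion, together with the general rule that cycles be sequential, is easy to audit with a small script. This sketch uses invented assessment names and academic-year labels for illustration.

```python
# Illustrative assessment names and cycle labels.
cycles_by_assessment = {
    "dispositions_rubric": ["2016-17", "2017-18", "2018-19"],
    "clinical_observation": ["2017-18", "2018-19"],
}

def is_sequential(cycles):
    """Check that cycle labels like '2016-17' form consecutive years."""
    starts = [int(c.split("-")[0]) for c in cycles]
    return all(b - a == 1 for a, b in zip(starts, starts[1:]))

for assessment, cycles in cycles_by_assessment.items():
    sufficient = len(cycles) >= 3 and is_sequential(cycles)
    print(f"{assessment}: {len(cycles)} cycles, "
          f"{'meets' if sufficient else 'does not yet meet'} the 3-cycle criterion")
```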
172
Component 5.3/A.5.3: Key Language
REQUIRED COMPONENT: The provider regularly and systematically assesses performance against its goals and relevant standards, tracks results over time, tests innovations and the effects of selection criteria on subsequent progress and completion, and uses results to improve program elements and processes. Reflect on: What evidence do I have that would demonstrate systematic review of EPP quality and the use of the results for continuous improvement?
173
Evidence Sufficiency Criteria, 5.3/A.5.3
REGULARLY AND SYSTEMATICALLY REVIEW DATA
- Evidence of regular and systematic data-driven modifications.
- Regularly: QAS data are reviewed at least annually.
- Systematically: Reviews of QAS data follow a scope and sequence that ensures that the key language in component 5.3 and the evidence sufficiency criteria for 5.3 are addressed.
- Data-driven: Innovations and improvements may derive from the EPP's QAS data or from research and evidence from the broader field (e.g., publications).
- Evidence that the results of modifications are monitored and adjusted as appropriate to produce positive trends in improvement.
174
Component 5.4/A.5.4: Key Language
REQUIRED COMPONENT: Measures of completer impact, including available outcome data on P-12 student growth, are summarized, externally benchmarked, analyzed, shared widely, and acted upon in decision-making related to programs, resource allocation, and future direction.

Reflect on: What evidence do I have that would demonstrate the use of data on completers' performance (Standard 4) to drive decision-making about program elements?
Reflect on: What evidence do I have that completer effectiveness on the job is shared with stakeholders and references effectiveness criteria that are valued by stakeholders?
175
Evidence Sufficiency Criteria, 5.4/A.5.4
IMPACT MEASURES MONITORED AND REPORTED
- Evidence that impact and outcome data for CAEP's eight annual measures are collected, monitored, and published.
- Evidence that data from the eight outcome measures are a source for driving program changes.
176
Component 5.5/A.5.5: Key Language
The provider assures that appropriate stakeholders, including alumni, employers, practitioners, school and community partners, and others defined by the provider, are involved in program evaluation, improvement, and identification of models of excellence. Reflect on: What evidence do I have that our stakeholders participate in our processes for quality review and assurance?
177
Evidence Sufficiency Criteria, 5.5/A.5.5
DIVERSE STAKEHOLDER INVOLVEMENT, DOCUMENTED IN MULTIPLE SOURCES
- Description of stakeholders and their roles in the EPP's quality reviews related to: program evaluation; decision-making; and the selection of improvement targets/priorities and the implementation of these changes.
- Evidence that stakeholder input in these three domains is collected and reviewed.
- Evidence that stakeholder input influenced faculty decision-making on ≥ 2 occasions.
178
Step 5. Analyze and Interpret
The evidence and assessment results… including how the Educator Preparation Provider's program management and operations (e.g., systems, processes, and practices) relate to meeting the CAEP standards. Develop the plan for action.
179
In Summary - The Case for Standard 5/A.5
Information is provided from several sources and provides evidence that the EPP monitors and manages aspects of program quality relevant to the CAEP standards. Data of sufficient quality and quantity are collected and analyzed appropriately. Appropriate interpretations and conclusions are reached. Trends or patterns are identified that indicate whether program changes are needed and what changes should be pursued regarding the assessment system and program operations. The effect of changes is monitored and evaluated for evidence of positive impact. Impact (whether positive, neutral, or negative) is discussed along with next steps.
180
Cross-Cutting Themes: Diversity and Technology
181
Cross-Cutting Themes Embedded in Every Aspect of Educator Preparation
Diversity and Technology are embedded in: Coursework, Fieldwork, Interpersonal Interactions
182
Themes of Technology and Diversity
Diversity (Standard 1): Candidates must demonstrate skills and commitment to provide all P-12 students access to rigorous college- and career-ready standards.
Technology (Standard 1): Endorses the InTASC teacher standards. Providers are to "…ensure that candidates model and apply technology standards as they design, implement, and assess learning experiences to engage students and improve learning and enrich professional practice."
183
Themes of Technology and Diversity
Diversity (Standard 2): Clinical experiences prepare candidates to work with all students.
Technology (Standard 2): Technology-enhanced learning opportunities; appropriate technology-based applications; technology-based collaborations.
184
Themes of Technology and Diversity
Diversity (Standard 3): Providers are committed to outreach efforts to recruit a more able and diverse candidate pool.
Technology (Standard 3): Candidates integrate technology into all learning domains.
185
Themes of Technology and Diversity
Diversity (Standard A.1): Candidates use their professional specialty practices "flexibly to advance the learning of P-12 students toward attainment of college-and career-readiness standards" to enhance "learning and development opportunities" for students.
Technology (Standard A.1): Candidates apply technology appropriate to their field of specialization.
186
Themes of Technology and Diversity
Diversity (Standard A.2): Clinical experiences prepare candidates to fulfill their specialized professional roles to benefit all students.
Technology (Standard A.2): Technology-based collaborations may be included in partnerships.
187
Themes of Technology and Diversity
Diversity (Standard A.3): Providers are committed to outreach efforts to recruit a more able and diverse pool of advanced program candidates. The diversity of advanced candidates reflects the diversity of America's teacher pool and, over time, should reflect the diversity of P-12 students. EPPs monitor disaggregated evidence of academic quality and candidate progress, providing support for candidates who need it.
Technology (Standard A.3): Candidates can apply technology in appropriate ways to their field of specialization.
188
Step 6. Formulate Summative and Analytical Statements
- Frame the argument to be made for the standard: which points will be offered, and which evidence supports the argument
- Describe the data sources and their representativeness, relevant to supporting the standard: why the data are credible for this standard
- Present the results in a way that aligns with the standard
- Draw a conclusion about how the data support the standard
- Where appropriate, address triangulation and convergence of different forms of evidence; this compensates for the limitations of any one data source
- Discuss the implications of the findings for subsequent action by the provider
189
Areas for Improvement
190
Areas for Improvement – from legacies
An EPP must address AFIs in its Annual Report. During the next accreditation review, the EPP must demonstrate that the AFIs have been corrected. If the AFIs have not been corrected, a stipulation may be cited in the same area.
191
Step 7. Draft Self-Study Report
- Compile a complete draft of the report, including evidence tagged to the appropriate standard(s), component(s), and cross-cutting themes, along with data quality documentation
- Include summary and analysis statements
- Include the Selected Improvement Plan
- Review the draft with stakeholders
- Revise as needed
- Upload the final report into the Accreditation Information Management System (AIMS)
192
Preparing the Self-Study Report - STEPS
Demonstrate to Standards:
1. Review
2. Inventory of available evidence
3. Gather information, categorize and prepare evidence to upload, and draft table to be completed
4. Take stock
5. Analyze and discuss the evidence and draft of the Selected Improvement Plan
6. Formulate summary/narrative statements
7. Draft and submit Self-Study Report
193
Thank You