Published by Roxanne Griffith. Modified over 6 years ago.
STANDARD A.1 Content and Pedagogical Knowledge
Tatiana Rivadeneyra, Ed.D. Accreditation Director, Site Visitor Development and EPP Accreditation Procedures Banhi Bhattacharya, Ph.D. Accreditation Director Senior Director of Program Review
Session Overview This session focuses on the key language and intent of CAEP Standard A.1. Content will reference the evidence sufficiency criteria. The CAEP Standards for Initial-Level Programs are not covered in this presentation; please attend the session dedicated to those standards or access the presentation materials for guidance. Participants will be asked to reflect on possible evidence sources. Time for Q&A is scheduled at the end.
EVIDENCE SUFFICIENCY: RESOURCES
CONSULT:
- Evidence Sufficiency Criteria: Evaluation Criteria for Self-Study Evidence - Standard A.1
- CAEP Guidelines for Plans, for phase-in plan content
  - SSRs submitted through academic year 2018/2019 can include plans for Component A.1.1
  - SSRs can present a plan with progress data for Component A.1.1
  - Site visits in Fall 2022 and beyond are not eligible for phase-in
- Assessment Sufficiency Criteria: CAEP Evaluation Framework for EPP-Created Assessments

The process of building a case that the standard is met can begin with a review of the quality and sufficiency of each measure currently in use and how the measures can combine to create a well-balanced set that speaks to the evidence sufficiency criteria, followed by an exploration of how best to fill any gaps. Alternatively, building a case can begin with a review of the evidence sufficiency criteria, followed by taking inventory of available evidence that meets the sufficiency criteria for assessments, then by an exploration of how best to fill any gaps. Since EPPs conduct assessments that are more for operational purposes than accreditation purposes, it may help to begin by looking at the evidence sufficiency criteria to see where existing sources can serve dual purposes.

The Evaluation Framework for EPP-Created Assessments is a general tool for thinking about the quality of individual instruments. The Evidence Evaluation Exercise is more directly tied to the evidence quality factors discussed in Component 5.2/A and is more tailored to evaluating evidence for particular standards or components, whether EPP-created or not. This tool can be applied to individual measures or to sets of evidence, and it provides a way to document that the whole is more than the sum of its parts, or what gaps remain even after the strengths of multiple sources are combined. This allows for a much more focused approach to selecting additional evidence.
Evidence Sufficiency Rules for Standard A.1
General rules for all Standards:
- Key concepts in the standard and components are addressed
- EPP-created assessments meet CAEP's assessment sufficiency criteria
- At least three cycles of data that are sequential and the most recent available
- Results disaggregated by specialty field area (when appropriate), and for main and additional campuses and on-site and online programs (if applicable)
- Data/evidence analysis includes discussion of trends/patterns, comparisons, and/or differences

Special rules for Standard A.1:
- No required components
- All data disaggregated by specialty licensure area
- Evidence from Standard 1 cited in support of continuous improvement, as part of the overall review system
- The majority of programs meet the standards of the selected program review option(s)

Handout: Evaluation Criteria for Self-Study Evidence - Advanced
STANDARD A.1: CONTENT AND PEDAGOGICAL KNOWLEDGE
The provider ensures that candidates for professional specialties develop a deep understanding of the critical concepts and principles of their field of preparation [Component A.1.1] and, by completion, can use professional specialty practices flexibly to advance the learning of all P-12 students toward attainment of college- and career-readiness standards [Component A.1.2].

Talking point if asked about exclusions: The scope of the advanced-level programs is currently under review, given requests we have received for exemptions. Those discussions have paused the release of implementation information, which is scheduled to be released by the beginning of April 2017.
Standard A.1, Guidance from Component A.1.1
Candidates for advanced preparation demonstrate their proficiencies to understand and apply knowledge and skills appropriate to their professional field of specialization, so that learning and development opportunities for P-12 students are enhanced, through:
- Application of data literacy;
- Use of research and understanding of qualitative, quantitative, and/or mixed-methods research methodologies;
- Use of data analysis and evidence to develop supportive school environments;
- Leading and/or participating in collaborative activities with others such as peers, colleagues, teachers, administrators, community organizations, and parents;
- Application of appropriate technology for their field of specialization; and
- Application of professional dispositions, laws and policies, codes of ethics, and professional standards appropriate to their field of specialization.

Evidence of candidate content knowledge appropriate for the professional specialty will be documented by state licensure test scores or other proficiency measures.
EVIDENCE FOR A.1.1 Consider: What evidence do you have that would demonstrate proficiencies in the specialty content and general skills referenced in Component A.1.1 for a specialization?
EVIDENCE SUFFICIENCY CRITERIA, A.1.1
SUFFICIENT EVIDENCE:
- Demonstrates that most candidates pass state/nationally benchmarked content/licensure exams
- Addresses all of the professional skills listed in the component
- Documents proficiency for at least three of the skills for each specialty field
- Utilizes multiple measures to assess each proficiency
- Utilizes measures that meet criteria in the CAEP Evaluation Framework for EPP-Created Assessments
- Phase-in plans for Component A.1.1 meet the criteria in the CAEP Guidelines for Plans and are consistent with the phase-in schedule

Standard A.1 Sufficiency Criteria, Component A.1.1:
- All of the generic professional skills stated in A.1.1 are addressed.
- At least three of the six generic skill areas are informed, for each professional specialty field, by multiple indicators/measures that adapt the generic skills to that field.
- EPP-created assessments have been reviewed at the minimum level of sufficiency on CAEP's assessment rubric.
- Analysis of data/evidence includes identification of trends/patterns, comparisons, and/or differences.
- Data/evidence supports interpretations and conclusions.
- Class averages are at or above acceptable levels on the EPP scoring guide for EPP-created assessments.
EPP-Created Assessments
Standard A.1, Component A.1.1: suggested evidence from EPP-created assessments (Advanced Standards):
- Action research
- Capstones/portfolios/theses
- Dispositional/professional responsibility data
- Problem-based projects with coursework/group projects
- Problem-based projects with a school/district
- Pre- and post-data and reflections on interpretations and use of data
- End-of-key-course tests
- Grades, by program field
- Survey data from completers/employers

Resource: CAEP Evaluation Framework for EPP-Created Assessments
Advanced Level Standards
STATE:
- Relevant surveys
- Assessments (value-added) of completers
- Licensure examinations
- Additional measures utilized toward compliance with other accreditors (e.g., for reporting requirements: WASC, NASC, HLC (aka NCA), SACS, MSA, NEASC)

OTHER:
- State assessments/surveys
- Other proficiency measures

Group average performance requirements of candidates whose preparation began during the academic year or earlier (NOTE: proficiency must be met for each domain):

Assessment | Test or Section | 3.2 Domain | Group average
GRE | Verbal Reasoning | Reading | 150.75**
GRE | Quantitative Reasoning | Math | 152.75**
GRE | Analytical Writing | Writing | 3.74**
Standard A.1, Guidance from Component A.1.2
Providers ensure that advanced program completers have opportunities to learn and apply specialized content and discipline knowledge contained in approved state and/or national discipline-specific standards. These specialized standards include, but are not limited to, Specialized Professional Association (SPA) standards, individual state standards, standards of the National Board for Professional Teaching Standards, and standards of other accrediting bodies [e.g., Council for Accreditation of Counseling and Related Educational Programs (CACREP)]. Consider: What evidence do you have that would demonstrate that the program provides candidates the opportunity to both learn and apply content knowledge and skills that are emphasized in professional standards for the specialty area?
EVIDENCE SUFFICIENCY CRITERIA, A.1.2
SUFFICIENT EVIDENCE:
- Documents that the majority of programs meet the standards of the selected program review option(s)
  - A majority of programs submitted for SPA Review achieved National Recognition
  - State Review reports document how well individual programs perform in relation to the state's selected standards and that the majority meet the standards
  - Program Review with Feedback results show that the state-selected state or national standards are met for the majority of programs
- Includes a discussion of performance trends and compares across specialty areas
- Component A.1.2 is not eligible for phase-in plan submission

Standard A.1 Sufficiency Criteria, Component A.1.2:
- The provider presents at least one source of evidence that candidates apply advanced preparation knowledge at specialty area levels (SPA or state reports, disaggregated specialty area data, actions, etc.).
- A majority (51% or above) of SPA program reports have achieved National Recognition, OR documentation is provided on periodic state review of program-level outcome data.
- Answers to specific specialty area questions are complete and supported by an analysis and accurate interpretation of specialty area data.
- The provider makes comparisons and identifies trends across specialty areas based on data.
- Assessments submitted for advanced preparation fields under the Program Review with Feedback option are at the minimal level of sufficiency.
PROGRAM REVIEW and THE ACCREDITATION PROCESS Component A.1.2
VOCABULARY
EPP: Educator Preparation Provider (previously called "Unit") that prepares professionals in various licensure or certification areas to serve in a P-12 setting
PROGRAM: A planned sequence of academic courses and experiences leading to a degree, a recommendation for a state license, or some other credential that entitles the holder to perform professional education services in schools (P-12)
CANDIDATES: Pre-service educators
STUDENTS: P-12 students
SPA: Specialized Professional Association
SPA PROGRAM REPORT: A report submitted at the program level to provide evidence of meeting standards developed by SPAs
SPA RECOGNITION REPORT/DECISION REPORT: Report providing SPA feedback and the recognition decision; used as partial evidence for CAEP Standard 1
PROGRAM REVIEW: INTEGRAL TO CAEP ACCREDITATION (Advanced Level Program)
Program review decisions factor into CAEP Component A.1.2, which says: "Providers ensure that advanced program completers have opportunities to learn and apply specialized content and discipline knowledge contained in approved state and/or national discipline-specific standards. These specialized standards include, but are not limited to, Specialized Professional Association (SPA) standards, individual state standards, standards of the National Board for Professional Teaching Standards (NBPTS), and standards of other accrediting bodies [e.g., Council for Accreditation of Counseling and Related Educational Programs (CACREP)]."
CAEP SCOPE AND PROGRAM REVIEW
- CAEP accredits EPPs.
- EPP-offered programs leading to licensing degrees, certificates, or endorsements of P-12 professionals fall under the scope.
- All endorsement programs use the CAEP Advanced-Level Program Standards.
- Programs licensing "other school professionals" use the CAEP Advanced-Level Program Standards.
- For programs accredited by other national accrediting bodies (CACREP, NASM, etc.):
  - The EPP may choose to exempt them from review by CAEP (they will not be recognized as accredited by CAEP), or
  - The EPP may choose to include them in the CAEP accreditation process (they will be included in EPP-wide assessments, the annual report, and program review).
PROGRAM REVIEW OPTIONS
CAEP-state agreements determine program review options for EPPs within a state (28 agreements signed to date).
Available program review options for EPPs in states with agreements:
- SPA review with National Recognition (3 years prior to the site visit)
- CAEP program review with feedback (part of the self-study report)
- State review of programs (determined by the state)
Available program review option for EPPs in states without agreements:
- State review of programs (the EPP coordinates with the state to obtain and provide the state agency report)
EXAMPLES: STATE-SELECTED PROGRAM REVIEW OPTIONS*
Program review options: SPA Review | Review with Feedback | State Review
Example states: Arkansas (X), Delaware (-), Indiana, Kansas, New Jersey
* Information on program review options by state is available on the CAEP website.
QUESTIONS THAT PROGRAM REVIEW ADDRESSES
What degree of competence in content knowledge do candidates demonstrate? Can candidates successfully develop a conceptual plan for their teaching and other professional education responsibilities? Can candidates implement their conceptual plan with students and colleagues? Are candidates effective in promoting student learning? Do candidates meet state licensure requirements?
PRESENTING PROGRAM REVIEW EVIDENCE FOR CAEP ACCREDITATION
CONSIDER:
1. Did the EPP update the program review option in AIMS for each program?
2. Does the program list match the licensure, certification, or endorsement programs listed in the EPP's catalog?
3. Does the selection of program review option meet the CAEP-state agreement (if applicable)?
4. Does the program-level evidence (SPA report, state agency report, self-study addendum) presented in the self-study report match the selected review option?
Remember: 1 licensing program = 1 review option evidence type
SPA PROGRAM REVIEW OPTION WITH NATIONAL RECOGNITION
Two steps in the CAEP accreditation process if selecting the SPA review option:
Step #1: Initial review report submitted to the SPA three years prior to the site visit (program-level review). Example: site visit in Fall 2020, initial SPA review in Fall 2017.
Step #2: Self-study report submitted to CAEP nine months prior to the site visit (provider-level review). Example: site visit in Nov. 2020, self-study report in Mar. 2020.
SPA REVIEW EXPECTATIONS: WHAT THE SITE TEAM WILL LOOK FOR
INITIAL REVIEW DUE DATE: 3 years prior to the site visit. Example: site visit in Fall 2020, initial SPA review in Fall 2017.
SPA reports initiated more than three years before the visit = old data. Did the EPP receive an extension to account for older Recognition Reports?
PROGRAM REVIEW EVIDENCE: SPA REVIEW
What evidence will the site team look for?
- A SPA Recognition/Decision Report
- 3-year-out timing of the initial review
How will the site team determine if CAEP expectations are met when an EPP selects the SPA option?
- Minimum sufficiency criterion: 51% of the total number of programs selecting the SPA review option have full National Recognition from a 3-year-out submission
Which SPA recommendations on the Decision Report will be used?
- Comments in Part E (Areas for consideration)
- Comments in Part F (Additional comments)
- SPA decisions or conditions for the program to address in Part G
SPA REVIEW: TIMING AND CYCLES
PURPOSE:
- Gather evidence for the current accreditation cycle (CAEP Standard 1)
- Initiate the process to receive full National Recognition by the visit date
- Initiate the process to continue prior National Recognition status before expiration
REVIEW CYCLES: 2 times per year
- Spring cycle due date: March 15; decisions: August 1
- Fall cycle due date: September 15; decisions: February 1
STAGES OF SPA REVIEW PROCESS
INITIAL SUBMISSION: 3 years before the site visit
SHELL REQUESTS BEGIN: 1 year before the submission date
SHELL REQUESTS END (moving forward): 5 days before the submission date (March 10 for the spring cycle and September 10 for the fall cycle)
SHELL REQUEST SUBMISSION:
- List all programs preparing P-12 professionals in each specialization area in AIMS to enable shell request submission
- Submit shell requests through CAEP's Accreditation Information Management System (AIMS); directions for requesting shells are provided on the CAEP website
- CAEP staff create shells after receiving a request
SPA REVIEW: DISCUSSION ON SELF-STUDY REPORT
The EPP addresses the following questions for programs selecting SPA program review:
- How was the SPA feedback on the specialty licensure area used to inform decision making and improve instruction and candidate learning outcomes?
- What was learned about different specialty licensure areas as a result of the review of the disaggregated data?
- What trends do the comparisons of data across specialty licensure areas indicate, and how do they provide evidence for meeting CAEP and state expectations and standards?
Accreditation decision: evidence meets CAEP sufficiency criteria, OR evidence indicates a potential area for improvement (AFI).
PROGRAM REVIEW EVIDENCE: CAEP REVIEW WITH FEEDBACK
What is CAEP Review with Feedback? An alternative option to SPA and state review. It requires evidence of candidates' content knowledge and pedagogical content knowledge for each licensure area program.
How do programs report evidence for this option?
- Incorporate evidence as part of the self-study report
- Analyze data from state licensure exams and/or other proficiency measures required by the EPP to demonstrate candidates' content knowledge in the licensure area
- Analyze data to demonstrate candidates' pedagogical knowledge in the area
- Analyze data from assessment of candidates' impact on student learning in the area
- Provide an assessment description and scoring guide in each case
PROGRAM REVIEW EVIDENCE: CAEP REVIEW WITH FEEDBACK
How do programs report evidence for this option (continued)? Address the following questions for each assessment:
- What artifact(s) is used to provide evidence?
- How was the assessment developed?
- How does the assessment provide evidence for meeting standards (next slide)?
- How is the quality of the assessment/evidence determined or assured?
- What criteria of success were established or measured, and how?
Refer to the Technical Guide for CAEP Program Review with Feedback: options/caep-program-review-with-feedback
Source: Technical Guide for CAEP Program Review with Feedback
PROGRAM REVIEW EVIDENCE: CAEP PROGRAM REVIEW WITH FEEDBACK
What standards are used for this option?
- As a norm, assessments for advanced-level programs are aligned with the NBPTS Standards in the respective areas of specialization.
- If a state requires use of other standards for the CAEP Program Review with Feedback option (per the state agreement), the EPP will align evidence to those standards.
CAEP PROGRAM REVIEW WITH FEEDBACK OPTION: TIMING AND PURPOSE
EVIDENCE SUBMISSION: Included as part of the self-study report
REVIEWED BY: Site team
PURPOSE:
- Gather program-level evidence for the current accreditation cycle
- Provide evidence for CAEP Standard A.1 (advanced-level programs)
- Receive formative feedback on meeting CAEP Standard A.1
- Feedback is used by CAEP's Accreditation Council to make accreditation decisions
- Feedback may be used by states to understand whether a program meets state expectations
CAEP PROGRAM REVIEW WITH FEEDBACK OPTION: GENERAL EXPECTATIONS
- 3 cycles of data submitted and analyzed as part of the self-study report
- Data disaggregated for candidates enrolled at main and branch campuses (if applicable)
- Cycles of data must be sequential and the latest available
- The review is based on guidance provided in the CAEP Evidence Guide
PROGRAM REVIEW WITH FEEDBACK: DISCUSSION ON SELF-STUDY REPORT
The EPP addresses the following questions for programs selecting the Program Review with Feedback option:
- Based on the analysis of the disaggregated data, how are the results of specialty licensure area evidence used to inform decision making and improve instruction and candidate learning outcomes?
- Based on the analysis of specialty licensure area data, how have individual licensure areas used data as the basis for change?
- How do the specialty licensure area data align with and provide evidence for meeting the state-selected (or InTASC) standards?
POTENTIAL ISSUES: STANDARD A.1
AREAS FOR IMPROVEMENT (AFIs) MAY BE CITED WHEN:
Instrument quality is poor:
- EPP-created assessments used to collect Standard A.1 data have significant deficiencies with respect to CAEP's assessment evaluation framework
- Phase-in plans for one or more components do not meet CAEP's Guidelines for Plans
Evidence quantity is limited:
- Fewer than three cycles of data are provided
- Less than one cycle of phase-in data collected by academic year 2019/2020

Site visitors may recommend AFIs or stipulations if general rules, special rules, or specific evidence sufficiency criteria are not met. Only the Accreditation Council can decide whether AFIs or stipulations will be cited, or whether standards are met or not met. The following three slides are intended to clarify some of the conditions under which this has happened in the past or may in the future.
POTENTIAL ISSUES: STANDARD A.1
AREAS FOR IMPROVEMENT (AFIs) MAY BE CITED WHEN:
Case is weak:
- Deficiency in evidence that program options foster deep understanding of critical concepts and skills in the specialty areas
- Deficiency in evidence that knowledge and skills are applied to enhance P-12 settings
- The EPP's analysis of data/evidence does not identify and discuss trends/patterns, comparisons, and/or differences between programs or over time

Examples of conditions that may lead to AFIs:
- Program review results indicate that some of the EPP's advanced programs are not well aligned to professional standards and/or performance benchmarks in the field. As a result, either the EPP's expectations for deep understanding of critical concepts and principles or candidates' ability to use professional practices flexibly to enhance P-12 settings or outcomes is below standard.
- The evidence for Standard A.1 does not address all of the professional skills listed in Component A.1.1, and/or fewer than three of these skills are assessed for each specialty area using multiple indicators/measures that adapt the generic skills to a professional specialty field.
- The EPP provides limited or no evidence that advanced candidates understand the learning objectives and performance standards to which P-12 students and personnel are held accountable (e.g., CCR standards). As a result, there is limited or no evidence that candidates are able to flexibly use relevant specialty-area practices to promote their attainment.
- The EPP-created measures of practical application (e.g., field evaluation tools) used as evidence for Standard A.1 do not meet CAEP's sufficiency criteria. [The site team clearly describes the deficiencies as they relate to the evaluation framework for assessments.]
- Site team tasks intended to verify the accuracy of results reported in the self-study report could not be completed using the data provided by the EPP, or the effort uncovered significant discrepancies between the data set(s) and the rates or performance levels reported in the self-study report.
- Review of available data indicates that the EPP did not provide the most recent sequential data relevant to its analysis.
- The EPP's analysis of data/evidence does not identify and discuss trends/patterns, comparisons, and/or differences between programs.
- The EPP's analysis of data/evidence does not identify and discuss trends/patterns, comparisons, and/or differences over time.
- One or more of the three components of the phase-in plan for Component A.1.1 do not meet criteria in the CAEP Guidelines for Plans; for example, under Timeline, the plan will not result in at least one data point in the academic year. [The site team clearly describes the deficiencies in the plan as they relate to the guidelines.]
POTENTIAL ISSUES: STANDARD A.1
STIPULATIONS MAY BE CITED WHEN:
Evidence quality is low:
- Significant aspects/key language of the standard and components are not addressed by relevant measures
- The majority of measures do not meet assessment sufficiency criteria
Evidence quantity is limited:
- Limited or no evidence for Standard A.1, and (when eligible) no phase-in plan for A.1.1 that meets CAEP's Guidelines for Plans and the phase-in schedule
- Results are not disaggregated by specialty area
The Accreditation Handbook provides additional detail on evidence issues that may lead to AFIs and stipulations.
POTENTIAL ISSUES: STANDARD A.1
STIPULATIONS MAY BE CITED WHEN:
Case is weak:
- Candidate performance is severely below reported standards for content knowledge and application
- The majority of programs do not meet program review standards
- Limited or no evidence that candidates can apply the professional skills listed in A.1.1 to enhancing P-12 settings or outcomes
The Accreditation Handbook provides additional detail on evidence issues that may lead to AFIs and stipulations.

Examples of conditions that may lead to stipulations:
- The self-study report does not address the key concepts and language of Standard A.1.
- Program review results indicate that a majority of the EPP's advanced programs are not well aligned to professional standards and/or performance benchmarks in the field. As a result, both the EPP's expectations for deep understanding of critical concepts and principles and candidates' ability to use professional practices flexibly to enhance P-12 settings or outcomes are below standard, despite meeting the EPP's performance criteria.
- There is limited or no evidence for Standard A.1 and no plan for gathering a sufficient quantity of valid and reliable evidence as outlined in the General Rules for the standard and the CAEP Evaluation Framework for EPP-Created Assessments.
- An insufficient quantity of data is submitted, and the EPP's explanation for the insufficiency is incomplete or inadequate. If the EPP's explanation reveals a problem in the EPP's quality assurance system (e.g., lack of stable assessment processes, lack of performance monitoring, or poor data management leading to data losses), this should also be cited as an issue in Standard A.5, with a rationale that explains how it affected evidence for Standard A.1. [The Accreditation Council will decide whether to officially cite either or both of the recommended citations and whether the severity is sufficient to consider either standard unmet.]
- The majority of EPP-created measures used as evidence for Standard A.1 do not meet CAEP's sufficiency criteria, and the insufficiencies are not compensated for by proprietary measures included in the evidence suite. [The site team clearly describes the deficiencies as they relate to the evaluation framework for assessments.]
- The EPP adapts a proprietary measure for use in its program(s) and does not supply evidence that the adaptation is a valid revision that produces reliable data.
- None of the three components of the phase-in plan for Component A.1.1 meet criteria in the CAEP Guidelines for Plans. [The site team clearly describes the deficiencies in the plan as they relate to the guidelines.]
- Phase-in plans are submitted for Standard A.1 after the expiration of the period for submitting new plans.
- Progress on phase-in plans for Standard A.1 does not include any data on candidate outcomes.
- Candidate outcome data submitted to demonstrate progress on phase-in plans for Standard A.1 show inadequate performance for the majority of candidates assessed.
- Disaggregated evidence is not provided for each advanced preparation specialty area despite evidence that there were 10 or more candidates or completers across the span of years covered by the self-study report.
- Review of available data confirms selection bias in the EPP's data set, the analysis of which led to inflated results.
- Candidates perform below the reported performance standard in both specialty content knowledge and application, or candidate performance is severely below standards in either content knowledge or application. [The site team describes how it quantified severity in relation to the performance standard; for example, average performance of the completing cohort is in the lower half of the licensure test score distribution, and there is no plan to improve the EPP's performance.]
- The EPP incorrectly analyzes or interprets data/evidence for Standard A.1 and draws conclusions about accomplishments for Standard A.1 that are not supported by the data/evidence.
Cross-Cutting Themes Embedded in Every Aspect of Educator Preparation
The themes of Diversity and Technology are embedded across coursework, fieldwork, and interpersonal interactions.
Cross-Cutting Themes of Diversity and Technology
Places in which the cross-cutting themes of diversity and technology must be explicitly addressed through evidence are identified by diversity and technology icons in the CAEP Evidence Tables.
Themes of Diversity and Technology
Diversity (Standard A.1): Candidates use their professional specialty practices "flexibly to advance the learning of P-12 students toward attainment of college- and career-readiness standards" to enhance "learning and development opportunities" for students.
Technology (Standard A.1): Candidates apply technology appropriate to their field of specialization.
In Summary - The Case for Standard A.1
Information is provided from several sources and offers evidence of candidate knowledge, skills, and dispositions. Grades, scores, pass rates, and other data are analyzed. Differences and similarities across licensure/field areas, comparisons over time, and demographic data are examined. Appropriate interpretations and conclusions are reached. Trends or patterns are identified that suggest a need for preparation modification. Based on the analysis of data, planned or completed actions for change are described.

The following guiding questions may help focus the selection of evidence and the EPP's inquiry into its message:
STRENGTHS AND WEAKNESSES: What strengths and areas of challenge have you discovered about candidate content and pedagogical knowledge and its applications as you analyzed and compared the results of your disaggregated data by program and by demographics? What questions have emerged that need more investigation? How are you using this information for continuous improvement?
TRENDS: What trends have emerged as you compared program and demographic data about candidate content and pedagogical knowledge and its applications across evidence sources and programs? What questions have emerged that need more investigation? How are you using this information for continuous improvement?
IMPLICATIONS: What implications can you draw, or conclusions can you reach, across evidence sources about candidate content and pedagogical knowledge and its applications? What questions have emerged that need more investigation? How have data-driven decisions on changes been incorporated into preparation?