The CAEP Accreditation Review Process:


1 The CAEP Accreditation Review Process:
A Workshop for EPPs Working Toward CAEP Accreditation, Fall 2017
Gary Railsback, PhD, MBA, CAEP Volunteer Lead Site Visitor, Azusa Pacific University
Vince O'Neill, EdD, Accreditation Director
Counselors and Accreditation Decisions

2 CAEPCon F2017 Post conference Workshops
Tuesday, September 26, 2017
Time scheduled: 8:30 am – 12:00 pm, CAEP Accreditation Review Process, Standards 1-4
8:30 am – 10:15 am: Writing the SSR; Writing to Standard 1; Writing to Standard 2
10:15 am – 10:30 am: Break
Writing to Standard 3; Writing to Standard 4

3 TURNING SANDBOX INTO A PDF

5

6 Standard 1/A.1 Content and Pedagogical Knowledge

7 CAEP Standard 1/A.1 Content Knowledge and Pedagogical Knowledge
Candidate Knowledge, Skills, and Professional Dispositions (1.1, A.1.1)
Provider Responsibilities (A.1.2)

8 CAEP Standard 1 Initial Programs Content Knowledge and Pedagogical Knowledge
The provider ensures that candidates develop a deep understanding of the critical concepts and principles of their discipline [components 1.1, 1.3] and, by completion, can use discipline-specific practices flexibly to advance the learning of all students toward attainment of college- and career-readiness standards [component 1.4].

9 CAEP Standard 1/A.1 Advanced Programs Content Knowledge and Pedagogical Knowledge
The provider ensures that candidates for professional specialties develop a deep understanding of the critical concepts and principles of their field of preparation [component A.1.1] and, by completion, can use professional specialty practices flexibly to advance the learning of all P-12 students toward attainment of college- and career-readiness standards [component A.1.2].

10 Step 1. Review Rules for Standard 1/A.1
General rules for all standards:
All components addressed
EPP-created assessments at the CAEP level of sufficiency
At least 3 cycles of data
Cycles of data are sequential and are the latest available
Data on candidates disaggregated for main/branch campuses
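These general rules lend themselves to a quick automated pre-check before drafting the SSR. A minimal sketch, assuming cycles are labeled with academic years as integers (the data layout and function name are illustrative, not part of CAEP's materials):

```python
# Hypothetical pre-submission check of two of CAEP's general evidence
# rules: at least 3 cycles of data, and cycles that are sequential.
# The cycle labels (academic years as integers) are illustrative only.

def check_cycles(cycles):
    """Return a list of rule violations for a collection of data cycles."""
    cycles = sorted(set(cycles))
    issues = []
    if len(cycles) < 3:
        issues.append("fewer than 3 cycles of data")
    if any(b - a != 1 for a, b in zip(cycles, cycles[1:])):
        issues.append("cycles are not sequential")
    return issues

print(check_cycles([2015, 2016, 2017]))  # []
print(check_cycles([2014, 2016, 2017]))  # ['cycles are not sequential']
```

A similar check could flag missing campus-level disaggregation, but how an EPP stores its data varies, so this remains only a sketch.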

11 Step 1. Review Rules for Standard 1/A.1
Special rules for Standard 1/A.1:
No required components
All data disaggregated by specialty licensure area
Evidence from Standard 1/A.1 is cited in support of continuous improvement, as part of the overall review system (The Accreditation Handbook, p. 15)

12 CAEP Standard 1/A.1 Content Knowledge and Pedagogical Knowledge
Note the character limit: at 2,450 characters per page, this section allows 12 pages (about 29,400 characters).

13 Uploading your evidence to AIMS
Note: there is an instructional video.

14 Uploading your evidence to AIMS
To upload evidence, click Add at the top left.

15 Uploading your evidence to AIMS
To upload evidence, click Add at the top left, then choose either File or Folder.

19 Uploading your evidence to AIMS
Name your file.

28 Disaggregating Data by Regional Campuses

29 Component 1.1 – Key Language
Candidates demonstrate an understanding of the 10 InTASC standards at the appropriate progression level(s) in the following categories: the learner and learning; content; instructional practice; and professional responsibility. Reflect on: What evidence do I have that would demonstrate developing an understanding over time in these four categories?

30 Standard 1, Component 1.1
-Clinical Experience/Observation Instruments
-Lesson/Unit Plans (rubrics & data results)
-Portfolios (rubrics & data results)
-Teacher Work Samples (rubrics & data results)
-GPA, courses specific to the P-12 learner (see CAEP "Grade Policy")
-Dispositional assessments (rubrics & data results)

31 Standard 1, Component 1.1
-As an example: a Major Field Test (ETS) in Math that compares Math Education and non-education majors at completion of the program
-End-of-course/program assessments
-Pre-service measures of candidate impact
-Capstone/Thesis
+ Proprietary assessments/measures
+ State assessments/measures

32 EPP-created Assessments
Initial Standards Resource: CAEP Evaluation Framework for EPP-Created Assessments 

33 Exercise 1: Supporting Evidence for Standard 1
Open:
CAEP Standards 1-pager (Initial) (sent by email)
CAEP Evaluation Framework for EPP-Created Assessments (sent by email)
Handout: Exercise 1, Supporting Evidence for Standard 1 (sent by email)

34 Exercise 1: Supporting Evidence for Standard 1

35 Exercise 1: Supporting Evidence for Standard 1
Instructions:
Consider the full range of evidence that could potentially be used for the four InTASC categories.
In the second column, insert the name of the EPP-created assessment instrument.
Use the handout to evaluate a piece of evidence more deeply.
Which sources could provide the strongest suite of cumulative evidence? Where are there gaps?

36 Exercise Resource- When you get back...

37 Evidence Sufficiency Criteria, 1.1
CANDIDATES DEMONSTRATE UNDERSTANDING OF THE 10 InTASC STANDARDS
All four InTASC categories are addressed, with multiple indicators across the four categories
Indicators/measures specific to application of content knowledge in clinical settings are identified

38 Evidence Sufficiency Criteria, 1.1
CANDIDATES DEMONSTRATE UNDERSTANDING OF THE 10 InTASC STANDARDS
Data/evidence are analyzed, including identification of trends/patterns, comparisons, and/or differences
Averages are at/above acceptable levels on the EPP's scoring indicators for the InTASC standards (categories)
If applicable, demonstration that candidates' performance is comparable to non-candidates' performance in the same courses or majors
Performances indicate competency and are benchmarked against the average licensure-area performance of other providers
Interpretations and conclusions are supported by data/evidence
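The "averages at/above acceptable levels" criterion can be checked mechanically once rubric scores are grouped by InTASC category. A minimal sketch, assuming a simple dict of scores per category; the category scores and the 3.0 threshold are hypothetical, not CAEP values:

```python
# Hypothetical sketch: flag InTASC categories whose average rubric score
# falls below the EPP's acceptable level. Scores and the threshold are
# illustrative, not CAEP-specified values.

def flag_below_threshold(scores_by_category, acceptable=3.0):
    """scores_by_category: dict mapping category -> list of rubric scores."""
    flags = {}
    for category, scores in scores_by_category.items():
        mean = sum(scores) / len(scores)
        if mean < acceptable:
            flags[category] = round(mean, 2)
    return flags

data = {
    "The Learner and Learning": [3.2, 3.5, 3.1],
    "Content": [2.6, 2.9, 2.8],
    "Instructional Practice": [3.4, 3.0, 3.3],
    "Professional Responsibility": [3.1, 3.2, 3.0],
}
print(flag_below_threshold(data))  # {'Content': 2.77}
```

In practice the same summary would also feed the trend/pattern analysis the criterion asks for; this sketch covers only the threshold check.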

39 MOVING TO ADVANCED STANDARDS

40 Component A.1.1– Key Language
Candidates for advanced preparation demonstrate their proficiencies to understand and apply knowledge and skills appropriate to their professional field of specialization so that learning and development opportunities for all P-12 students are enhanced… Reflect on: What evidence do I have that would demonstrate proficiencies in the content and skills referenced in Component A.1.1 for a specialization?

41

42 Standard A.1, Component A.1.1
-Action Research
-Capstones/Portfolios/Thesis
-Dispositional/Professional Responsibility Data
-Problem-based projects within coursework/group projects
-Problem-based projects with a school/district

43 Standard A.1, Component A.1.1
-Pre- and post-data, and reflections on interpretations and use of data
-End-of-key-course tests (does each meet the CAEP sufficiency level?)
-Grades, by program field (see CAEP policy on using grades)
-Survey data from completers/employers
+ State assessments/surveys
+ Other proficiency measures

44 EPP-Created Assessments
Standard A.1, Component A.1.1
Advanced Standards Resource: CAEP Evaluation Framework for EPP-Created Assessments

45 Exercise A1: Supporting Evidence for Standard A.1
Open: CAEP Standards 1-pager (Advanced) Evidence Sufficiency Criteria for Advanced Programs Handout: Exercise A1, Supporting Evidence for Component A.1.1

46 Exercise A1: Supporting Evidence for Standard A.1
Instructions:
Examine the list you generated when asked what evidence you have that would demonstrate proficiencies in the content and skills referenced in Component A.1.1 for a specialization.
Choose one EPP-created assessment; practice describing evidence quality: relevance, representativeness, actionability, verifiability.

47 Exercise A1: Supporting Evidence for Standard A.1

48 EVIDENCE SUFFICIENCY CRITERIA, A.1.1
CANDIDATES DEMONSTRATE UNDERSTANDING OF PROFESSIONAL SKILLS
Demonstrates that most candidates pass state/nationally-benchmarked content/licensure exams

49 EVIDENCE SUFFICIENCY CRITERIA, A.1.1
CANDIDATES DEMONSTRATE UNDERSTANDING OF PROFESSIONAL SKILLS
Addresses all of the professional skills listed in the component
Documents proficiency for at least three of the skills for each specialty field
Utilizes multiple measures to assess each proficiency
Utilizes measures that meet criteria in the CAEP Evaluation Framework for EPP-Created Assessments

50 EVIDENCE SUFFICIENCY CRITERIA, A.1.1
CANDIDATES DEMONSTRATE UNDERSTANDING OF PROFESSIONAL SKILLS
Phase-in plans for Component A.1.1 meet the criteria in the CAEP Guidelines for Plans and are consistent with the phase-in schedule.

51 Component 1.2 – Key Language
Providers ensure that candidates use research and evidence to develop an understanding of the teaching profession and use both to measure their P-12 students’ progress and their own professional practice. Reflect on: What evidence do I have that would demonstrate using research and assessment (evidence) for student and professional learning?

52 Evidence Sufficiency Criteria, 1.2
CANDIDATES USE RESEARCH/EVIDENCE TOWARD THE TEACHING PROFESSION
Data/evidence document effective candidate use of research/evidence for planning, implementing, and evaluating students' progress
Data to reflect on teaching effectiveness and their own practice
Data to assess P-12 student progress and then modify instruction based on student data

53 Component A.1.2 – Key Language
Providers ensure that advanced program completers have opportunities to learn and apply specialized content and discipline knowledge contained in approved state and/or national discipline-specific standards.

54 Component A.1.2 – Key Language
These specialized standards include, but are not limited to, Specialized Professional Association (SPA) standards, individual state standards, standards of the National Board for Professional Teaching Standards, and standards of other accrediting bodies [e.g., Council for Accreditation of Counseling and Related Educational Programs (CACREP)]. Reflect on: What evidence do I have that would demonstrate candidates' application of advanced professional knowledge, at professional standard levels?

55 EVIDENCE SUFFICIENCY CRITERIA, A.1.2
CANDIDATES APPLY ADVANCED PREPARATION KNOWLEDGE
Documents that the majority of programs meet the standards of the selected program review option(s)
A majority of programs submitted for SPA review achieved National Recognition
State review reports document how well individual programs perform in relation to the state's selected standards, and that the majority meet the standards
Program Review with Feedback results show that the state-selected state or national standards are met for the majority of programs

56 EVIDENCE SUFFICIENCY CRITERIA, A.1.2
CANDIDATES APPLY ADVANCED PREPARATION KNOWLEDGE
Includes a discussion of performance trends and comparisons across specialty areas
Component A.1.2 is not eligible for phase-in plan submission

57 Component 1.3 – Key Language
Providers ensure that candidates apply content and pedagogical knowledge as reflected in outcome assessments in response to standards of Specialized Professional Associations (SPA), the National Board for Professional Teaching Standards (NBPTS), states, or other accrediting bodies (e.g., National Association of Schools of Music – NASM). Reflect on: What evidence do I have that would demonstrate the application of CK and PK in response to other professional standards?

58 What is a SPA?
Specialized Professional Associations (SPAs); example: NCTM
Some states require SPA review; in others it is optional
Get an up-to-date list

59 Program Review Weblink:

60 Program Review Options
Available program review options for EPPs in states with agreements:
SPA review with National Recognition (3 years prior to site visit)
CAEP program review with feedback (part of the self-study report)
State review of programs (determined by the state)

61 Program Review Options
Available program review options for EPPs in states without agreements:
SPA review with National Recognition (3 years prior to site visit)
CAEP program review with feedback (part of the self-study report)
Questions on CAEP program review: contact Banhi Bhattacharya

66 Component 1.4 – Key Language
Providers ensure that candidates demonstrate skills and commitment that afford all P-12 students access to rigorous college- and career-ready standards (e.g., Next Generation Science Standards, National Career Readiness Certificate, Common Core State Standards). Reflect on: What evidence do I have that would demonstrate skills and commitment to access for all students?

67 Component 1.4 Suggested Evidence
Evidence specific to college- and career-readiness
Plans, assignments, and observational data demonstrate candidates' skills for:
Deep content knowledge
Eliciting P-12 students' application of their knowledge to solve problems and think critically
Cross-discipline teaching
Differentiated instruction
Ability to identify and interpret assessments to match P-12 college- and career-readiness goals/objectives

68 Evidence Sufficiency Criteria, 1.4
CANDIDATES DEMONSTRATE SKILLS FOR COLLEGE- AND CAREER-READY STANDARDS
Multiple indicators/measures specific to evaluating proficiencies for the candidate's ability to:
Provide effective instruction for all students (differentiation of instruction)
Have students apply knowledge to solve problems and think critically
Include cross-discipline learning experiences and teach for transfer of skills
Design and implement learning experiences that require collaboration and communication skills

69 Component 1.5 – Key Language
Providers ensure that candidates model and apply technology standards as they design, implement and assess learning experiences to engage students and improve learning; and enrich professional practice. Reflect on: What evidence do I have that would demonstrate modeling and application of technology skills to enhance learning for students and self?

70 Component 1.5, Technology…
Design: analysis of learning and teaching
Facilitate: planning for integration of instructional technology
Evaluate: post-instruction evaluation and review

71 Evidence Sufficiency Criteria, 1.5
CANDIDATES MODEL AND APPLY TECHNOLOGY
Candidates demonstrate:
Knowledge and skill proficiencies, including accessing databases, digital media, and/or electronic sources
The ability to design and facilitate digital learning
The ability to track and share student performance data digitally

72 There is a place at the end of the SSR for the cross-cutting themes of diversity and technology; you can add your technology data there.

73 In Summary - The Case for Standard 1/A.1
Information is provided from several sources and provides evidence of candidate knowledge, skills, and dispositions. Grades, scores, pass rates, and other data are analyzed. Differences and similarities across licensure/field areas, comparisons over time, and demographic data are examined.

74 In Summary - The Case for Standard 1/A.1
Information is provided from several sources and provides evidence of candidate knowledge, skills, and dispositions. Appropriate interpretations and conclusions are reached. Trends or patterns are identified that suggest a need for preparation modification. Based on the analysis of data, planned or completed actions for change are described.

75 Standard 2/A.2 Clinical Partnerships and Practice

76 CAEP Standard 2/A.2 Clinical Partnerships and Practice
Partnerships for Clinical Preparation (2.1, A.2.1)
Clinical Educators (2.2)
Clinical Experiences (2.3, A.2.2)

77 CAEP Standard 2 Clinical Partnerships and Practice
The provider ensures that effective partnerships [components 2.1 and 2.2] and high-quality clinical practice [component 2.3] are central to preparation so that candidates develop the knowledge, skills, and professional dispositions necessary to demonstrate positive impact on all P-12 students’ learning and development.

78 SCREENSHOT OF SANDBOX ST 2 - INITIAL

79 CAEP Standard A.2 Clinical Partnerships and Practice
The provider ensures that effective partnerships [component A.2.1] and high-quality clinical practice [component A.2.2] are central to preparation so that candidates develop the knowledge, skills, and professional dispositions appropriate for their professional specialty field.

80 SCREENSHOT OF SANDBOX ST 2 - ADVANCED

81 Special for Standard 2/A.2
Rules for Standard 2/A.2
General rules for all standards:
All components addressed
EPP-created assessments at the CAEP level of sufficiency
At least 3 cycles of data
Cycles of data are sequential and are the latest available
Data on candidates disaggregated for main/branch campuses
Special rules for Standard 2/A.2:
No required components (CAEP Accreditation Handbook 2016, p. 25)

82 Component 2.1 – Key Language
Partners co-construct mutually beneficial P-12 school and community arrangements, including technology-based collaborations, for clinical preparation and share responsibility for continuous improvement of candidate preparation.

83 Component 2.1 – Key Language
Partnerships for clinical preparation can follow a range of forms, participants, and functions. They establish mutually agreeable expectations for candidate entry, preparation, and exit; ensure that theory and practice are linked; maintain coherence across clinical and academic components of preparation; and share accountability for candidate outcomes. Reflect on: What evidence do I have that would demonstrate mutually beneficial and accountable partnerships in which decision-making is shared?

If you are an EPP attempting to provide evidence of co-construction of clinical practice experiences, which one of the following would be the best example of co-construction?

85 Component 2.1 – Co-construction
Question: If you are an EPP attempting to provide evidence of co-construction of clinical practice experiences, which one of the following would be the best example of co-construction?
An EPP sending an MOU to a school principal outlining what needs to take place during clinical practice.
An EPP with a small enrollment documenting in its Self-Study Report that conversations happen with principals and administrators in the local area.

86 Component 2.1 – Co-construction
A video file (.mov or .mp4) of a conference call with the EPP field work coordinator requesting places for 12 student teachers in the upcoming semester.
Minutes of an EPP advisory council documenting anticipated placements for the upcoming semester, reviewing evaluations from the previous semester, and discussing clinical practice evaluation forms.
A school district sending a letter approving 12 placements for student teachers in the upcoming semester.

87 Co-Construction of Clinical Experiences
Candidates co-construct the opportunities, challenges, and responsibilities, with the support and guidance of clinical educators and designated faculty. Co-constructed opportunities allow candidates to apply the knowledge, dispositions, and skills developed in general education and professional courses.

88 Co-Construction of Clinical Experiences
Candidates should continue learning to adapt to the various conditions of classrooms in Co-Construction opportunities. Application, Introduction, Participation, Culmination, Roles/Responsibilities, Evaluate…

89 Evidence Sufficiency Criteria, 2.1
EVIDENCE THAT A COLLABORATIVE PROCESS IS IN PLACE AND REVIEWED
Documentation is provided for a shared-responsibility model that includes elements of:
Co-construction of instruments and evaluations
Co-construction of criteria for selection of mentor teachers
Involvement in ongoing decision-making
Input into curriculum development
EPP and P-12 educators provide descriptive feedback to candidates
Opportunities for candidates to observe and implement effective teaching strategies linked to coursework

90 Component 2.2 – Key Language
Partners co-select, prepare, evaluate, support, and retain high-quality clinical educators, both provider- and school-based, who demonstrate a positive impact on candidates’ development and P-12 student learning and development.

91 Component 2.2 – Key Language
In collaboration with their partners, providers use multiple indicators and appropriate technology-based applications to establish, maintain, and refine criteria for selection, professional development, performance evaluation, continuous improvement, and retention of clinical educators in all clinical placement settings. Reflect on: What evidence do I have that would demonstrate the depth of partnership around highly effective clinical educators?

92 Clinical Educator Development/Responsibilities
Process of collaboration with partnerships; further demonstrates partnerships in field experiences
Developed: criteria, reflective teaching and learning, mutual engagement, …
Monitored: facilitate learning and development
Evaluated: opportunities for partners to…

93 Evidence Sufficiency Criteria, 2.2
EVIDENCE THAT EPP AND P-12 CLINICAL EDUCATORS/ADMINISTRATORS CO-CONSTRUCT CRITERIA FOR CO-SELECTION
Clinical educators receive professional development, resources, and support
Clinical educators are involved in the creation of professional development opportunities, the use of evaluation instruments, professional disposition evaluation of candidates, the specific goals/objectives of the clinical experience, and providing feedback
Data collected are used by EPPs and P-12 clinical educators for modification of selection criteria, future assignments of candidates, and changes in clinical experiences

94 Component 2.3 – Key Language
The provider works with partners to design clinical experiences of sufficient depth, breadth, diversity, coherence, and duration to ensure that candidates demonstrate their developing effectiveness and positive impact on all students’ learning and development.

95 Component 2.3 – Key Language
Clinical experiences, including technology-enhanced learning opportunities, are structured to have multiple performance-based assessments at key points within the program to demonstrate candidates’ development of the knowledge, skills, and professional dispositions, as delineated in Standard 1, that are associated with a positive impact on the learning and development of all P-12 students. Reflect on: What evidence do I have that clinical experiences develop candidates’ Knowledge, Skills, and Dispositions to have a positive impact on P-12 learning?

96 Clinical Experience Table Course Sample

97 Clinical Experience Table
Program Sample

98 Evidence Sufficiency Criteria, 2.3
EVIDENCE THAT ALL CANDIDATES HAVE CLINICAL EXPERIENCES IN DIVERSE SETTINGS
Attributes (depth, breadth, diversity, coherence, and duration) are linked to student outcomes and candidate/completer performance documented in Standards 1 and 4
Evidence documents a sequence of clinical experiences that are focused, purposeful, and varied, with specific goals
Clinical experiences include focused teaching experience where specific strategies are practiced
Clinical experiences are assessed using performance-based assessments

101 Component A.2.1: Key language
Partners co-construct mutually beneficial P-12 school and community arrangements, including technology-based collaborations, for clinical preparation and share responsibility for continuous improvement of candidate preparation.

102 Component A.2.1: Key language
Partnerships for clinical preparation can follow a range of forms, participants, and functions. They establish mutually agreeable expectations for candidate entry, preparation, and exit; ensure that theory and practice are linked; maintain coherence across clinical and academic components of preparation; and share accountability for candidate outcomes.

103 SUGGESTED EVIDENCE: PARTNERSHIPS FOR CLINICAL PREPARATION
Documents illustrating co-construction of a collaborative relationship
Documents outlining provider and partner responsibilities for examining and improving clinical preparation

104 SUGGESTED EVIDENCE: PARTNERSHIPS FOR CLINICAL PREPARATION
Evidence that assessments and performance standards are mutually acceptable to providers and partners
Documentation of a shared perspective on appropriate uses of technology for the candidate's future role

105 Evidence Sufficiency Criteria, A.2.1
PARTNERSHIPS FOR CLINICAL PREPARATION
Illustrates specific benefits to the provider and P-12 partners
Outlines the collaborative nature of the relationship
Documents that the effectiveness of the partnership is reviewed at least annually

106 Evidence Sufficiency Criteria, A.2.1
PARTNERSHIPS FOR CLINICAL PREPARATION
Shows that the EPP seeks input from partners to refine criteria for entry/exit to clinical experiences
Documents partner participation in development and review activities (e.g., for clinical instruments, clinical curriculum, EPP curriculum)
Phase-in plans meet CAEP guidelines and schedule
Instruments for evaluating the partnership (if any) meet CAEP's assessment sufficiency criteria

107 Component A.2.2: Key language
The provider works with partners to design varied and developmental clinical settings which allow opportunities for candidates to practice applications of content knowledge and skills emphasized by the courses and other experiences of the advanced preparation program.

108 Component A.2.2: Key language
The opportunities lead to appropriate culminating experiences in which candidates demonstrate their proficiencies, through problem-based tasks or research (e.g., qualitative, quantitative, mixed methods, action) that are characteristic of their professional specialization as detailed in component A.1.1.

109 SUGGESTED EVIDENCE: CLINICAL EXPERIENCES
Charts illustrating the breadth, depth, duration, and coherence of the opportunities to practice applying content knowledge and skills to practical challenges in their specialty area

110 SUGGESTED EVIDENCE: CLINICAL EXPERIENCES
Evidence mapping the developmental trajectory of specific practical knowledge and skills as candidates progress through courses and the clinical experiences embedded within or external to the courses. Candidate evaluations of the connection between coursework and fieldwork.

111 Clinical Experience Table Course Sample
Clinical internships and associated descriptions (observation and/or implementation), program fields, hours, measures, and schools/districts:

EDU 2100: This supervised practicum in elementary settings provides candidates with practical experiences in workplace settings and scenarios to evaluate the connections between coursework and fieldwork.
Program fields: M.Ed., Ed.D.
Hours: 45 hours of observation and/or implementation
Measures: Dispositional/Professional Responsibility Data; problem-based projects, coursework
Schools/Districts: Internship must be approved by PDS/D during semester of application prior to…

EDU 2900: This clinical internship in elementary education is designed for candidates to appropriately and effectively apply research-based instructional learning theory/strategies for their fields of specialization in P-12.
Hours: 60 hours of observation and implementation
Measures: Problem-based projects, school/district; Action Research; Capstones/Portfolios/Thesis

112 Clinical Experience Table Program Sample
Field, field experiences & associated hours (observation), clinical internships & associated hours (implementation), and total hours:

M.Ed., Secondary Mathematics Education
Field experiences: EDUM 552, EDUM 553, EDUM 554, EDUM 555, EDUM 556 (Practicum), 200 hours of observation
Clinical internship: EDU-M 699, 500 hours of participation and implementation of coursework and fieldwork
Total hours: 700

M.Ed., English as a Second Language (TESL)
Field experiences: TESL 500 (Practicum), 250 hours of observation and participation
Clinical internship: EDU-TESL 699, 500 hours of participation and implementation of research-based instructional learning strategies
Total hours: 750
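Reported totals in a table like this are easy to sanity-check before the SSR goes out: observation plus implementation hours should equal the reported total. A hypothetical sketch using the two sample programs above:

```python
# Hypothetical sanity check that per-program clinical hours
# (observation + implementation) sum to the total reported in the table.
# Program names and hours come from the sample table above.

def total_hours(observation, implementation):
    """Sum the two hour columns for one program."""
    return observation + implementation

programs = [
    ("M.Ed., Secondary Mathematics Education", 200, 500, 700),
    ("M.Ed., English as a Second Language (TESL)", 250, 500, 750),
]
for name, obs, impl, reported in programs:
    assert total_hours(obs, impl) == reported, name
print("All reported totals match.")  # prints: All reported totals match.
```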

113 Evidence Sufficiency Criteria, A.2.2
CLINICAL EXPERIENCES
Documents that all candidates have practical experiences in workplace settings
Illustrates that candidates observe and implement appropriate and effective strategies for their fields of specialization
Documents the attributes of clinical/practical experiences
Illustrates that they are varied and developmentally progressive
Illustrates that they relate to coursework

114 Evidence Sufficiency Criteria, A.2.2
CLINICAL EXPERIENCES
Demonstrates a relationship between clinical/practical experiences and candidate outcomes reported in Standard A.1
Phase-in plans meet CAEP guidelines and schedule

115 In Summary - The Case for Standard 2/A.2
Information is provided from several sources and provides evidence of shared decision-making, collaboration among clinical faculty and schools/districts, and continuous functioning. Data are analyzed. Differences and similarities across licensure/field areas, comparisons over time, and demographic data are examined in relation to clinical experiences, as appropriate.

116 In Summary - The Case for Standard 2/A.2
Information is provided from several sources and provides evidence of shared decision-making, collaboration among clinical faculty and schools/districts, and continuous functioning. Appropriate interpretations and conclusions are reached. Trends or patterns are identified that suggest a need for preparation modification. Based on the analysis of data, planned or completed actions for change are described.

117

118 Standard 3/A.3 Candidate Quality, Recruitment, and Selectivity

119 CAEP Standard 3/A.3 Candidate Quality, Recruitment, and Selectivity
Recruitment/Admission of Diverse Candidates Who Meet Employment Needs (3.1, A.3.1)
Admission Standards Indicate That Candidates Have High Academic Achievement and Ability (3.2, A.3.2)
Additional Selectivity Factors (non-academic) (3.3)
Selectivity During Preparation (performance standards) (3.4, A.3.3)
Selection at Completion (ready, not just finished) (A.3.4)

120 CAEP Standard 3/A.3 Candidate Quality, Recruitment, and Selectivity
The provider demonstrates that the quality of candidates is a continuing and purposeful part of its responsibility from recruitment [component 3.1], at admission [component 3.2], through the progression of courses and clinical experiences [components 3.3 and 3.4], and to decisions that completers are prepared to teach effectively and are recommended for certification [components 3.5 and 3.6].

121 CAEP Standard 3/A.3 Candidate Quality, Recruitment, and Selectivity
The provider demonstrates that development of candidate quality is the goal of educator preparation in all phases of the program. This process is ultimately determined by a program’s meeting of Standard 4. The provider demonstrates that the quality of advanced program candidates [components A.3.1 and A.3.2] is a continuing and purposeful part of its responsibility [component A.3.3] so that completers are prepared to perform effectively and can be recommended for certification where applicable [component A.3.4].

122 Rules for Standard 3/A.3 General for all Standards
All components addressed
EPP-created assessments at CAEP level of sufficiency
At least 3 cycles of data
Cycles of data are sequential and are the latest available
Disaggregated data on candidates for main/branch campuses

123 Rules for Standard 3/A.3 Special for Standard 3/A.3
Component 3.2/A.3.2 is required in order to meet Standard 3 (CAEP Accreditation Handbook 2016, p. 36)

124 Component 3.1 – Key Language
The provider presents plans and goals to recruit and support completion of high-quality candidates from a broad range of backgrounds and diverse populations to accomplish their mission. The admitted pool of candidates reflects the diversity of America’s P-12 students.

125 Component 3.1 – Key Language
The provider demonstrates efforts to know and address community, state, national, regional, or local needs for hard-to-staff schools and shortage fields, currently, STEM, English-language learning, and students with disabilities. Reflect on: What recruitment evidence (plans and goals) do I have that demonstrates attracting diverse candidates to meet identified needs?

126 An example of a Recruitment Plan’s Common elements…
Introduction and Planning
Organization, College, Department, etc.
Background of College/Department
College/Department Self-Assessment
Recruitment of Candidates
Develop EPP’s “Message”
Develop “How To” Recruit
Develop, Schedule, Conduct Orientations

127 An example of a Recruitment Plan’s Common elements…
Retention of Candidates
Assign Support/Supervisor
Provide Learning Opportunities of Foundations, Methods, and Clinical Experiences
Evaluate Content and Pedagogical Development
Provide Academic/non-Academic Resources
Transition of Candidates to Completers
Communicate with Completers regularly via surveys, polls, questionnaires, census
Recognize professional support, supervisor(s), and resources

128 An example of a Recruitment Plan’s Common elements…
Managing and Evaluating
Design the Evaluation
Collect, Organize, and Analyze Data
Report Results, Conclusions Reached, and Recommendations
Resources

129 Evidence Sufficiency Criteria, 3.1
PLAN/GOALS TO RECRUIT/SUPPORT HIGH-QUALITY CANDIDATES
Recruitment plan with base points and goals, including academic ability, diversity, and employment needs
Data on applicants, admitted, and enrolled candidates are disaggregated by relevant demographics
Evidence that results are recorded, monitored, and used in planning and modification of recruitment strategies
Plan demonstrates knowledge of and addresses employment opportunities in schools, districts, and/or regions

130 Component 3.2 – Key Language
The provider meets CAEP minimum criteria or the state’s minimum criteria for academic achievement, whichever are higher, and presents disaggregated data on the enrolled candidates whose preparation begins during an academic year.

131 Academic Selection Samples
CAEP minimum criteria measured at: 1) admissions, or 2) prior to program completion
“Starting in academic year , the CAEP minimum criteria apply to the group average of enrolled candidates whose preparation begins during an academic year. The provider determines whether the CAEP minimum criteria will be measured (1) at admissions, OR (2) at some other time prior to candidate completion.”

132 Let’s compare initial and advanced programs
Initial: “The CAEP minimum criteria are a grade point average of 3.0 and a group average performance on nationally normed assessments or substantially equivalent state-normed assessments of mathematical, reading, and writing achievement in the top 50 percent of those assessed.”
Advanced: “The CAEP minimum criteria are a college grade point average of 3.0 or a group average performance on nationally normed assessments, or substantially equivalent state-normed or EPP-administered assessments, of mathematical, reading, and writing achievement in the top 50 percent of those assessed.”

133 Evidence Sufficiency Criteria, 3.2
CANDIDATES DEMONSTRATE ACADEMIC ACHIEVEMENT
Average scores for the group of candidates during an academic year meet the CAEP minimum GPA of 3.0 AND performance on nationally normed, substantially equivalent state-normed, or EPP-administered assessments is in the top 50% for all test takers of the selected assessment
Assessments examine candidate performance in mathematical and reading achievement; beginning in 2021, in writing achievement
Group average: The GPA and standardized test scores are averaged for all members of a cohort or class of admitted candidates. Averaging does not require that every candidate meet the specified score. Thus, there may be a range of candidates’ grades and scores on standardized tests.

134 Component 3.3 – Key Language
Educator preparation providers establish and monitor attributes and dispositions beyond academic ability that candidates must demonstrate at admissions and during the program.

135 Component 3.3 – Key Language
The provider selects criteria, describes the measures used and evidence of the reliability and validity of those measures, and reports data that show how the academic and non-academic factors predict candidate performance in the program and effective teaching. Reflect on: What data can I present to demonstrate the other things (besides GPA and test scores) we look for at admissions that result in selecting high quality candidates?

136 Non-Academic Samples Admission to Teacher Education
Admission to Clinical Experience

137 Evidence Sufficiency Criteria, 3.3
PROVIDER ESTABLISHES/MONITORS CANDIDATE ATTRIBUTES/DISPOSITIONS BEYOND ACADEMICS
Rationale for established non-academic criteria
Makes evidence-based case for the selection and implementation
Evidence that EPP monitors candidate progress on established non-academic criteria at multiple points and takes appropriate actions based on results
Evidence of association/correlation of non-academic criteria with candidate and completer performance

138 Component 3.4 – Key Language
The provider creates criteria for program progression and monitors candidates’ advancement from admissions through completion. All candidates demonstrate the ability to teach to college- and career-ready standards.

139 Component 3.4 – Key Language
Providers present multiple forms of evidence to indicate candidates’ developing content knowledge, pedagogical content knowledge, pedagogical skills, and the integration of technology in all of these domains. Reflect on: What data can I present to demonstrate that my EPP continues to be selective of candidates throughout our programs?

140 Monitoring Table of Candidates

141 Evidence Sufficiency Criteria, 3.4
PROVIDER CRITERIA FOR PROGRAM PROGRESSION/MONITORING OF CANDIDATES
Evidence of candidates developing proficiencies at 2 or more gateways of progression:
Ability to teach to college- and career-ready standards
Pedagogical/Content knowledge
Integration of use of technology

142 Evidence Sufficiency Criteria, 3.4
PROVIDER CRITERIA FOR PROGRAM PROGRESSION/MONITORING OF CANDIDATES
Results and stated candidate progression criteria align with evidence of actions taken, such as:
Changes in curriculum or clinical experiences
Providing interventions/Counseling out

143 Component 3.5 – Key Language
Before the provider recommends any completing candidate for licensure or certification, it documents that the candidate has reached a high standard for content knowledge in the fields where certification is sought and can teach effectively with positive impacts on P-12 student learning and development. Reflect on: What data can I present to demonstrate that exit criteria are rigorous?

144 Evidence Sufficiency Criteria, 3.5
PROVIDER DEMONSTRATES: CANDIDATES HAVE CONTENT KNOWLEDGE IN CERTIFICATION FIELD
Evidence is the same as that for 1.1
Evidence of effective teaching, including positive impacts on P-12 student learning and development, for all candidates as noted in Standard 1

145 Component 3.6 – Key Language
Before the provider recommends any completing candidate for licensure or certification, it documents that the candidate understands the expectations of the profession, including codes of ethics, professional standards of practice, and relevant laws and policies. CAEP monitors the development of measures that assess candidates’ success and revises standards in light of new results. Reflect on: What data can I present to document that our candidates understand the professional dos and don'ts of teaching?

146 Evidence Sufficiency Criteria, 3.6
PROVIDER DEMONSTRATES: CANDIDATES UNDERSTAND EXPECTATIONS OF PROFESSION
Candidates’ understanding of codes of ethics and professional standards of practice
Evidence that candidates have knowledge of relevant laws and policies: 504 disability provisions, education regulations, bullying, etc.

147 Sandbox Screenshot of Standard 3 Initial-Level

148 CAEP Standard 3/A.3 Candidate Quality, Recruitment, and Selectivity
The provider demonstrates that the quality of candidates is a continuing and purposeful part of its responsibility from recruitment [component 3.1], at admission [component 3.2], through the progression of courses and clinical experiences [components 3.3 and 3.4], and to decisions that completers are prepared to teach effectively and are recommended for certification [components 3.5 and 3.6].

149 CAEP Standard 3/A.3 Candidate Quality, Recruitment, and Selectivity
The provider demonstrates that development of candidate quality is the goal of educator preparation in all phases of the program. This process is ultimately determined by a program’s meeting of Standard 4. The provider demonstrates that the quality of advanced program candidates [components A.3.1 and A.3.2] is a continuing and purposeful part of its responsibility [component A.3.3] so that completers are prepared to perform effectively and can be recommended for certification where applicable [component A.3.4].

150 Rules for Standard 3/A.3 General for all Standard
All components addressed
EPP-created assessments at CAEP level of sufficiency
At least 3 cycles of data
Cycles of data are sequential and are the latest available
Disaggregated data on candidates for main/branch campuses

151 Rules for Standard 3/A.3 Special for Standard 3/A.3
Component 3.2/A.3.2 is required in order to meet Standard 3 (see CAEP Accreditation Handbook 2016, p. 36)

152 Component A.3.1 – Key Language
The provider sets goals and monitors progress for admission and support of high-quality advanced program candidates from a broad range of backgrounds and diverse populations to accomplish their mission. The admitted pool of candidates reflects the diversity of America’s teacher pool and, over time, should reflect the diversity of P-12 students.

153 Component A.3.1 – Key Language
The provider demonstrates efforts to know and address community, state, national, regional, or local needs for school and district staff prepared in advanced fields. Reflect on: What recruitment evidence (plans and goals) do I have that demonstrates base points and annual monitoring?

154 Screenshot of Standard 3 Advanced-Level

155 Evidence Sufficiency Criteria, A.3.1
ADMISSION OF DIVERSE CANDIDATES WHO MEET EMPLOYMENT NEEDS
Recruitment plan with base points and annual monitoring, including academic ability, diversity, and employment needs
Data on applicants, admitted, and enrolled candidates are disaggregated by relevant demographics
Evidence that results are recorded, monitored, and used in planning and modification of recruitment strategies
Plan demonstrates knowledge of and addresses employment opportunities in schools, districts, and/or regions

156 Component A.3.2 – Key Language
Required Component: The provider sets admissions requirements for academic achievement, including CAEP minimum criteria, the state’s minimum criteria, or graduate school minimum criteria, whichever is highest, and gathers data to monitor candidates from admission to completion.

157 Component A.3.2 – Key Language
The provider determines additional criteria intended to ensure that candidates have, or develop, abilities to complete the program successfully and arranges appropriate support and counseling for candidates whose progress falls behind.

158 Academic Selection Samples
CAEP minimum criteria measured at: 1) admissions, or 2) prior to program completion
3.0 GPA (Initial-Level or Advanced-Level Standards)
State licensure test scores
Relevant surveys or assessments of completers

159 Academic Selection Samples
Other Proficiency Measures
- Action Research
- Capstones/Portfolios/Thesis
- Dispositional/Professional Responsibility Data
- Problem-based projects with coursework/group projects
- Problem-based projects with school/district
- Pre- and post-data and reflections on interpretations and use of data
- End of key-course tests
- Grades, by program field
- Survey Data from Completers/Employers

160 Component A.3.3 – Key Language
Before the provider recommends any advanced program candidate for completion, it documents that the candidate has reached a high standard for content knowledge in the field of specialization, data literacy and research-driven decision making, effective use of collaborative skills, applications of technology, and applications of dispositions, laws, codes of ethics and professional standards appropriate for the field of specialization. Reflect on: What data can I present to document that candidates (beyond GPA and test scores) have developed and progressed through the program?

161 EPP Created- Assessments
Academic Selection Samples
- Action Research
- Capstones/Portfolios/Thesis
- Dispositional/Professional Responsibility Data
- Problem-based projects with coursework/group projects
- Problem-based projects with school/district

162 EPP Created- Assessments
Academic Selection Samples
- Pre- and post-data and reflections on interpretations and use of data
- End of key-course tests
- Grades, by program field
- Survey Data from Completers/Employers
+ State Assessments/Surveys
+ Other Proficiency Measures

163 EPP Created- Assessments
Advanced Standards Resource: CAEP Evaluation Framework for EPP-Created Assessments 

164 Evidence Sufficiency Criteria, A.3.3
SELECTIVITY DURING PREPARATION
Evidence of candidates developing proficiencies at 2 or more gateways of progression
Proficiencies to understand and apply knowledge and skills appropriate to program fields of specialization (see generic skills in component A.1.1)

165 Evidence Sufficiency Criteria, A.3.3
SELECTIVITY DURING PREPARATION
Results and stated candidate progression criteria align with evidence of actions taken, such as:
Changes in curriculum or clinical experiences
Providing interventions/Counseling out

166 Component A.3.4 – Key Language
The provider creates criteria for program progression and uses disaggregated data to monitor candidates’ advancement from admissions through completion. Reflect on: What data can I present to demonstrate at exit the proficiencies of completing candidates?

167 Monitoring Table of Candidates

168 Evidence Sufficiency Criteria, A.3.4
SELECTION AT COMPLETION Evidence is the same as that for A.1.1 Evidence of effective teaching including positive impacts on P-12 student learning and development for all candidates as noted in Standard 1

169 In Summary - The Case for Standard 3/A.3
Information is provided from several sources and provides evidence of shared decision-making, collaboration among clinical faculty, and continuous functioning. Data are analyzed. Differences and similarities across licensure/field areas, comparisons over time, and demographic data are examined in relation to components 3.1 and 3.2 (recruitment and admissions), as appropriate.

170 In Summary - The Case for Standard 3/A.3
Appropriate interpretations and conclusions are reached. Trends or patterns are identified that suggest need for preparation modification. Based on the analysis of data, there are planned or completed actions for change that are described.

171

172 Standard 4/A.4 Program Impact

173 CAEP Standard 4/A.4 Program Impact
Impact on P-12 Student Learning and Development (4.1)
Indicators of Teaching Effectiveness (4.2)
Satisfaction of Employers (4.3, A.4.1)
Satisfaction of Completers (4.4, A.4.2)

174 CAEP Standard 4 Program Impact
WARNING: Only initial-level programs report program impact data at this time!

175 New Guidance on Standard 4- see CAEP Accreditation Weekly Update September 22, 2017
EPPs have followed the CAEP Standards and Handbook procedures, and the CAEP review procedures have been in place with site teams and the Accreditation Council for decisions in October and April. So now, for the first time, CAEP is able to describe what EPPs are including in their self-study reports as evidence and to provide actual examples.

176 Update on Standard 4
“The EPP self-study report evidence can be categorized by type. Drawing on the 17 cases decided in April 2017, we find that about one third of the evidence was for State or district measures of P-12 student learning or growth, and another fifth were case studies to create student learning data. One EPP provided student survey results, and another offered a teacher evaluation tool as evidence.” (p. 2)

177 CAEP Standard 4 Program Impact
The provider demonstrates the impact of its completers on P-12 student learning and development [component 4.1], classroom instruction [component 4.2] and schools [component 4.3], and the satisfaction of its completers [component 4.4] with the relevance and effectiveness of their preparation.

178 Initial Component 4.1 – Key Language
REQUIRED COMPONENT: The provider documents, using multiple measures that program completers contribute to an expected level of student-learning growth. Multiple measures shall include all available growth measures (including value-added measures, student-growth percentiles, and student learning and development objectives) required by the state for its teachers and available to educator preparation providers, other state-supported P-12 impact measures, and any other measures employed by the provider.

179 Initial Component 4.1 – Key Language
Reflect on: What evidence do you have that would demonstrate graduates’ impact on P-12 student learning? What research methodologies could you feasibly employ to gain such information?

180 EPPs that have access to data from states about completer impact:
Demonstrate that they are familiar with the sources of the P-12 student learning impact data and the state’s model for preparing the data that are attributed to the EPP’s preparation program. Document the EPP’s analysis and evaluation of information provided on P-12 student learning.

181 EPPs that have access to data from states about completer impact:
Interpret the data. Judge the implications of the data and analyses for the preparation program. If judged to be invalid, use other valid evidence.

182 EPPs that do not have access to data from states about completer impact:
The EPP creates data similar to state data in conjunction with student assessment and teacher evaluations conducted in school districts where some portion of its completers are employed. This type of EPP study could be phased in.

183 EPPs that do not have access to data from states about completer impact:
By 2016, all EPPs should at least have a design in place and pilot data collection under way
One year of data needed for calendar year 2018
EPP collaborations encouraged
Also presented by EPPs that are supplementing state or district data with data on subjects or grades not covered

184 4-Examples
Self-study report evidence that “early adopter” EPPs have submitted:
Example 1: STATE University
Example 2: PRIVATE University
Example 3: PRIVATE College
Example 4: PUBLIC University
The next set of slides are excerpts with key points from each example. (Based on time constraints, we may not be able to go over these slides in detail. The examples are here for participants to access after the workshop.)

185 Example 1: STATE University
P-12 academic achievement comparison using available data, with confirmation from correlated measures
Enrollment around 23,000; EPP enrollment around 2,300
Use data available to you (these are student growth measures)
Develop case study designs similar to the teacher work sample from pre-service
Cites findings from research associating teaching strategies with an impact on student learning
FIRST build a graduate tracking system!!

186 Example 2: PRIVATE University
P-12 student growth complemented by planned teacher action research
Enrollment around 3,100; EPP enrollment around 80
Student growth percentiles were available from the state, but the information was highly summarized, so of little use to the EPP
Pilot a teacher action research project using volunteering completers, constructed as an annually recurring activity
The design permits links with pre-service data for the same completers
Candidate tasks are similar to those in a teacher work sample assessment
These include pre and post measures of student learning associated with teaching a comprehensive unit of instruction

187 Example 3: PRIVATE College
Evidence from preservice and indirectly through employers (NOT TO USE)
State shares no data with EPPs
An early adopter EPP attempts to respond to Standard 4
Almost all the evidence provided was taken from pre-service preparation, so it is not responsive to 4.1
The Self-Study Report included a richly descriptive principal survey with questions on assessment (although not on learning), but this is not a substitute for measuring student learning.

188 Example 4: PUBLIC University (problematic)
P-12 student value-added data as part of state teacher evaluation, complemented by planned teacher action research
State value-added data were available but are aggregated, so of little use to the EPP
Data were part of a state teacher evaluation (50% student learning / 50% other factors), so evidence for 4.1 and 4.2 is linked
EPP shows options it has considered to complement the state data and designates one as the path forward (plan, but not data, on this element)
Provider will work with a school partner to gather and evaluate classroom data from novice teachers

189 Summary of Key Points, Component 4.1
Use available data, but learn their strengths and weaknesses
State data on P-12 learning are highly variable from state to state, and early adopters generally found them insufficient as feedback on their own performance

190 Summary of Key Points, Component 4.1
Case studies were the most common approach for early adopters
Some of these are fashioned so that preservice Teacher Work Sample (TWS) assessments could serve as a point of comparison for new in-service teacher data
One approach was in the form of teacher action research
Collaborating with partner school districts was one strategy
One important lesson: build a tracking system

191 Step 1 Review - Rules for Standard 4/A.4
General for all Standards
All components addressed
EPP-Created Assessments at CAEP level of sufficiency
At least 3 cycles of data
Cycles of data are sequential
Disaggregated data on candidates for main/branch campuses

192 Step 1 Review - Rules for Standard 4/A.4
Special for Standard 4
All components for Standard 4 are required
All components must be met for the standard to be considered met
All phase-in requirements are met

193 Step 2. Inventory Evidence toward…
Candidate performance
Completer performance
Other CAEP requirements

194 Step 3. Information, Categorize, and Prepare
Gather evidence toward…
EPP overview
Standards, components
Cross-Cutting Themes
Areas For Improvement

195 Step 4. Take Stock With Stakeholders… Faculty, Clinical Faculty, P-12 Districts and/or Schools, Candidates
Evidence for Standards
Evidence Criteria
Evidence Quality
Review and seek feedback on what was learned from steps 1–3

196 Exercise 4: Supporting Evidence for Standard 4
Open:
CAEP Standards 1-pager (Initial)
Evidence Sufficiency Criteria for Initial Programs
Handout: Exercise 4, Supporting Evidence for Component 4.1
We will look at each of these resources and practice applying them to prospective evidence. The process of building a case that Standard 4 is met can begin with a review of the quality sufficiency of each measure currently in use and how the measures can combine to create a well-balanced set that speaks to the evidence sufficiency criteria. This would be followed by an exploration of how to best fill any gaps. Or, building a case can begin with a review of the evidence sufficiency criteria, followed by taking inventory of available evidence that meets sufficiency criteria for assessments, then by an exploration of how to best fill any gaps. Since EPPs are conducting assessments that are more for operational purposes than accreditation purposes, we will begin by looking at the evidence sufficiency criteria to see where existing sources can serve dual purposes. The Evaluation Framework for EPP-Created Assessments is a general tool for thinking about the quality of individual instruments. The Evidence Evaluation Exercise is more directly tied to the evidence quality factors discussed in Component It is also more tailored to evaluating evidence for particular standards or components, whether EPP created or not. This tool can be applied to individual measures or to sets of evidence, and provides a way to document that the whole is more than the sum of its parts or what gaps remain even after the strengths of multiple sources are combined. This can allow for a much more focused approach to selecting additional evidence.

197 Exercise 4: Supporting Evidence for Standard 4
Instructions: Select one of the measures that you listed for Component 4.1. Which category or categories in the evidence evaluation are the greatest source of concern for this assessment or evidence suite? In the Weaknesses space, list the concerns and the types of resources that would help you address them.

198 Exercise 4: Supporting Evidence for Standard 4

199 SUGGESTED EVIDENCE: IMPACT ON LEARNING
Direct measures of student learning and development
Addresses diverse subjects and grades
P-12 impact or growth data from state teacher evaluations (if available)
If state data are not available:
Teacher-linked student assessments from districts
Classroom-based research (e.g., action research, case studies)

200 EVIDENCE SUFFICIENCY CRITERIA, 4.1
SUFFICIENT EVIDENCE
Presents multiple measures showing positive impact on student learning
One or more state-provided or two or more EPP-generated measures
From a representative or purposive sample of graduates 1-3 years post-exit
EPP-generated data utilize research-based methodology (e.g., case study, action research)

201 EVIDENCE SUFFICIENCY CRITERIA, 4.1
SUFFICIENT EVIDENCE
Describes the measures and context
Describes representativeness of sample/data
Analyzes data and interprets results appropriately
Conclusions are supported by results

202 Component 4.2 – Key Language
REQUIRED COMPONENT: The provider demonstrates, through structured and validated observation instruments and/or student surveys, that completers effectively apply the professional knowledge, skills, and dispositions that the preparation experiences were designed to achieve. Reflect on: What evidence do I have (beyond measures of P-12 student learning) that would demonstrate in-service graduates are effective teachers?

203 SUGGESTED EVIDENCE: CLASSROOM INSTRUCTION
Teaching Observations
Aligned to the 4 InTASC categories
Aligned to state standards for teachers / local teacher evaluation framework
P-12 Student Surveys
Aligned to the InTASC categories
Corroboration for observation/evaluation data
The 4 InTASC categories addressed in Standard 1 are: Learner and Learning, Content, Instructional Practice, and Professional Responsibility.

204 SUGGESTED EVIDENCE: CLASSROOM INSTRUCTION
Employer Surveys
Aligned to the InTASC categories
Corroboration for observation/evaluation data

205 EVIDENCE SUFFICIENCY CRITERIA, 4.2
SUFFICIENT EVIDENCE
Measures classroom-based demonstration of professional knowledge, skills, and dispositions (e.g., InTASC, state/district teacher performance standards)
Utilizing structured and validated teaching observation tools and/or P-12 student surveys
Utilizing a representative sample that covers most licensure areas
Obtaining survey return rates of 20% or higher

206 EVIDENCE SUFFICIENCY CRITERIA, 4.2
SUFFICIENT EVIDENCE
Analyzes data and interprets results appropriately
Conclusions are supported by results

207 Component 4.3 : Key Language
REQUIRED COMPONENT: The provider demonstrates, using measures that result in valid and reliable data and including employment milestones such as promotion and retention, that employers are satisfied with the completers’ preparation for their assigned responsibilities in working with P-12 students. Reflect on: What evidence do we have that would demonstrate that employers are satisfied with the professional knowledge, skills, and dispositions of our program graduates who are working at their location?

208 SUGGESTED EVIDENCE: SATISFACTION
Completer Surveys
Aligned to the InTASC categories
Aligned to state standards for teachers / local teacher evaluation framework
Can triangulate with observation/evaluation, survey, and impact data
Employer Surveys
Corroboration for observation/evaluation data

209 EVIDENCE SUFFICIENCY CRITERIA, 4.3
SUFFICIENT EVIDENCE
Shows that employers perceive completers’ preparation was sufficient for their job responsibilities and attainment of employment milestones (e.g., retention)
Utilizing valid and reliable measures
Obtaining response rates of 20% or higher

210 EVIDENCE SUFFICIENCY CRITERIA, 4.3
SUFFICIENT EVIDENCE
Describes representativeness of sample/data for licensure areas
Discusses satisfaction patterns with respect to employment contexts (e.g., shortage fields, hard-to-staff schools, schooling level, school demographics)
Data analysis is appropriate and conclusions are supported by data

211 Component 4.4: Key language
REQUIRED COMPONENT: The provider demonstrates, using measures that result in valid and reliable data, that program completers perceive their preparation as relevant to the responsibilities they confront on the job, and that the preparation was effective. Reflect on: What evidence do we have that would demonstrate our program graduates are satisfied with how well the program prepared them for their job?

212 SUGGESTED EVIDENCE: SATISFACTION
Completer Surveys
Aligned to the InTASC categories
Aligned to state standards for teachers / local teacher evaluation framework
Can triangulate with observation/evaluation, survey, and impact data
Employer Surveys
Corroboration for observation/evaluation data

213 EVIDENCE SUFFICIENCY CRITERIA, 4.4
SUFFICIENT EVIDENCE
Shows that completers perceive their preparation was sufficient for their job responsibilities and was effective
Utilizing valid and reliable measures
Obtaining response rates of 20% or higher

214 EVIDENCE SUFFICIENCY CRITERIA, 4.4
SUFFICIENT EVIDENCE
Describes representativeness of sample/data for licensure areas
Discusses satisfaction patterns with respect to employment contexts (e.g., shortage fields, hard-to-staff schools, schooling level, school demographics)
Data analysis is appropriate and conclusions are supported by data

215 Component A.4.1 – Key Language
REQUIRED COMPONENT: The provider demonstrates that employers are satisfied with completers’ preparation and that completers reach employment milestones such as promotion and retention. Reflect on: What evidence do we have that would demonstrate employers are satisfied with completers’ preparation? What evidence documents that completers reach employment milestones such as promotion and retention?

216 Step 2. Inventory Evidence toward… Candidate performance
Completer performance
Other CAEP requirements

217 Step 3. Gather Information, Categorize, and Prepare
Gather evidence toward...
EPP overview
Standards, components
Cross-Cutting Themes
Areas for Improvement

218 Step 4. Take Stock with Stakeholders: Faculty, Clinical Faculty, P-12 Districts and/or Schools, Candidates
Evidence for Standards
Evidence Criteria
Evidence Quality
Review and seek feedback on what was learned from steps 1–3

219 Exercise A4: Supporting Evidence for Standard A.4
Open:
CAEP Standards 1-pager (Advanced)
Evidence Sufficiency Criteria for Advanced Programs
Handout: Exercise A4, Supporting Evidence for Component A.4.1

220 Exercise A4: Supporting Evidence for Standard A.4
Instructions: Select one of the measures that you listed for Component A.4.1. Which category or categories in the evidence evaluation are the greatest source of concern for this assessment or evidence suite? In the Weaknesses space, list the concerns and the types of resources that would help you address them.

221 Exercise A4: Supporting Evidence for Standard A.4

222 EVIDENCE SUFFICIENCY CRITERIA, A.4.1
SATISFACTION OF EMPLOYERS
Provider includes appropriate analysis and interpretation of results
Describes a system for analysis, evaluation, and interpretation of data
Utilizing valid and reliable measures
Obtaining response rates of 20% or higher

223 EVIDENCE SUFFICIENCY CRITERIA, A.4.1
SATISFACTION OF EMPLOYERS
Conclusions supported by data
Provides documentation of employment milestones

224 Component A.4.2 – Key Language
REQUIRED COMPONENT: The provider demonstrates that advanced program completers perceive their preparation as relevant to the responsibilities they confront on the job, and that the preparation was effective. Reflect on: What evidence do we have that would demonstrate completers perceive their preparation as relevant to their job responsibilities, and that it was effective?

225 EVIDENCE SUFFICIENCY CRITERIA, A.4.2
SATISFACTION OF COMPLETERS
Provider includes appropriate analysis and interpretation of results
Describes a system for analysis, evaluation, and interpretation of data
Utilizing valid and reliable measures
Obtaining response rates of 20% or higher

226 EVIDENCE SUFFICIENCY CRITERIA, A.4.2
SATISFACTION OF COMPLETERS
Evidence of an adequate and representative sample
Analysis and interpretation of data aligned to standard/component
Conclusions supported by data

227 In Summary - The Case for Standard 4/A.4
Information is provided from several sources and provides evidence of program impact on graduates (in-service). Data are analyzed for completer effectiveness, completer satisfaction, and employer satisfaction. Differences and similarities across licensure/field areas and demographic categories are examined.

228 In Summary - The Case for Standard 4/A.4
Appropriate interpretations and conclusions are reached. Trends or patterns are identified that suggest need for preparation modification or “staying the course”. Based on the analysis of data, there are planned or completed actions for change that are described.

229 Brain Break

230 Cross Cutting Themes Diversity and Technology

231 Cross-Cutting Themes Embedded in Every Aspect of Educator Preparation
The themes of Diversity and Technology are embedded in coursework, fieldwork, and interpersonal interactions.

232 Cross Cutting Themes: Diversity
Standard 1: Candidates must demonstrate skills and commitment that provide all P-12 students access to rigorous college- and career-ready standards.
Standard 2: Clinical experiences prepare candidates to work with all students.
Standard 3: Providers are committed to outreach efforts to recruit a more able and diverse candidate pool.

233 Cross Cutting Themes: Diversity

234 Cross Cutting Themes: Technology
Standard 1: Endorses the InTASC teacher standards. Providers are to “…ensure that candidates model and apply technology standards as they design, implement, and assess learning experiences to engage students and improve learning, and enrich professional practice.”

235 Cross Cutting Themes: Technology
Standard 2: Technology-enhanced learning opportunities; appropriate technology-based applications; technology-based collaborations
Standard 3: Candidates integrate technology into all learning domains.

236 Cross Cutting Themes: Technology

237 Areas for Improvement (AFIs)

238 AFIs
An EPP must address AFIs in its Annual Report. During the next accreditation review, the EPP must demonstrate that the AFIs have been corrected. If the AFIs have not been corrected, a stipulation may be cited in the same area.

239 Review: Seven Steps to Preparing the Self-Study Report (SSR)
1. Review the CAEP standards
2. Inventory available evidence
3. Gather information, categorize, and prepare evidence to upload, and draft tables to be completed
4. Take stock
5. Analyze and discuss the evidence and draft of the Self-Study Report
6. Formulate summary/narrative statements
7. Draft the Self-Study Report
This iterative process is used throughout the writing of the SSR and will be repeated throughout the presentation.

240 Step 7. Draft Self-Study Report
Compile a complete draft of the report, including evidence tagged to the appropriate standard(s), component(s), cross-cutting themes, and data quality documentation
Summary and analysis statements
Review the draft with stakeholders
Revise as needed
Upload the final version into the Accreditation Information Management System (AIMS)

241 What remaining questions do you have about writing your Self-Study Report?

242 Thank You

243 Contact Information Vince O’Neill, EdD Accreditation Director Counselors and Accreditation Decisions Gary Railsback, PhD, MBA CAEP Volunteer Lead Site Visitor Azusa Pacific University

244 Resources
Login for Sandbox:
Initial Only: ID 29535, PW boe1
Advanced Only: ID 29319, PW boe0
Advanced and Initial: ID 29536, PW boe2
Example Narrative: NC State’s Self-Study Narrative: go.ncsu.edu/create
There are 3 separate test environments, one for each type of EPP. You can access any or all of these “sandboxes” at any time.

