CONNECT WITH CAEP | www.CAEPnet.org | Twitter: @CAEPupdates

Three-Year-Out Review of Assessments
(Pending Accreditation Council and CAEP Board Approval)

Stevie Chepko, Sr. VP for Accreditation
Elizabeth Vilky, Sr. Director for Program Review
Purpose of Three-Year-Out Review

Advancing excellence in educator preparation through evidence-based accreditation and continuous improvement:
– Assessment includes all instruments providing evidence for meeting a standard
– Mandate to improve the quality of assessments used by EPPs
– Part of CAEP's commitment to capacity building
– Designed to provide feedback to EPPs; no value judgment or decision is made on the assessments
– Feedback is given to EPPs to facilitate improvement of the assessments used across the EPP
Three-Year-Out Review

– Review is scheduled three years before the scheduled date of the self-study
– Pre-submission allows EPPs to use feedback to improve the quality of assessments before submission of the self-study and the site visit
– Data are not submitted with the assessments; only the assessments (and scoring guides, when applicable) are submitted for review
– All EPP-wide assessments used for all specialty licensure areas are included
Three-Year-Out Review: Proprietary Assessments

Proprietary assessments are assessments external to the EPP whose property rights are held by another agency:
– State licensure exams
– edTPA or PPAT
– State-required assessments (e.g., clinical observation instruments)
– Any required state- or national-level assessment

Their validity and reliability are established by an external source, so proprietary assessments are exempt from the Three-Year-Out review.
Proprietary Assessments (cont.)

EPPs will provide context for the use of proprietary assessments:
– When during the program the assessment is used
– Whether the assessment is mandated or elective for the EPP
– The alignment of the proprietary assessment with the CAEP standard
– If available, validity/trustworthiness and reliability/consistency data for the instrument
EPP-Created Assessments

These often include, but are not limited to:
– Clinical observation instruments
– Work sample instruments
– Lesson and/or unit plan instruments
– Dispositional instruments
– Reflection instruments
– Surveys: candidate exit surveys, employer surveys, student surveys, and alumni surveys
Criteria for Evaluating Assessments with Scoring Guides

– Assessments align with CAEP Standards and provide evidence for meeting the standards; the same or consistent categories of content appear in the assessment items as are found in the standards
– Assessments are congruent with the complexity, cognitive demands, and skill requirements described in the standard
– The level of respondent effort required, or the difficulty or degree of challenge, is consistent with the standards
Criteria for Evaluating Assessments with Scoring Guides (cont.)

– The level of respondent effort required is reasonable for candidates who are ready to teach
– Assessment items address the range of knowledge, skills, and dispositions delineated in the standards
– Assessments are free of bias: avoid bias in language and bias in testing situations
Criteria for Evaluating Assessments with Scoring Guides (cont.)

Questions to be answered:
– Is there a clear basis for judging the adequacy of candidate work? (A rubric or scoring guide is used)
– Is there evidence that the assessment measures what it purports to measure (validity)?
– Are results consistent across raters and over time (reliability)?
– Are the criteria in the rubric or scoring guide related to CAEP standards?
Criteria for Scoring Guides or Rubrics

Distinct levels of candidate performance must be defined:
– Descriptions of each level describe attributes related to actual performance
– Levels represent a developmental sequence in which each successive level is qualitatively different from the prior level
– It is clear which level represents exit proficiency (ready to practice)
– Levels are clearly distinguishable from one another
– Levels are constructed in parallel with one another in terms of attributes and descriptors
– Scoring guides provide specific feedback to candidates
Quality Surveys

Surveys allow EPPs to gather information for program improvement and to access a broad spectrum of individuals:
– Candidate satisfaction
– Graduate satisfaction
– Employer satisfaction
– Clinical faculty perceptions of candidates' preparedness for teaching

Characteristics of quality surveys:
– Carefully designed
– Allow for systematic collection of data
– Measure the property they claim to measure
Criteria for Evaluating Surveys

– Surveys include preambles that explain what the respondent is being asked to do
– Any concepts or terms the respondent needs to understand to complete the survey are defined
– Questions should be sorted by themes or categories
– Questions should be simple and direct
– Vague words or terms should be avoided
– Leading questions should be avoided
– A cover letter is included
– Confidentiality of respondents is assured
Criteria for Evaluating Surveys (cont.)

– Questions should have a single subject and not combine two or more attributes
– Questions should be stated in the positive
– Questions should have a parallel structure
– Response choices should be mutually exclusive and exhaustive
– If frequency terms (e.g., "occasionally") are included, they should be defined in terms of an actual frequency (e.g., "3-5 times")
Additional Criteria for Three-Year-Out Review

– Description of how the assessment was developed
– Description of how validity was established for the assessment:
  – Expert validation of the items on the assessment (convergent validity)
  – A measure's ability to predict performance on another measure (predictive validity)
  – Extent to which the evaluation item measures what it claims to measure (construct validity)
Additional Criteria for Three-Year-Out Review (cont.)

– The right attributes are measured in the right balance (content validity)
– The measure is subjectively viewed as important and relevant (face validity)
– Description of how the reliability of assessments was established or will be established (a plan is acceptable)
– Reliability in its various forms is supported through evidence of:
  – Agreement among multiple raters of the same event
  – Stability or consistency of ratings over time
  – Internal consistency of measures
Review Process for Three-Year-Out

– CAEP assigns a lead reviewer and two additional reviewers
– Reviewers are specifically trained on the criteria for quality assessments
– Reviewers provide feedback on:
  – Quality of scoring guides or rubrics based on the criteria
  – Quality of surveys based on the criteria
  – Alignment of assessments to CAEP Standards
  – Quality of the evidence for CAEP Standards
  – Quality of the responses to the validity and reliability questions
Review Process for Three-Year-Out (cont.)

Steps in the review process:
– Each of the three reviewers completes an independent review through AIMS
– After all reports are submitted, the lead reviewer hosts a conference call with the team
– The conference call generates a final team report, submitted through AIMS
– CAEP staff completes a technical edit of the final report
– The EPP receives feedback on all submitted assessments by March 1 for the fall cycle and September 1 for the spring cycle
Steps for Submission of Three-Year-Out Review

– Step 1: Three years before the due date of the self-study, the EPP requests a shell for submission of assessments
– Step 2: EPPs identify on a chart the proprietary assessments to be submitted as evidence, completing a checklist of which proprietary assessments provide evidence for which CAEP Standards
– Step 3: EPPs attach/upload assessments (identified by number) to the shell; each assessment is uploaded in the form in which it is used by the EPP
Steps for Submission of Three-Year-Out Review (cont.)

– Step 4: In the space provided in the shell, EPPs answer questions on the development of each assessment, describe how validity was established, and describe the process by which reliability was established (or a plan for establishing it)
– Submissions are due by October 1 for the fall cycle and April 1 for the spring cycle
Evidence for Standards

– Most candidate data from assessments will be submitted as evidence for Standard 1
– Validity and reliability evidence will be submitted as evidence for Standard 5 (Quality Assurance)
– Feedback will be used by EPPs to improve or modify assessments
– Feedback will be used by EPPs to improve or modify validity and reliability processes
– A member of the Three-Year-Out review team will serve on the site visit team