Eastern’s Assessment System
Eastern Connecticut State University
NCATE Legacy Visit, 2016
Overview
A. Assessment System
B. Data Collection, Analysis, and Evaluation
C. Use of Data for Program Improvement
A. Assessment System
1. Assessments
2. Use of Data
3. Unit
1a. Assessments
Three transition points: Admission, Midpoint, and Exit
Post-graduation: Alumni Survey

Program  | Admission                   | Midpoint                   | Exit
Initial  | CARE entry data             | Clinical & Core portfolios | Student teaching & end-of-program survey
Advanced | Professional recommendation | Clinical evaluations       | Capstone portfolio

Multiple assessments at multiple points
1b. Assessments
Assessments are aligned to:
Conceptual Framework (which is, in turn, aligned to the NCATE standards)
State standards (CCCT: Connecticut Common Core of Teaching)
Professional standards
1c. Assessments
Assessment procedures:
Are shared with candidates and other professional partners
Include assessment guidelines and scoring rubrics with performance indicators for various levels
Include various appropriate technologies for compilation and analyses: TK20, SelectSurvey/QuestionPro, Excel, SPSS
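The slide names the tools but not the workflow, so the following is a minimal illustrative sketch of the kind of compilation these tools support, assuming rubric scores are exported from TK20 as a CSV. The file name and column names (candidate_id, transition_point, indicator, score) are hypothetical, not Eastern's actual schema.

```python
# Minimal sketch (assumption): summarize rubric scores exported from TK20.
# File name and columns below are hypothetical, not Eastern's actual schema.
import pandas as pd

scores = pd.read_csv("tk20_rubric_export.csv")  # hypothetical export

# Mean score per performance indicator at each transition point.
summary = (
    scores.groupby(["transition_point", "indicator"])["score"]
          .agg(["count", "mean"])
          .round(2)
)
print(summary)
```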
A. Assessment System
1. Assessments
2. Use of Data
3. Unit
2a. Use of Data: Purpose
To determine candidate competencies and to evaluate progress
To evaluate our assessment system: TK20, data analyses, data reporting
To examine the validity and utility of data
To evaluate program capacity and effectiveness
To evaluate the capacity and effectiveness of the assessment system
2b. Use of Data: Time
CARE committee meetings
Candidate performance discussions with instructors, supervisors, and/or cooperating teachers
Assessment meetings (if applicable)
SPA program review
Unit-wide retreats
A. Assessment System
1. Assessments
2. Use of Data
3. Unit
3a. Unit: Who
Candidates
Faculty
Supervisors
Cooperating teachers
Students
District administrators
3b. Unit: Involvement
Candidates: during clinical evaluations; course evaluations; *focus group discussions; *end-of-program survey; *alumni survey
Faculty: *assessment meetings; *CARE meetings; *retreats; *other discussions
Validity and reliability: the unit conducts studies to establish fairness, accuracy, and consistency (sample feedback: “Scale should be 5, not 3”)
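The slide does not show how the fairness, accuracy, and consistency studies are run. As one hedged illustration, a common consistency check when two raters double-score the same work is Cohen’s kappa; the ratings below are invented for the example and are not Eastern’s data.

```python
# Minimal sketch (assumption): one way a unit might check scoring consistency
# when two raters double-score the same portfolios on a 5-level rubric.
from collections import Counter

def cohen_kappa(r1, r2):
    """Cohen's kappa for two raters' scores on the same items."""
    n = len(r1)
    observed = sum(a == b for a, b in zip(r1, r2)) / n
    c1, c2 = Counter(r1), Counter(r2)
    expected = sum(c1[k] * c2[k] for k in c1) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical double-scored ratings on a 1-5 scale (invented for illustration).
rater_a = [5, 4, 4, 3, 5, 2, 4, 3]
rater_b = [5, 4, 3, 3, 5, 2, 4, 4]
print(f"kappa = {cohen_kappa(rater_a, rater_b):.2f}")
```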
3b. Unit: Involvement
Supervisors: *monthly supervisors’ meetings; *other discussions
Cooperating teachers: triad and *dyad (supervisor and cooperating teacher) discussions; *discussions with OECE; *Professional Learning Council
PLC: validates assessments; scores the Core I portfolio; offers other feedback
3b. Unit: Involvement
Students: candidates’ planning; candidates’ instruction; *candidates’ assessment of student learning
District administrators: *employer surveys; other discussions
Overview
A. Assessment System
B. Data Collection, Analysis, and Evaluation
C. Use of Data for Program Improvement
B1. Data Collection, Analysis, and Evaluation
Data come from: course instructors, supervisors, cooperating teachers, and external agencies
Data are stored in: TK20 (transitioning from Excel to TK20)
Data analysis: every semester
Data evaluation: program discussions, candidate evaluations, and the three unit-wide retreats
B2. Data: Program & Unit-wide
How effective are our programs?
How efficient are our unit operations?
How well do our candidates perform?
B3. System for Candidate Complaints & Resolutions
Denials and dismissals documented in CARE folders, along with subsequent resolutions
Professional development plans to support candidates’ improvement in weaker areas (documented and stored in CARE folders and the graduate office)
Test codes in Banner to document candidates who are in good standing, on probation, dismissed, denied, withdrawn, or in other categories
GPA and course grades monitored at the beginning and end of each semester by CARE through Banner reports
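The slide describes the monitoring process but not its mechanics. Purely as an illustration, a semester GPA check over a Banner report export might look like the sketch below; the file name, columns, and cutoff are hypothetical, not Eastern’s published requirements.

```python
# Minimal sketch (assumption): flag candidates whose GPA falls below a
# threshold in a hypothetical Banner report export. Test codes and the
# actual report format are not shown on the slide.
import pandas as pd

MIN_GPA = 3.0  # hypothetical cutoff, not Eastern's published requirement

report = pd.read_csv("banner_gpa_report.csv")  # hypothetical export
flagged = report[report["gpa"] < MIN_GPA]
print(flagged[["candidate_id", "gpa"]])
```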
Overview
A. Assessment System
B. Data Collection, Analysis, and Evaluation
C. Use of Data for Program Improvement
C1. Use of Data for Program Improvement (implemented)
Data from student teaching evaluations: re-examination of our focus on differentiated instruction, designing academic and behavioral interventions, and using data to plan interventions for student development
Reliability data from CARE interviews: led to revision of the CARE interview rubric and interview questions
CARE entry GPA analyses: resulted in a phase-in plan for CSUC candidates
Data from licensure test scores: focused collaborations and support sought from content-area faculty and programs
C2. Use of Data for Program Improvement (planned)
Data on program vs. unit-wide assessments: established the need for strong unit-wide assessments that address all elements of Standard 1, hence the current work on the design and initiation of the Core I, II, and III portfolios
Formative data on the Core I portfolio: further established the need for the Professional Learning Council to both validate and support implementation
C3. Use of Data for Avoidance of Bias
CARE interview data, particularly on the question related to content knowledge: led to revision of and greater clarity in the CARE interview questions
Student teaching evaluation data: led to clarification and expansion of specific performance expectations so that they are appropriate across age groups
Lessons Learned and Goals: What’s Next?
Lessons learned:
Advanced program: need more students to generate more data to determine program effectiveness
Professional community: P-12 partners need to be more involved in the design of assessments

Goals:
Advanced program: redesign the advanced program with enhanced content, flexibility, and tracks
Professional community: develop a Professional Learning Council, with annual members from the P-12 system and focused goals
Questions?