
1 Sustaining Preparation Program Evaluation with Data Systems: Georgia Example
Bob Michael, Associate Vice Chancellor, University System of Georgia
Carla Tanguay, Associate to the Dean for Clinical Practice, Georgia State University
Tatiana Rivadeneyra, Director of Accreditation, Council for the Accreditation of Educator Preparation

2 Goals for this Session
- To gain strategies for creating systems for data reporting, sharing, and use at the state and university levels.
- To acquire new ways of organizing university faculty and stakeholders to use data for continuous program improvement.
- To attain new ideas about data reporting, sharing, and use in various contexts to support continuous improvement.

3 In your current role, how do you use data to inform improvement?

4 Our vision for Educator Preparation Program (EPP) evaluation includes:
- well-defined performance measures and student outcomes linked to program completers;
- state policy established for public reporting and dissemination of all aggregated and disaggregated data;
- processes for reporting, sharing, and using data to support EPPs and other stakeholders in determining the degree to which their completers are prepared to educate diverse learners to be life-long learners, leading to increased workforce capacity;
- EPP evaluation protocols that support programs in identifying strengths and needs, with performance measures aligned with evidence-based practices reflected in state and national teacher and leader standards; and
- a feedback loop where EPP data is used to inform state policy.

5 Creating systems for data reporting, sharing, and use at the state level
Reporting Data: Preparation Program Effectiveness Measure (PPEM); PPEM Dashboard & Timeline
Sharing Data: multi-agency longitudinal data system; state agency data sharing agreement
Using Data: EPP Needs Assessment; Technical Assistance Modules; data conversations for program improvement

6 Preparation Program Effectiveness Measure (PPEM): Current Component Distribution
In-program Measures (50%): GACE content assessment scores; edTPA classroom performance assessment scores
Outcome Measures (50%): TAPS classroom observation scores from the first teaching year; surveys of inductee teachers and their employers from the first teaching year

7 Sample Data Set E
Column groups: In-program Measures and Outcome Measures; column weights shown: 30%, 20%, 10%.

| Provider | edTPA raw | edTPA index | GACE raw | GACE index | TAPS raw | TAPS index | Survey raw | Inductee Survey index | Employer Survey index | Total index |
|---|---|---|---|---|---|---|---|---|---|---|
| A | 2.62 | 4.8 | 224.2 | 1.7 | 19.1 | 16.0 | 3.29 | 7.9 | | 138.3 |
| B | 2.98 | 19.2 | 226.1 | 2.4 | 19.4 | 17.6 | 3.17 | 6.7 | | 152.7 |
| C | 2.87 | 14.8 | 254.2 | 13.7 | 19.9 | 22.1 | 2.95 | 4.5 | | 159.5 |
| D | 2.81 | 12.4 | 259.4 | 15.8 | 21.5 | | 3.08 | 5.8 | | 161.2 |
| E | 2.93 | 17.2 | 254.6 | 13.8 | 19.6 | 19.7 | 3.24 | 7.4 | | 165.6 |
| F | 3.09 | 23.6 | 258.1 | 15.2 | 19.5 | 18.9 | 3.03 | 5.3 | | 168.3 |
| G | 2.90 | | 265.8 | 18.3 | 20.4 | 25.4 | 3.07 | 5.7 | | 171.1 |
| H | 3.15 | 26.0 | 257.3 | 14.9 | 19.8 | 20.9 | 3.14 | 6.4 | | 174.6 |
| I | | | 261.8 | 16.7 | 20.2 | 23.9 | 3.13 | 6.3 | | 179.2 |
| J | 3.10 | 24.0 | 261.7 | | 20.1 | 23.2 | 3.32 | 8.2 | | 180.2 |
| K | | 29.2 | 256.9 | | | 16.1 | 3.39 | | | 183.2 |
| L | 3.47 | 38.8 | 250.6 | 12.2 | 21.7 | | 3.22 | 7.2 | | 187.1 |
| M | 3.16 | 26.4 | 270.1 | 20.0 | 20.3 | 25.0 | 3.30 | 8.0 | | 187.4 |
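The slides do not spell out the formula that turns component index scores into the total index. As a purely illustrative sketch of how such a weighted composite could be computed, the code below assumes hypothetical component weights based on the 50/50 in-program/outcome split and the 30%/20%/10% figures above; it is not GaPSC's actual formula, and the raw-to-index conversion is not modeled.

```python
# Hypothetical sketch of a PPEM-style weighted composite.
# The weights below are illustrative assumptions, not the official
# GaPSC formula: edTPA 30% + GACE 20% (in-program = 50%);
# TAPS 30% + inductee survey 10% + employer survey 10% (outcome = 50%).
WEIGHTS = {
    "edtpa": 0.30,
    "gace": 0.20,
    "taps": 0.30,
    "inductee_survey": 0.10,
    "employer_survey": 0.10,
}

def composite_index(scores: dict) -> float:
    """Weighted sum of component index scores (a common 0-100 scale is assumed)."""
    if set(scores) != set(WEIGHTS):
        raise ValueError("scores must cover exactly the weighted components")
    return sum(WEIGHTS[name] * value for name, value in scores.items())

# Example with made-up component index scores:
provider = {"edtpa": 60.0, "gace": 70.0, "taps": 55.0,
            "inductee_survey": 80.0, "employer_survey": 75.0}
total = composite_index(provider)  # 0.3*60 + 0.2*70 + 0.3*55 + 0.1*80 + 0.1*75 = 64.0
```

The point of the sketch is the structure: in-program and outcome measures each contribute half of the composite, so a provider cannot offset weak outcome evidence (observations, surveys) with strong in-program assessment scores alone.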

8 Creating systems for data reporting, sharing, and use at the university level
Reporting Data: GaPSC/USG/CAEP/Title II; TPMS; BANNER
Sharing Data: the LiveText Assessment System; unit-level data summaries (Professional Education Faculty Committee reports); program-level data summaries (strengths, needs, and goals)
Using Data: unit vs. program-level responsibilities/oversight for professional development; grassroots experiential learning and input; feedback loop for continuous improvement and evolution of policies/procedures

9 Professional Education Faculty (PEF) Infrastructure of Support
Unit Level Goals: program & course design; signature assignments; key assessments; program-level professional development and resources; teacher preparation reform
Grass Roots Experiential Learning and Input: clinical practice teachers, faculty, and students; edTPA liaisons; field supervisors
Unit Level Coordination Team: unit assessment coordinator; associate dean; associate to the dean for clinical practice; department edTPA coordinators
PEF Committees: Assessment & Accreditation (CAEP 1, 3, 4, 5); Clinical Partnerships & Induction (CAEP 2); edTPA Ad-hoc (CAEP 1, 4)
Program Level Goals: syllabi review & CEEDAR modules

10 Alignment with CAEP Standard 4: Program Impact
CAEP’s Standard 4 asks providers to demonstrate the impact of their completers on P-12 student learning and development, classroom instruction, and schools, as well as completers’ satisfaction with the relevance and effectiveness of their preparation. The provider establishes a suite of evidence/measures to be found, developed, accumulated, measured, analyzed, and interpreted as the basis for an EPP’s claim that it has effectively prepared teachers and other education professionals.

11 Guiding Questions
- What evidence/measures do you have that would demonstrate graduates’ impact, effectiveness, and satisfaction?
- What research methodologies could you feasibly employ to gain such information?

12 Ways to hit the mark… Component 4.1
- Direct measures of student learning and development
- Addresses diverse subjects and grades
- P-12 impact or growth data from state teacher evaluations (if available)

13 Ways to hit the mark…
If state data are available:
- Describe data sources and the model/formula
- Describe the EPP’s analysis and evaluation of the information
- Interpret the data and judge implications
If state data are not available:
- Teacher-linked student assessments from districts
- Classroom-based research (e.g., action research, case studies)
- Describe data sources and the model/formula
If validity cannot be credibly established for state sources, supplement with other valid evidence.

14 Ways to hit the mark… Component 4.2
Teaching Observations:
- Aligned to the 4 InTASC categories
- Aligned to state standards for teachers / local teacher evaluation framework
P-12 Student Surveys:
- Aligned to the InTASC categories
- Corroboration for observation/evaluation data

15 Ways to hit the mark… Components 4.3 and 4.4
Component 4.3: Employer Surveys
- Aligned to the InTASC categories
- Corroboration for observation/evaluation data
Component 4.4: Completer Surveys
- Aligned to the InTASC categories
- Aligned to state standards for teachers / local teacher evaluation framework
- Triangulate with observation/evaluation, survey, and impact data

16 Considering Your State Contexts
State Systems:
- What are some state systems already in place for data reporting, sharing, and use?
- How might those systems be modified, or new systems created, to support EPP faculty & stakeholders?
EPP Engagement:
- How do you engage faculty & stakeholders to use data for program improvement?
- How do you report, share, and use data at the EPP level?
CAEP Alignment:
- What new ideas support your efforts in sustaining a process for continuous improvement?
- What strategies support a process of continuous improvement as aligned to CAEP Standard 4?

17 Closing Connections
- What will be important to share with my team?
- Who should I connect with later? When?
- What should I follow up on?

