Data Conventions and Analysis: Focus on the CAEP Self-Study

Data Conventions and Analysis: Focus on the CAEP Self-Study
CAEP Conference, Washington Hilton, Washington, D.C., September 30, 2016
Presented by Deborah Eldridge, CAEP Consultant (LCVinc1@gmail.com)

Goal and Objectives
Goal: To provide updated information on presenting data and its analysis in the CAEP self-study.
Objectives: Participants will be able to:
- Identify the CAEP expectations for data presentation,
- Describe features of data tables, and
- Explain key aspects of data analysis expected at the component level in the self-study.

Data Chart Conventions: Part I
- Data are disaggregated by licensure area/program (program vs. licensure area?)
- At least one comparison point is available for analysis
- Data charts are clearly labeled, both in the evidence and in the self-study; titles and chart content must align
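
The sketch below is one way, not a CAEP requirement, that an EPP might build a chart meeting these conventions from a raw assessment export; the use of pandas and the column names (licensure_area, term, score) are assumptions for illustration only.

```python
import pandas as pd

# Hypothetical clinical-observation export: one row per candidate rating.
# Column names (licensure_area, term, score) are illustrative assumptions.
ratings = pd.DataFrame({
    "licensure_area": ["Elementary (1-6)", "Elementary (1-6)", "English (7-12)", "English (7-12)"],
    "term":           ["Spring 2015", "Spring 2015", "Spring 2015", "Spring 2015"],
    "score":          [3.2, 3.6, 3.0, 3.4],
})

# Disaggregate by licensure area/program for each data cycle (term).
by_program = ratings.groupby(["licensure_area", "term"])["score"].agg(N="count", Mean="mean")

# Keep the EPP-wide figures as the comparison point alongside each program.
epp_wide = ratings.groupby("term")["score"].agg(N="count", Mean="mean")

print(by_program.round(2))
print(epp_wide.round(2))
```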

Sample chart: 2014-2015 Candidate Proficiency Data on InTASC Categories from Clinical Observation Instrument, Disaggregated by Program/Licensure Area
- Provides a comparison point (the EPP mean)
- Informs each InTASC category
- Disaggregated by licensure area/program

| InTASC Category | Observation Item | Group | Spr-2014 | Fall-2014 | Spr-2015 |
|---|---|---|---|---|---|
| Learner and learning | Develops learning experiences appropriate for the grade level and connected to standards | EPP mean (all programs) | N = 149, M = 3.2 | N = 143, M = 3.4 | N = 123 |
| | | Elementary Education (1-6) | N = 93, M = 3.1 | N = 84, M = 3.6 | N = 65 |
| | | English Education (7-12) | N = 16, M = 3.3 | N = 17 | N = 11 |
| Instructional practice | Uses discussion strategies to promote high-level thinking | EPP mean (all programs) | N = 126, M = 3.5 | N = 93, M = 2.9 | M = 3.7 |
| Professional responsibility | Participates in professional development | EPP mean (all programs) | M = 3.0 | M = 2.7 | |

Data Chart Conventions: Part II
- Include the "N" for the data set, broken out by year or semester
- Low-enrollment programs (under 10 graduates over three years) can aggregate data by licensure area for three cycles
- The data requirement is for three cycles of data
  - A cycle is one independent collection of data using the assessment
  - A cycle could be as long as three years (e.g., small programs that offer a course or clinical experience just once a year); some data are required for a three-year period (state licensure test scores)
  - A cycle could be as short as three semesters (courses or clinical experiences offered each semester)
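
As a rough sketch of the N and low-enrollment conventions, the hypothetical pandas example below reports N and mean per cycle for larger programs and aggregates any program with fewer than 10 completers across the three cycles into a single row; the column names and data values are assumptions, not prescribed by CAEP.

```python
import pandas as pd

# Hypothetical assessment file; column names and values are illustrative assumptions.
df = pd.DataFrame({
    "licensure_area": ["History"] * 30 + ["Physics"] * 4,
    "cycle":          (["Spring 2015", "Fall 2015", "Spring 2016"] * 10)
                      + ["Spring 2015", "Fall 2015", "Spring 2016", "Spring 2016"],
    "score":          [3.1] * 30 + [3.4, 3.6, 3.2, 3.5],
})

totals = df.groupby("licensure_area")["score"].count()
low_enrollment = totals[totals < 10].index      # fewer than 10 completers over the 3 cycles

# Larger programs: report N and mean for each data cycle.
per_cycle = (df[~df["licensure_area"].isin(low_enrollment)]
             .groupby(["licensure_area", "cycle"])["score"]
             .agg(N="count", Mean="mean"))

# Low-enrollment programs: aggregate the three cycles into one row.
aggregated = (df[df["licensure_area"].isin(low_enrollment)]
              .groupby("licensure_area")["score"]
              .agg(N="count", Mean="mean"))

print(per_cycle.round(2))
print(aggregated.round(2))
```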

Sample chart: 2014-2015 Candidate Proficiency Data on InTASC Categories from Clinical Observation Instrument, Disaggregated by Program/Licensure Area
- Elementary Education (1-6): three-year data cycle, reported by term
- Science Education: Physics (7-12): low-enrollment program, aggregated over the three-year data cycle

| InTASC Category | Observation Item | Group | Spr-2014 | Fall-2014 | Spr-2015 |
|---|---|---|---|---|---|
| Learner and learning | Develops learning experiences appropriate for the grade level and connected to standards | EPP mean (all programs) | N = 149, M = 3.2 | N = 143, M = 3.4 | N = 126 |
| | | Elementary Education (1-6) | N = 93, M = 3.1 | N = 84, M = 3.6 | N = 65 |
| | | Science Education: Physics (7-12), aggregated | N = 4, M = 3.6 | | |
| Instructional practice | Uses discussion strategies to promote high-level thinking | EPP mean (all programs) | M = 3.5 | M = 2.9 | M = 3.7 |
| | | Science Education: Physics (7-12), aggregated | N = 4, M = 3.4 | | |

Data Chart Conventions: Part III
- If grades are used as evidence of content knowledge, the mean for the class should be reported along with the mean for education majors in the same class.
- If the data chart reports a mean score, the range should also be reported.
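
A minimal sketch of Part III, assuming a hypothetical course-grade file with columns course, is_candidate, and gpa: it reports the education candidates' mean next to the mean for other students and for the whole class, and attaches a range to every mean.

```python
import pandas as pd

# Hypothetical course-grade export; column names and values are illustrative assumptions.
grades = pd.DataFrame({
    "course":       ["HIST 301"] * 6,
    "is_candidate": [True, True, False, False, False, False],
    "gpa":          [3.2, 3.0, 2.4, 2.9, 3.5, 2.7],
})

# Candidate mean vs. non-candidate mean, each with N and range.
summary = (grades.groupby("is_candidate")["gpa"]
           .agg(N="count", Mean="mean", Min="min", Max="max"))
summary["Range"] = summary["Min"].round(2).astype(str) + "-" + summary["Max"].round(2).astype(str)

# Whole-class figures for the same course, as the comparison point.
whole_class = grades["gpa"].agg(["count", "mean", "min", "max"])

print(summary[["N", "Mean", "Range"]].round(2))
print(whole_class.round(2))
```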

Sample GPA Chart: 2013-2015 Candidate and Non-Candidate Content Area GPA, Disaggregated by Program/Licensure Area
- N for the data set is broken out by year (or semester)
- Comparison point is non-teacher candidates
- The range is reported along with each mean
- Low-enrollment program data are aggregated for three cycles

| Program/Licensure Area | Group | Spring 2015 | Fall 2015 | Spring 2016 |
|---|---|---|---|---|
| History | Teacher candidates | N = 15, M = 3.0, Range: 2.8-3.2 | N = 12, M = 3.1, Range: 2.7-3.5 | N = 17, M = 3.2, Range: 2.3-3.5 |
| | Non-teacher candidates | N = 127, M = 2.8, Range: 2.0-3.2 | N = 99, M = 2.6, Range: 2.1-3.5 | N = 145, M = 2.9, Range: 2.0-3.4 |
| Mathematics | Teacher candidates | N = 11, Range: 2.2-3.2 | N = 9, Range: 2.1-3.2 | Range: 2.5-3.2 |
| | Non-teacher candidates | N = 24, M = 3.3 | N = 22, M = 3.3, Range: 2.8-3.3 | N = 28, M = 3.6, Range: 2.3-3.9 |
| Physics | Teacher candidates (aggregated for 3 years) | N = 4, M = 3.3, Range: 3.1-3.6 | | |
| | Non-teacher candidates | Range: 2.9-3.4 | Range: 2.7-3.9 | |
| Art | Teacher candidates (aggregated for 3 years) | N = 5, M = 3.5, Range: 3.3-3.9 | | |
| | Non-teacher candidates | N = 14, M = 3.2, Range: 2.5-3.4 | N = 18, Range: 2.6-3.7 | |

Data Chart Conventions: Part IV
If the data chart reports a percentage, the percentage should be reported for each level:
- L-1 = Emerging
- L-2 = Developing
- L-3 = Proficient
- L-4 = Exemplary

| InTASC Category | Observation Item | Group | Spr-2015 | Fall-2015 | Spr-2016 |
|---|---|---|---|---|---|
| Learner and learning | Develops learning experiences appropriate for the grade level and connected to standards | EPP (all programs) | N = 149: L-1: 4%, L-2: 3%, L-3: 87%, L-4: 6% | N = 143: L-1: 3%, L-2: 4%, L-3: 89%, L-4: 4% | N = 123: L-1: 5%, L-2: 6%, L-3: 82%, L-4: 7% |
| | | Elementary Education (1-6) | N = 93: L-1: 1%, L-2: 0%, L-3: 96%, L-4: 3% | N = 84: L-1: 0%, L-2: 0%, L-3: 98%, L-4: 2% | N = 65: L-3: 97%, L-4: 3% |
| | | Science Education: Physics (7-12), aggregated | N = 4: L-3: 25%, L-4: 75% | | |
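
Where a chart reports percentages, something like the hypothetical pandas sketch below could generate the percentage of candidates at each rubric level per program and cycle; the column names and the 1-4 level coding are illustrative assumptions.

```python
import pandas as pd

# Hypothetical rubric-level ratings (1 = Emerging ... 4 = Exemplary);
# column names and values are illustrative assumptions, not a CAEP-prescribed format.
ratings = pd.DataFrame({
    "licensure_area": ["Elementary (1-6)"] * 8 + ["Physics (7-12)"] * 4,
    "cycle":          ["Spring 2016"] * 12,
    "level":          [3, 3, 3, 4, 3, 3, 3, 3, 3, 4, 4, 4],
})

# Percentage of candidates at each level, per program and data cycle.
pct = (ratings.groupby(["licensure_area", "cycle"])["level"]
       .value_counts(normalize=True)
       .mul(100)
       .round(0)
       .rename("percent")
       .reset_index())

print(pct)
```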

All Components in Standard 1: Data Chart Conventions for Proprietary Assessments
State licensure score data should have comparison points, such as:
- The benchmark score for minimal competency required by the state
- Possibly other points, such as the national median score for the content area
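
One possible way to lay those comparison points alongside the EPP's own scores is sketched below; every value, including the qualifying score and the national median, is a placeholder, and in practice those figures would come from the state and the test vendor.

```python
import pandas as pd

# Hypothetical licensure-test summary; all numbers are placeholders.
scores = pd.DataFrame({
    "year": ["2013-2014", "2014-2015", "2015-2016"],
    "N":    [31, 35, 28],
    "mean": [168, 172, 170],
    "min":  [152, 156, 154],
    "max":  [183, 186, 181],
})
state_qualifying_score = 160   # state's minimum passing score (assumed)
national_median = 177          # national median for the content area (assumed)

# Report the range of actual candidate scores and the distance from each comparison point.
scores["range"] = scores["min"].astype(str) + "-" + scores["max"].astype(str)
scores["above_state_cut"] = scores["mean"] - state_qualifying_score
scores["vs_national_median"] = scores["mean"] - national_median

print(scores[["year", "N", "mean", "range", "above_state_cut", "vs_national_median"]])
```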

Sample chart: State Licensure Test Score Data with Comparison Points (sub-tests listed for Elementary Education)
Comparison points shown: the national median (national comparison), the state's minimum passing (qualifying) score, and the range of actual EPP candidates' scores.

| Program (sub-test) | Academic Year | Number of Students | Qualifying Score | Mean | National Median | Range | EPP % of Candidates Passing |
|---|---|---|---|---|---|---|---|
| Early Childhood | 2011-2012 | N = 35 | 160 | 172 | 177 | 152-186 | 100% |
| | 2012-2013 | N = 33 | | 169 | 176 | 158-172 | |
| | 2013-2014 | N = 31 | | 168 | | 152-183 | |
| Elementary Education: Reading and Language Arts | 2011-2012 | N = 22 | 157 | 165 | No data | 153-174 | |
| | 2012-2013 | N = 27 | | | | 157-172 | |
| | 2013-2014 | N = 25 | | 162 | | 155-170 | |
| Elementary Education: Mathematics | 2011-2012 | | | | | 153-171 | |
| | 2012-2013 | | | 158 | | 150-162 | |
| Elementary Education: Social Studies | 2011-2012 | | | 155 | | 149-162 | |
| | 2012-2013 | | | 159 | | 146-169 | |
| Elementary Education: Science | 2011-2012 | | | 161 | | 149-168 | |
| | 2012-2013 | | | 164 | | 151-170 | |
| | 2013-2014 | | | 163 | | 155-169 | |

What will reviewers ask about data?
- Do the data measure what is intended to be measured?
- Do the data measure the preponderance of what is specified in the component? (Only the items specific to the component are cited as evidence.)
- Are EPP-created assessments evaluated at the sufficient level or above? (See the CAEP Assessment Rubric categories and the "sufficient" column for meeting expectations.)
- Does an audit check of the data indicate that the data are accurately recorded and reported? (A rough sketch of such a check follows below.)
- Are the data chart conventions used?
- Are data disaggregated by program/licensure area?
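
The audit-check question lends itself to a simple reconciliation: recompute each reported figure from the raw records and flag disagreements. The sketch below is a hypothetical pandas version; the data, column names, and 0.05 tolerance are all assumptions.

```python
import pandas as pd

# Hypothetical raw assessment records (one row per candidate rating).
raw = pd.DataFrame({
    "licensure_area": ["History", "History", "History", "Math", "Math"],
    "cycle":          ["Spring 2016"] * 5,
    "score":          [3.0, 3.2, 3.4, 2.8, 3.6],
})
# Figures as they appear in the self-study data chart (the Math mean is deliberately off).
reported = pd.DataFrame({
    "licensure_area": ["History", "Math"],
    "cycle":          ["Spring 2016", "Spring 2016"],
    "N":              [3, 2],
    "Mean":           [3.2, 3.4],
})

# Recompute N and mean from the raw records and compare with the reported values.
recomputed = (raw.groupby(["licensure_area", "cycle"])["score"]
              .agg(N="count", Mean="mean")
              .reset_index())
audit = reported.merge(recomputed, on=["licensure_area", "cycle"],
                       suffixes=("_reported", "_recomputed"))
mismatch = ((audit["N_reported"] != audit["N_recomputed"])
            | ((audit["Mean_reported"] - audit["Mean_recomputed"]).abs() > 0.05))

print(audit[mismatch])   # rows a site visitor would question
```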

Feedback and Question Pause

Data Analysis Categories and Questions
Analyze the data against the comparison point:
- Across programs
- Within programs
- Over 3 cycles
- By demographics
Questions to ask about the data:
- Strengths?
- Areas of challenge?
- Outliers?
- Trends?
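
One rough way to operationalize the over-3-cycles and across-programs questions is sketched below; the data, column names, and the one-standard-deviation outlier rule are illustrative assumptions rather than CAEP criteria.

```python
import pandas as pd

# Hypothetical per-cycle program means; all values are placeholders.
means = pd.DataFrame({
    "licensure_area": ["Elementary", "Elementary", "Elementary",
                       "English", "English", "English",
                       "Physics", "Physics", "Physics"],
    "cycle_order":    [1, 2, 3, 1, 2, 3, 1, 2, 3],
    "mean_score":     [3.1, 3.3, 3.4, 3.3, 3.0, 2.7, 3.2, 3.3, 2.1],
})

# Trend over the three cycles within each program (positive = improving).
trend = (means.sort_values("cycle_order")
         .groupby("licensure_area")["mean_score"]
         .agg(first="first", last="last"))
trend["change"] = trend["last"] - trend["first"]

# Outliers across programs in the most recent cycle: more than 1 SD from the EPP mean.
latest = means[means["cycle_order"] == means["cycle_order"].max()]
epp_mean, epp_sd = latest["mean_score"].mean(), latest["mean_score"].std()
outliers = latest[(latest["mean_score"] - epp_mean).abs() > epp_sd]

print(trend.round(2))
print(outliers)
```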

When might Areas for Improvement (AFIs) be assigned?
- One or more of the four InTASC categories is not informed by EPP evidence, or data are not disaggregated for more than 20% of the candidates
- Only state licensure tests are provided as evidence
- Licensure test scores are not in the upper half of the national median/average, field by field (ETS Praxis), or in the upper half of the state median/average, field by field (Pearson)

When might Areas for Improvement (AFIs) be assigned? (continued)
- There is no significant analysis of the data
- Interpretations are not well grounded in the evidence provided
- EPP-created instruments have significant deficiencies
- Site visitors report inaccuracies in the reported data

When might a stipulation be assigned?
- Data are not disaggregated by licensure/program area
- There is no evidence of internal consideration of the data for improvement purposes
- No steps are taken to ensure data quality
- No efforts are made to ensure validity

Final Feedback and Question Pause