Deconstructing Standard 2b
Sharon Valente
Savannah College of Art and Design
May 14, 2012

2b: Data Collection, Analysis, and Evaluation

- Regular and systematic data collection, analysis, and evaluation for the purposes of improving candidate performance, program quality, and unit operations
- Multiple assessments from internal and external sources at transition points
- Data are summarized, analyzed, and reported by discipline, and are disaggregated for alternate route, off-campus, and distance learning programs

Key Criteria for Meeting the Expectations of Element 2b

- System uses multiple assessments from internal and external sources
- System is maintained and data are collected regularly
- Data are regularly and systematically compiled, aggregated, summarized, and analyzed

Key Criteria for Meeting the Expectations of Element 2b (continued)

- Maintained using information technology
- Disaggregated by discipline and for alternate route, off-campus, and distance learning programs (see the sketch below)
- Records of formal complaints are maintained and resolutions are documented
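Compiling, aggregating, summarizing, and disaggregating assessment data is, at bottom, a grouping operation over candidate records. As a purely illustrative sketch (the disciplines, delivery modes, and scores below are hypothetical, not drawn from any actual unit's data), the following Python fragment shows one way disaggregation by discipline and program type might look:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical assessment records: (discipline, delivery mode, rubric score).
records = [
    ("Art Education", "on-campus", 3.4),
    ("Art Education", "distance learning", 3.1),
    ("Physical Education", "off-campus", 2.9),
    ("Physical Education", "on-campus", 3.6),
]

# Compile and aggregate: group scores by discipline and delivery mode.
groups = defaultdict(list)
for discipline, mode, score in records:
    groups[(discipline, mode)].append(score)

# Summarize: report the count and mean score for each disaggregated group.
for (discipline, mode), scores in sorted(groups.items()):
    print(f"{discipline} ({mode}): n={len(scores)}, mean={mean(scores):.2f}")
```

In practice a unit would run the same grouping inside its assessment system or spreadsheet software; the point is only that disaggregation falls out of how the records are keyed.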

Sub-elements of Standard 2b (1)

The professional education unit maintains an assessment system that provides regular and comprehensive information on applicant qualifications, candidate proficiencies, competence of graduates, professional education unit operations, and preparation program quality.

1 AFI (Area for Improvement) cited

Sub-elements of Standard 2b (2)

Using multiple assessments from internal and external sources, the professional education unit collects data from applicants, candidates, recent graduates, faculty, and other members of the professional community.

0 AFIs cited!

Sub-elements of Standard 2b (3)

Candidate assessment data are regularly and systematically collected, compiled, aggregated, summarized, and analyzed to improve candidate performance, preparation program quality, and professional education unit operations.

15 AFIs cited

Sub-elements of Standard 2b (4)

The professional education unit disaggregates candidate assessment data when candidates are in alternate route, off-campus, and distance learning programs.

0 AFIs cited!

Sub-elements of Standard 2b (5)

The professional education unit maintains records of formal candidate complaints and documentation of their resolution.

5 AFIs cited

Sub-elements of Standard 2b (6)

The professional education unit maintains its assessment system through the use of information technologies appropriate to the size of the professional education unit and institution/agency.

9 AFIs cited

Scenarios

DISCLAIMER: The information presented does not represent any one institution, nor does it purport to be the ideal format and discussion for 2b. Any errors found are those of the authors and should not be attributed to the Georgia Professional Standards Commission (GaPSC). The items in blue and/or italics represent hypertext links to further data. The reader may assume that these reports provide the suggested data. For purposes of the present exercise, the reader may also assume that similar data are presented for all teacher preparation programs at the given institution.

Scenario One

Assessment Description: This assessment consists of a cumulative G.P.A. in 12 required courses that address content knowledge for Physical Education teacher candidates. Teacher candidates must maintain a 3.0 (out of 4.0) cumulative G.P.A., with no grade below a C-, to remain in the program and to enter practicum and clinical practice. The candidate's major adviser records and tracks these requirements throughout the program. Scoring guides developed with the department faculty define the course letter grades from A to F; separate guides were developed for foundation courses and for pedagogy and content courses.

Data Analysis: Along with the guides, the mean G.P.A. data for all 12 required courses are presented. The overall G.P.A. mean of 3.35 is above the required departmental mean of 3.0. However, six candidates scored lower; these candidates were required to retake those courses.
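To make the retention rule concrete: the cumulative G.P.A. is the mean of the grade points earned in the 12 required courses, and eligibility combines two checks, the 3.0 cumulative mean and the no-grade-below-C- floor. A minimal sketch, assuming a conventional 4.0 scale on which a C- carries 1.7 grade points (the grades themselves are invented for illustration):

```python
# Hypothetical grade points for one candidate's 12 required courses (4.0 scale).
grade_points = [4.0, 3.7, 3.3, 3.0, 3.7, 4.0, 3.3, 3.0, 2.7, 3.7, 3.3, 3.0]

C_MINUS = 1.7  # assumed conventional value for a C- on a 4.0 scale

cumulative_gpa = sum(grade_points) / len(grade_points)

# Both conditions must hold to remain in the program and to enter
# practicum and clinical practice.
meets_gpa_floor = cumulative_gpa >= 3.0
no_grade_below_c_minus = all(g >= C_MINUS for g in grade_points)

print(f"Cumulative GPA: {cumulative_gpa:.2f}")
print(f"Eligible: {meets_gpa_floor and no_grade_below_c_minus}")
```

Running the same check across a cohort is how the six below-threshold candidates mentioned in the data analysis would be flagged for course retakes.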

Scenario One

Evaluation: It is apparent that our teacher candidates score higher in the Pedagogical/Content courses (mean of 3.48) than in the Foundation courses. Many candidates excel in, and are more comfortable with, the "activity" courses. There is also an inherent difference in learning domains between the two areas: Foundation courses focus on the cognitive domain, while the pedagogy/content courses also include the psychomotor and affective domains. In an effort to address this discrepancy, instructors will continue to meet with students individually or in groups to provide additional instruction when needed.

Unacceptable, Acceptable, or Target?

Scenario Two

Assessment Description: The Clinical Practice Teacher Candidate Performance Evaluation (CPTCPE) comprises two rubrics: the Clinical Practice Teacher Candidate Performance Evaluation Rubric and the NASPE Addendum. The university supervisor uses the CPTCPE to provide assessment feedback to candidates at least four times over the course of each quarter. The candidate's performance is formally assessed at midterm (end of the first placement) and at the end of clinical practice (end of the second placement), immediately prior to program completion. At those times, based on observations and consultation with clinical teachers, university supervisors complete an evaluation instrument for each teacher candidate. To pass Clinical Practice, candidates must demonstrate, or provide evidence to the university supervisor and cooperating teacher, that they meet or exceed expectations on all 52 elements of the final CPTCPE.

Scenario Two

Data Analysis: The data were collected from the Spring 2009 cohort of teacher candidates in Clinical Practice. The data indicate a slight improvement trend from midterm (1.25) to final (1.28). The lowest mean scores were for element 5.1 (creates a learning community in which individual differences are respected) and element 6.1 (applies knowledge of students' abilities/disabilities to positively impact student learning), although all candidates scored "Acceptable."
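The midterm-to-final comparison in this analysis is a per-element mean computed twice. A small sketch, with invented scores (only element IDs 5.1 and 6.1 come from the slide; the score values, element 7.2, and the 0-2 scale are assumptions for illustration):

```python
from statistics import mean

# Hypothetical CPTCPE element scores for a small cohort,
# on an assumed 0-2 scale (0 = Unacceptable, 1 = Acceptable, 2 = Target).
midterm = {"5.1": [1, 1, 1, 2], "6.1": [1, 1, 1, 1], "7.2": [1, 2, 1, 2]}
final   = {"5.1": [1, 1, 2, 2], "6.1": [1, 1, 1, 2], "7.2": [2, 2, 1, 2]}

# Compare each element's mean at midterm and at final to spot trends
# and to identify the lowest-scoring elements.
for element in sorted(midterm):
    m, f = mean(midterm[element]), mean(final[element])
    trend = "up" if f > m else ("down" if f < m else "flat")
    print(f"Element {element}: midterm {m:.2f} -> final {f:.2f} ({trend})")

lowest = min(final, key=lambda e: mean(final[e]))
print(f"Lowest final mean: element {lowest}")
```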

Scenario Two

Evaluation: The overall change in mean scores from midterm to final is small but positive, which seems to indicate growth over time. It is surmised that adjustments to the new NASPE Addendum may have caused some of the discrepancies and negative trends in the data. To address this issue, we have begun rewriting our Clinical Practice Primer for supervisors; the revised document will be available for the onsite visit. At present, clinical practice supervisors do not grade candidates' impact papers, in which candidates reflect on their teaching and on student learning and describe decisions based on those reflections. This appears to have affected candidates' scores in this assessment. In the future, impact papers will be part of this assessment.

Unacceptable, Acceptable, or Target?

Scenario Three

Student Complaint Policy: The institution has established policies and procedures for resolving student complaints. These policies are outlined in the Student Complaint Policy in the Student Handbook and the Catalog. Students are encouraged to first attempt to resolve complaints informally by contacting the office responsible for the relevant area: for example, students are referred to the residence hall director for housing complaints, to the professor or academic chair for academic complaints, and to the bursar's office for financial concerns. The institution ensures that these policies are published each academic year in the Student Handbook and the Catalog. The Catalog is mailed to all applicants, an updated copy is sent to all incoming students, and it is also published on the institution's public website. Thus, the policies are disseminated to all students.

Scenario Three

Additional Complaint Mechanisms: The Office of the Student Ombudsman exists solely to serve students. The student ombudsman is a designated neutral party who does not advocate any particular point of view. As an impartial complaint-handler, the student ombudsman strives to see that people are treated fairly and equitably. A student may consult with the student ombudsman on a variety of matters and concerns, including, but not limited to, academics, judicial concerns, and unethical behavior.

Scenario Three

Ensuring Policies Are Followed: To ensure that the published policies are followed when resolving student complaints, all written non-academic student complaints, other than those concerning FERPA and/or the ADA, are routed through the Office of Student Success. The assistant to the dean and vice president for student success maintains a log of student complaints and the corresponding outcomes. This log will be available for the onsite visit.

Unacceptable, Acceptable, or Target?
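Whatever rating the reader assigns, the complaint log this scenario describes is structurally simple: an append-only record of each formal complaint and its resolution. A hypothetical sketch (the field names and CSV format are illustrative assumptions, not an actual institutional schema; a spreadsheet or database would serve equally well):

```python
import csv
from datetime import date

# Hypothetical complaint-log fields, invented for illustration.
FIELDS = ["date_filed", "category", "summary", "routed_to",
          "resolution", "date_resolved"]

def log_complaint(path, entry):
    """Append one complaint record, writing the header for a new log file."""
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if f.tell() == 0:  # empty file: start with the header row
            writer.writeheader()
        writer.writerow(entry)

log_complaint("complaint_log.csv", {
    "date_filed": date(2012, 2, 1).isoformat(),
    "category": "non-academic",
    "summary": "Housing concern",
    "routed_to": "Office of Student Success",
    "resolution": "Referred to residence hall director; resolved",
    "date_resolved": date(2012, 2, 10).isoformat(),
})
```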

Rubric for Evaluating Evidence

Unacceptable: The unit does not use multiple assessments from internal and external sources to collect data on applicant qualifications, candidate proficiencies, graduates, unit operations, and program quality.

Acceptable: Using multiple assessments from internal and external sources, the unit collects data from applicants, candidates, recent graduates, faculty, and other members of the professional community.

Target: Assessment data from candidates, graduates, faculty, and other members of the professional community are based on multiple assessments from both internal and external sources that are systematically collected as candidates progress through programs.

Rubric for Evaluating Evidence (continued)

Unacceptable: The unit does not regularly and comprehensively gather, aggregate, summarize, and analyze assessment and evaluation information on the unit's operations, its programs, or candidates.

Acceptable: The unit maintains an assessment system that provides regular and comprehensive information on applicant qualifications, candidate proficiencies, competence of graduates, unit operations, and program quality.

Target: The unit's assessment system provides regular and comprehensive data on program quality, unit operations, and candidate performance at each stage of its programs, extending into the first years of completers' practice.

Rubric for Evaluating Evidence (continued)

Disaggregation of data:
Unacceptable: The unit cannot disaggregate candidate assessment data when candidates are in alternate route, off-campus, and distance learning programs.
Acceptable: The unit disaggregates candidate assessment data when candidates are in alternate route, off-campus, and distance learning programs.
Target: These data are disaggregated by program when candidates are in alternate route, off-campus, and distance learning programs.

Records of complaints:
Unacceptable: The unit does not maintain a record of formal candidate complaints or document the resolution of complaints.
Acceptable: The unit maintains records of formal candidate complaints and documentation of their resolution.
Target: The unit has a system for effectively maintaining records of formal candidate complaints and their resolution.

Rubric for Evaluating Evidence (continued)

Compilation and analysis of data:
Unacceptable: When candidate assessment data are collected, the data are not compiled, aggregated, summarized, or analyzed.
Acceptable: Candidate assessment data are regularly and systematically collected, compiled, aggregated, summarized, and analyzed to improve candidate performance, program quality, and unit operations.
Target: These data are regularly and systematically compiled, aggregated, summarized, analyzed, and reported publicly for the purpose of improving candidate performance, program quality, and unit operations.

Use of information technology:
Unacceptable: The technology used to support the unit's assessment system is used sporadically.
Acceptable: The unit maintains its assessment system through the use of information technologies appropriate to the size of the unit and institution.
Target: The unit is developing and testing different information technologies to improve its assessment system.

Thank You!