ASSESSMENT SYSTEMS FOR TSPC ACCREDITATION
Assessment and Work Sample Conference
January 13, 2012
Hilda Rosselli, Western Oregon University

WHAT THE TSPC RULE SAYS…
• (1) The unit has an assessment system that collects and analyzes data on applicant qualifications, candidate and graduate or program completer performance, and unit operations to evaluate and improve the performance of candidates, the unit, and its programs.

WHAT THE TSPC RULE SAYS…
• (2) Areas evaluated under this standard include:
• (a) Assessment System;
• (b) Data Collection, Analysis, and Evaluation; and
• (c) Use of Data for Program Improvement.

TSPC EXPECTATIONS
• Unit faculty collaborate with members of the consortium to implement and evaluate the system. Professional, state, and institutional standards are key reference points for candidate assessments.
• The unit embeds assessments in programs, conducts them on a continuing basis for both formative and summative purposes, and provides candidates with ongoing feedback.
• The unit has procedures to ensure credibility of assessments: fairness, consistency, accuracy, and avoidance of bias.
(Source: TSPC Professional Standards Manual)

PROGRAM REVIEW TEMPLATE
• TRANSITION POINT ASSESSMENT: Brief narrative of the key assessments at the gates identified.
• SUMMARY OF ASSESSMENTS AND GUIDES: First list all 6-8 assessments and the area of data collection for each.
• Example:
• Assessment #1: Content test
• Assessment #2: Work sample – Content Pedagogy
• Assessment #3:
• Assessment #4:
• Assessment #5:
• Assessment #6:
• Assessment #7:
• Assessment #8:
Summarize the assessments, scoring guides, and rubrics for each major assessment.

UNIT REVIEW EXPECTATIONS
• The unit uses multiple indicators (e.g., 3.0 GPA, mastery of basic skills, general education knowledge, content mastery, and life and work experiences) to identify candidates with the potential to become successful teachers or assume other professional roles in schools at the point of entry into programs.
• The unit has multiple decision points (e.g., at entry, prior to clinical practice, and at program completion).
• The unit administers multiple assessments in a variety of forms and aligns them with candidate proficiencies. These may come from end-of-course evaluations, written essays, or topical papers, as well as from tasks used for instructional purposes (such as work samples, projects, journals, observations by faculty, comments by cooperating teachers, or videotapes) and from activities associated with teaching (such as lesson planning, identifying student readiness for instruction, creating appropriate assessments, reflecting on results of instruction with students, or communicating with parents, families, and school communities).
(Source: TSPC Professional Standards Manual)

TSPC EXPECTATIONS
• The unit uses information available from external sources such as state licensing exams, evaluations during an induction or mentoring year, employer reports, follow-up studies or surveys, and state program reviews.
• The unit establishes scoring guides, which may be rubrics, for determining levels of candidate accomplishment and completion of their programs.
• The unit uses results from candidate assessments to evaluate and make improvements in the unit and its programs, courses, teaching, and field and clinical experiences.

WHAT DO YOU WANT A SYSTEM TO DO?
• Disaggregate data on individual candidate progress and decisions
• Aggregate data on candidates' documentation of specific proficiencies
• Aggregate data on program documentation of specific proficiencies
• Aggregate data on candidate demographics, GPA, PRAXIS, field placement, cohort, year, etc.
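These four functions reduce to straightforward queries over candidate-level records. A minimal sketch in pandas (the column names and values below are hypothetical, invented for illustration, not drawn from TSPC or the presentation):

```python
import pandas as pd

# Hypothetical candidate-level records; a real unit would load these from
# its assessment database rather than hard-coding them.
records = pd.DataFrame({
    "candidate_id":      [101, 102, 103, 104],
    "program":           ["Elementary", "Elementary", "Secondary", "Secondary"],
    "cohort":            [2011, 2011, 2012, 2012],
    "gpa":               [3.4, 3.1, 3.8, 2.9],
    "praxis_pass":       [True, True, True, False],
    "work_sample_score": [3, 2, 4, 2],  # e.g., scored on a 1-4 rubric
})

# Disaggregate: one candidate's record, for gate decisions about that person.
print(records[records["candidate_id"] == 102])

# Aggregate: program-level documentation of proficiencies and demographics.
by_program = records.groupby("program").agg(
    n_candidates=("candidate_id", "count"),
    mean_gpa=("gpa", "mean"),
    praxis_pass_rate=("praxis_pass", "mean"),
    mean_work_sample=("work_sample_score", "mean"),
)
print(by_program)
```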

HOW TO MAKE IT HAPPEN
• Establish structures to guide and support the work around the identification, quality, collection, and use of data to make decisions
• Dedicate personnel; purchase or develop a relational database (a minimal schema sketch follows below)
• Institutionalize dedicated time for sharing results
• Involve the consortium (including students)
• Develop a secure assessment website for reports
• Develop a rule to guide how data results are used
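The relational-database bullet is the load-bearing one, so here is a minimal sketch of what "relational" buys you, using Python's built-in sqlite3 module. The tables and columns are illustrative assumptions about what such a system might track, not a schema prescribed by TSPC:

```python
import sqlite3

# In-memory database for illustration; a production system would use a
# persistent, access-controlled server with backups.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE candidate (
        candidate_id INTEGER PRIMARY KEY,
        program      TEXT NOT NULL,
        cohort       INTEGER
    );
    CREATE TABLE assessment (
        assessment_id INTEGER PRIMARY KEY,
        name          TEXT NOT NULL,   -- e.g., 'Work Sample'
        gate          TEXT             -- e.g., 'entry', 'pre-clinical', 'exit'
    );
    CREATE TABLE score (
        candidate_id  INTEGER REFERENCES candidate(candidate_id),
        assessment_id INTEGER REFERENCES assessment(assessment_id),
        rubric_score  INTEGER,
        scored_on     TEXT
    );
""")

# Because every score row links a candidate to an assessment, the same data
# answers both per-candidate gate questions and program-level questions.
rows = con.execute("""
    SELECT c.program, a.name, AVG(s.rubric_score) AS mean_score
    FROM score s
    JOIN candidate  c ON c.candidate_id  = s.candidate_id
    JOIN assessment a ON a.assessment_id = s.assessment_id
    GROUP BY c.program, a.name
""").fetchall()
print(rows)  # empty until scores are loaded
```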

HOW TO MAKE IT HAPPEN
• Tie your curriculum proposal process to data so you can study impact
• Share the importance of this priority with the Provost and President (they will likely appreciate how it aligns with NWCCU expectations)
• Focus on just a few key assessments that can be used consistently while still allowing for academic freedom
• Provide systematic orientations to faculty, staff, and supervisors
• Disaggregate data by program or delivery model to examine program fidelity (a sketch follows below)
• Balance indirect assessments with direct assessments that have external validity (PRAXIS, ACT/SAT, state survey of employer satisfaction)
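The disaggregation bullet is a one-line operation once data are stored at the candidate level. A sketch, again with hypothetical column names and made-up scores:

```python
import pandas as pd

# Hypothetical scores on one key assessment, tagged by delivery model.
scores = pd.DataFrame({
    "program":           ["Elementary"] * 4 + ["Secondary"] * 4,
    "delivery_model":    ["campus", "campus", "online", "online"] * 2,
    "work_sample_score": [3, 4, 2, 3, 4, 3, 3, 2],
})

# Program fidelity check: does the same key assessment behave similarly
# across delivery models within each program?
fidelity = scores.groupby(["program", "delivery_model"])["work_sample_score"].agg(
    ["count", "mean", "std"]
)
print(fidelity)
```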

EXAMPLES
DIRECT
• Teacher Work Samples
• Professional Projects
• Portfolios
• Standardized tests or certification exams
• Evaluations from field experiences
INDIRECT
• Exit surveys
• Alumni surveys
• Employer surveys
• Job placement data

DESIGN PROCESS
• Understand how faculty will want to use data
• Sort out consistent from inconsistent sources of data
• Map the assessments to the required standards (a coverage-check sketch follows below)
• Establish design principles
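The mapping step can be kept honest with a simple coverage check that flags standards no key assessment currently measures. A sketch; both the assessment names and the standard labels are placeholders, not actual TSPC rule citations:

```python
# Hypothetical mapping of key assessments to the standards they evidence.
assessment_to_standards = {
    "Content test":        {"content-knowledge"},
    "Work sample":         {"content-pedagogy", "impact-on-learning"},
    "Clinical evaluation": {"professional-dispositions", "content-pedagogy"},
    "Exit survey":         {"unit-operations"},
}

# Placeholder list of standards the unit must document.
required = {
    "content-knowledge", "content-pedagogy", "impact-on-learning",
    "professional-dispositions", "unit-operations", "technology-use",
}

covered = set().union(*assessment_to_standards.values())
missing = required - covered
print("Standards without a mapped assessment:", missing or "none")
# -> flags 'technology-use', signaling a gap to close before review
```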

DESIGN PRINCIPLES (SAMPLE)
The system we developed needed to be able to:
• Make clear to everyone involved the knowledge, skills, and dispositions expected of candidates
• Make clear to everyone what evidence will be used in making these assessments
• Make clear "how good is good enough" with respect to areas of performance

A METAPHOR…
• You are designing a road map that traces the path backwards from the destination (proficiency attained) to various points along the journey where decisions are made about attainment.

KEEP IN MIND…
• Not all elements should be assessed at each gate.
• Some of the least valuable data are the most easily gathered.
• What is not measured may be a statement about what is not valued or important.
• Some of the most important things may be the most difficult to measure.
• It may be worth measuring something even though the assessment may not be as desirable as you would like.

KEEP IN MIND
• Make sure there is congruity between Standard 2 and what is shared about the assessment process in your Program Reviews.
• Expect some ambiguity and continued evolution as your system develops.
• Develop or purchase a relational database system.

QUESTIONS FOR OUR PANELISTS
• Briefly describe the data system you use and its best features.
• What process was used to develop/adopt your data system?
• What are the drawbacks of your data system?
• What are the approximate costs affiliated with your system?
• What would you do differently if you were starting over and designing a data system?
• What tips would you offer others that can make their work easier?