A Comprehensive Unit Assessment Plan: Program Improvement, Accountability, and Research. Johns Hopkins University School of Education Faculty Meeting, October 26, 2012.


A Comprehensive Unit Assessment Plan: Program Improvement, Accountability, and Research
Johns Hopkins University School of Education Faculty Meeting, October 26, 2012
Toni Ungaretti
Borrowed generously from Jim Wyckoff (October 10, 2010), Using Longitudinal Data Systems for Program Improvement, Accountability, and Research, University of Virginia.

Why Assessment?
Assessment is a culture of continuous improvement that parallels the School's focus on scholarship and research. It provides evidence of candidate performance, program effectiveness, and unit efficiency.

Overview
- Program Improvement: By following candidates and graduates both during their programs and over time after graduation, programs can learn a great deal about their own effectiveness.
- Accountability: Value-added analysis of teacher/student data in longitudinal databases is one measure of program accountability.
- Research: A systematic program of experimentally designed research can provide important insights into how to improve candidate preparation.

Jim Wyckoff, 2010

Program Improvement: Some Questions
- Who are our program completers (age, ethnicity, areas of certification)?
- What characterizes the preparation they receive?
- How well do they perform on measures of qualifications, e.g., licensure exams?
- Where do our program completers teach/work? What is their attrition? Are they meeting program goals and mission?
- How effective are they in their teaching/work? This is the ultimate impact.
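Questions like "where do our completers teach, and what is their attrition?" can be answered directly from placement records. A minimal sketch, using hypothetical data and field names (the actual longitudinal database schema is not specified in this presentation):

```python
# Illustrative only: toy placement records for program completers.
# Each record: (completer_id, placement_district, still_teaching_after_3_years)
from collections import Counter

records = [
    ("c1", "Baltimore City", True),
    ("c2", "Baltimore City", False),
    ("c3", "Howard County", True),
    ("c4", "Baltimore City", True),
]

# Where do our completers teach?
placements = Counter(district for _, district, _ in records)

# What is their attrition? (share no longer teaching after 3 years)
retained = sum(1 for _, _, still in records if still)
attrition_rate = 1 - retained / len(records)

print(placements.most_common(1))  # [('Baltimore City', 3)]
print(round(attrition_rate, 2))   # 0.25
```

The same two-line aggregation pattern extends to any of the slide's questions once the corresponding field (certification area, licensure score, demographics) is in the record.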

Accountability: What Constitutes Effective Teacher Preparation?
- Programs work with school districts to meet the teaching needs of the schools where their teachers are typically placed.
- Programs are judged by the empirically documented effectiveness of their graduates in improving the outcomes of the students they teach.
- Retention plays a role in program effectiveness, as teachers improve substantially in quality over the first few years of their careers.
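To make "empirically documented effectiveness" concrete, here is a deliberately naive sketch of a value-added comparison. Real value-added models condition on prior achievement and student covariates; this toy version (with invented teachers, programs, and score gains) only compares the mean student gain for one program's graduates against the mean across all teachers:

```python
# Hedged sketch, hypothetical data: teacher -> (preparation program, student score gains)
from statistics import mean

teachers = {
    "t1": ("JHU",   [4.0, 5.0, 6.0]),
    "t2": ("JHU",   [3.0, 4.0]),
    "t3": ("Other", [1.0, 2.0, 3.0]),
}

all_gains = [g for _, gains in teachers.values() for g in gains]
jhu_gains = [g for prog, gains in teachers.values() if prog == "JHU" for g in gains]

overall_mean = mean(all_gains)               # 3.5 across all students
program_effect = mean(jhu_gains) - overall_mean
print(round(program_effect, 2))              # 0.9: JHU graduates' students gain more
```

A positive `program_effect` is what a longitudinal accountability system would look for, though a defensible estimate requires the covariate adjustments noted above.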

Research: How Can Programs Add Value?
- Selection: Who enters, how does that matter, and how can we influence it?
- Preparation: What preparation content makes a difference?
- Timing: Does it matter when teachers receive specific aspects of preparation?
- Retention: Why is retention important to program value added, and what can affect it?

Johns Hopkins University School of Education Comprehensive Unit Assessment Plan

Assessment Cycle – Close the Loop (Johns Hopkins University School of Education)
Driven by the SOE Vision & Mission, the cycle moves through:
Program Goals (including Professional Standards) → Student Learning Outcomes → Student Learning Outcome Assessment → Assessment Tracking → Analysis of Assessment Data → Unit and Program Improvement.
Each stage answers a guiding question: what students learn, how they learn it, how we track the learning, how we know that they learned, what we learn from a review of their learning, and what we change.

Major Assessment Points/Benchmarks

Admissions:
- Entry GPA
- GRE/SAT scores
- Admission demographics
- Personal essay
- Teaching experience
- Interview ratings
- Disposition Survey

Mid-Program/Pre-Internship:
- Course assignments
- Course grades
- Content verification
- E-Portfolio evaluation
- Academic plan
- Survey on diversity/inclusion dispositions
- Reflection on personal growth and goals
- Advisor/instructor input
- Student experience survey

Program Completion (Clinical Experiences):
- Course grades
- Test results (such as PRAXIS II, CPCE Exam)
- E-Portfolio evaluation
- Final comprehensive exam or graduate project
- Survey on diversity/inclusion dispositions
- University Supervisor and Cooperating Teacher evaluations
- Course and Field Experience Assessment results
- Exit interview or End of Program Evaluation

Post-Graduation, 2 Years Out:
- Employer survey
- Alumni survey
- School partner feedback
- MSDE data linked to our graduates

Post-Graduation, 5 Years Out:
- Employer survey
- School partner feedback
- MSDE data linked to our graduates
- Alumni survey (data collected from graduates through surveys and/or focus groups)
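The benchmark table above can be held as a simple lookup, which makes one useful property easy to check: which instruments recur across assessment points and therefore support longitudinal comparison. A sketch with an abbreviated, assumed subset of the table:

```python
# Abbreviated stand-in for the benchmark table (point -> instruments).
from collections import Counter

benchmarks = {
    "Admissions": ["Entry GPA", "GRE/SAT scores", "Interview ratings"],
    "Mid-Program": ["Course grades", "E-Portfolio evaluation"],
    "Program Completion": ["PRAXIS II", "E-Portfolio evaluation", "Exit interview"],
    "Post-Graduation (2 yr)": ["Employer survey", "Alumni survey"],
    "Post-Graduation (5 yr)": ["Employer survey", "MSDE data"],
}

# Instruments used at more than one assessment point.
counts = Counter(i for instruments in benchmarks.values() for i in instruments)
repeated = sorted(i for i, n in counts.items() if n > 1)
print(repeated)  # ['E-Portfolio evaluation', 'Employer survey']
```

Repeated instruments (e.g., the e-portfolio at mid-program and completion, the employer survey at 2 and 5 years out) are the ones that let the unit measure growth rather than a single snapshot.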

Alignment of Conceptual Framework to Assessment Plan's Benchmarks

Conceptual Framework Themes/Student Outcomes, with their key assessment points:
- Knowledgeable in their respective content area/discipline: Admission, Mid-Program, Program Completion, Post-Graduation (data points: Comprehensive Exams, Praxis Exams, Graduate Projects)
- Reflective practitioners: Mid-Program, Program Completion, Post-Graduation
- Committed to diversity: Admissions, Mid-Program, Program Completion, Post-Graduation
- Data-based decision-makers: Mid-Program, Program Completion, Post-Graduation
- Integrators of applied technology: Mid-Program, Program Completion, Post-Graduation

SOE Conceptual Framework Logic Model
Grounded in the Johns Hopkins University Mission Statement, the School of Education Vision, and the SOE Mission (via the Conceptual Framework), the logic model flows from:
- Inputs: resource capability
- Initiatives: effective teaching, innovative tools, excellent professional preparation, high-quality research, innovative outreach
- Domains: knowledge, disposition, practice
- Key Assessment Points (for assessments see Table 3): admit, midpoint or internship, program completion, post-graduation (2-year and 5-year)
- Outputs (Student Outcomes): content experts, reflective practitioners, committed to diversity, data-based decision makers, integrators of applied technology
- Impact: education improvement, community well-being

Program Assessment Plan
- Mission, goals, and objectives/outcomes aligned with the SOE mission and outcomes
- National, state, and professional standards
- Assessments: descriptions, rubrics, benchmarks
- Annual process:
  - Review of findings and recommendations for change
  - Review of assessments and adjustments
- Documentation of stakeholder input: ALL faculty, students, university supervisors, cooperating teachers, partner schools, MSDE, professional organizations, community members, employers