Assessment & Program Review
President's Retreat 2008
College of Micronesia – FSM
May 13 – 15, 2008
FSM China Friendship Sports Center

Basic Questions  Why assessment & program review? Ensure that basic standards are met and that continuous improvement is occurring.  Who does assessment & program review? Every program, office, department and service of the college.  Can assessment & program review be made easy? Annual improvement plans, following agreed upon principles of assessment, good data systems & continuous collection and reporting of data and information, standardized processes procedures and forms can make assessment & program review easier.  Do you have to be an expert in to do assessment? No – but you do need a willingness for continuous improvement and a desire to provide a continually improving quality of instruction and/or service.  What is I do not have time to do assessment? You need to make time. Not doing assessment is no longer an option.

Basic Questions  How are the assessment and program review results be used? Assessment committee will be responsible for ensuring:  Quality of assessment plans  Quality of assessment reports  Compilation and reporting on results of assessment and program review across the college Impact development of institutional priorities and resource allocation through Planning and Resources Committee and Budget development process Provide the basis for developing a culture of evidence at the college Improvement does not occur in a vacuum – changes in pedagogy, assessment services, strategies, etc. must decided on and implemented with quality for improvement to occur. Improvement may but does not always require additionally resources – often improvement is a result of relocation of existing resources or changes in the way to do a task – a major resource is how we allocate our time. A fundamental question to ask (and answer with quality and depth) in any improvement planning is  How did we (through our internal thinking, our processes, our practices, and our procedures) contribute to or create the circumstances (good and bad) we face now?

Definition of Assessment
"Assessment is an ongoing process aimed at understanding and improving student learning. It involves making our expectations explicit and public; setting appropriate criteria and high standards for learning quality; systematically gathering, analyzing, and interpreting evidence to determine how well performance matches those expectations and standards; and using the resulting information to document, explain, and improve performance. When it is embedded effectively within larger institutional systems, assessment can help us focus our collective attention, examine our assumptions, and create a shared academic culture dedicated to assuring and improving the quality of higher education."
– Thomas A. Angelo (AAHE Bulletin, November 1995, p. 7)

Principles & Assumptions of Assessment at COM-FSM
• The assessment process is messy and inexact, but it must be done as precisely as possible.
• Outcomes measures should be as direct as possible, although indirect methods, such as industry perceptions, must be included, and measures should make use of existing artifacts where possible.
• Industry-specific professional testing measures of competence may be applied.
• Assessment must impact improvement of curriculum, policy, and planning.
• Decisions arising out of assessment results are not meant to be punitive; rather, they are to be used for program and service improvements.
• Assessment is most effective when it reflects an understanding of learning as multidimensional, integrated, and revealed in performance over time.
• Assessment works best when the programs it seeks to improve have clear, explicitly stated purposes.
• Assessment is a goal-oriented process.
Reference: Proposed Institutional Assessment Plan

Nine Principles of Good Practice for Assessing Student Learning
1. The assessment of student learning begins with educational values.
2. Assessment is most effective when it reflects an understanding of learning as multidimensional, integrated and revealed in performance over time.
3. Assessment works best when the programs it seeks to improve have clear, explicitly stated purposes. Assessment is a goal-oriented process.
4. Assessment requires attention to outcomes but also and equally to the experiences that lead to those outcomes.
5. Assessment works best when it is ongoing, not episodic. Assessment is a process whose power is cumulative.
6. Assessment fosters wider improvement when representatives from across the educational community are involved.
7. Assessment makes a difference when it begins with issues of use and illuminates questions that people really care about.
8. Assessment is most likely to lead to improvement when it is part of a larger set of conditions that promote change.
9. Through assessment, educators meet responsibilities to students and to the public. There is a compelling public stake in education.
– AAHE

Academic Program Review Indicators
• Program enrollment
• Graduation rate
• Average class size
• Student seat cost
• Course completion rate for the program
• Student satisfaction rate
• Employment data
• Transfer data
• Program student learning outcomes
• Student learning outcomes for program courses
Reference: Policy on Instructional Programs Evaluation, 5/2006
• Indicators are largely provided by OAR, IRPO and other administrative and student support services, but programs need to be active in monitoring and interpreting the indicators, assessing their impact on program improvement, and incorporating results into improvement plans.
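Most of these indicators reduce to simple arithmetic over enrollment and completion records. Below is a minimal sketch of how two of them could be computed; the record fields (`enrolled`, `completed`, `sections`) are hypothetical and not the actual OAR/IRPO data layout:

```python
# Minimal sketch: computing two program review indicators from
# hypothetical course records. Field names are illustrative only,
# not the actual OAR/IRPO data structures.

course_records = [
    {"course": "MS 100", "enrolled": 120, "completed": 96, "sections": 5},
    {"course": "EN 110", "enrolled": 90,  "completed": 72, "sections": 4},
]

total_enrolled  = sum(r["enrolled"]  for r in course_records)
total_completed = sum(r["completed"] for r in course_records)
total_sections  = sum(r["sections"]  for r in course_records)

# Course completion rate for the program and average class size
completion_rate    = total_completed / total_enrolled
average_class_size = total_enrolled / total_sections

print(f"Course completion rate: {completion_rate:.1%}")   # 80.0%
print(f"Average class size: {average_class_size:.1f}")    # 23.3
```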

Administrative & Student Support Services Indicators
• Need to be developed or adopted, but can include:
  - Enrollment management indicators
  - Satisfaction with services
  - Quality of services
  - Performance budgeting concepts, such as reductions in time/cost for processes and procedures and continuous increases in quality of services

Dimensions of Learning (students today need more than content knowledge)
1. Workplace readiness and general skills
2. Content knowledge / discipline-specific knowledge and skills
3. "Soft skills" (noncognitive skills)
4. Student engagement with learning

Processes & procedures
• Continuous improvement design for the college
• Institutional Assessment Plan Handbook
• Training needs identified and continually provided:
  - Self study (Internet resources on assessment are immense)
  - Formal training programs
  - Support personnel to provide assessment support and training for academic, student support and administrative services

COM-FSM Strategic Plan

Assessment & program review issues for COM-FSM
• Consistency in quality of instruction and assessment across campuses
  - Agreement on rubrics, approaches to embedded questions, and common assignments across the same courses delivered at different campuses
  - Agreement on general education assessment
• Consistency in quality of administrative and support services across all campuses
• Equity in quality and quantity of services, facilities and personnel across all campuses
• Seeing assessment & program review as a fundamental responsibility of all personnel
• Seeing individual, program, office, department and campus roles in continuous improvement of programs and services across the college
• Blend of improvement outcomes/objectives across the system and site/program-specific improvement
• Documentation of improvement activities
  - Monthly & quarterly performance reports
    - Reporting against plans – outcomes & objectives
    - Reporting against institutional priorities
  - Formal assessment & program review reports
  - Structured reflection on progress being made
  - Following timelines
• Culture of evidence
  - A culture that promotes continuous improvement based on evidence and use of data
• Formative and summative assessments over a two-year cycle

Worksheet 1: Mission & Outcomes (Improvement Plan)

Worksheet 2: Assessment Plan

Worksheet 3: Report

Timelines (2-year improvement cycle: formative & summative evaluation)

Major Surveys
• Administrative Satisfaction Survey – March each year
• Student Services Satisfaction Survey – March each year
• Academic Programs Satisfaction Survey – April each year
• CRE & Other Programs Satisfaction Survey – November each year
• Employer Satisfaction Survey – November, bi-yearly (even years)
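The schedule above can also be encoded directly, which makes the "even years only" rule for the employer survey explicit. The sketch below is illustrative only; the data structure and function name are assumptions, not part of an actual COM-FSM system:

```python
# Illustrative sketch of the survey schedule above; not an actual
# COM-FSM system. Months are 1-12; the employer survey is administered
# bi-yearly, in even years only.

SURVEY_SCHEDULE = [
    # (survey, month administered, even years only?)
    ("Administrative Satisfaction Survey",       3,  False),
    ("Student Services Satisfaction Survey",     3,  False),
    ("Academic Programs Satisfaction Survey",    4,  False),
    ("CRE & Other Programs Satisfaction Survey", 11, False),
    ("Employer Satisfaction Survey",             11, True),
]

def surveys_due(year, month):
    """Return the surveys scheduled for the given year and month."""
    return [
        name
        for name, due_month, even_years_only in SURVEY_SCHEDULE
        if month == due_month and (not even_years_only or year % 2 == 0)
    ]

print(surveys_due(2008, 11))  # includes the employer survey (even year)
print(surveys_due(2009, 11))  # CRE & other programs survey only
```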

COM-FSM Strategic Plan