Dr. Geri Cochran Director, Institutional Research, Assessment and Planning.


 It’s all about what is important to you
◦ Identifying what is important: the values that guide what you are doing
◦ Using those values as a basis for evaluating what you are doing
◦ Taking what you have learned from that evaluation to improve what you are doing and better achieve your values

Program assessment is a form of evaluation in which participants in a program judge its effectiveness in achieving their values, and use what they have learned to improve that effectiveness.

Premise: Assessment is linked directly to value.
Propositions:
1. What we assess indicates what we value.
2. What we value should guide what we assess.

A form of evaluation in which the values of the participants in a program are made explicit as expectations for what should “come out” of their actions, and those actions are evaluated according to the extent to which they achieve the expected outcomes.

Improving Programs
Program Outcomes
Program Criteria
Program Values

 Focusing on the value of education shifts our attention from inputs to outcomes.
 What “comes out” of an educational experience must be directly or indirectly observable to be assessed.
 Program assessment belongs to the program; the purpose of outcomes assessment is to improve programs by encouraging evidence-based decision-making by people in the program.

Critical Artistic Analysis
 Perform music with a principal instrument or voice at a level of artistry qualifying for consideration for professional employment; absorb and evaluate multiple points of view about performance techniques and musical expression.

Professional Collaboration
 Ability to function as a contributor to a work team such as a performing ensemble.
Professional Literacy
 Practical familiarity with the literature and standard repertoire of the major performing area; understand and interpret music in a variety of styles, genres, media, and historical eras.

Professional Readiness
 Awareness of basic practical information and realities of careers in music. Students will demonstrate:
◦ the ability to prepare, audition, and interview for a graduate program of study
◦ knowledge of the grant-writing process and the importance of developing entrepreneurial skills
◦ knowledge of arts-in-education and other community outreach venues
◦ basic skills for using technology such as communications, music sequencing, and engraving software
◦ basic writing skills as they apply to their career development

Technical Skills
 Perform music with a principal instrument or voice at a level of technique progressing toward viability for professional employment as a musician.

 What evidence should be gathered for assessing outcomes?
 What are the sources of the evidence for the outcomes?
 How often is the evidence to be collected?

 Relatively direct (SACS preferred)
◦ Writing assignments
◦ Recitals
◦ Performances
◦ Artistic compositions
◦ Essay exams
 Relatively indirect
◦ Surveys
◦ Internship reports
◦ Employer surveys

 Evidence should be meaningful: information that is appropriate for assessing a particular outcome.
 Evidence should be manageable: reasonable to attain and evaluate (time, effort, availability).

 List of outcomes
 Evidence to be collected
 Source of evidence
 Frequency of collection of evidence

Outcome: Ability to function as a contributor to a work team such as a performing ensemble.
Evidence: Ensemble evaluations, internship reports, performances
Source: Students, faculty, professionals in the field
Frequency: Semester & annually

Outcome: Perform music with a principal instrument or voice at a level of artistry qualifying for consideration for professional employment.
Evidence: Selection of work in portfolio, video and audio recordings, recitals, etc.
Source: Students
Frequency: Annually
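A plan laid out this way is essentially structured data, and a department that wants to track it programmatically could model each row as a record. The sketch below is a minimal illustration only: the class, field names, and simplified single-value frequencies are all hypothetical, not part of any SACS or university template.

```python
from dataclasses import dataclass

@dataclass
class OutcomeAssessment:
    """One row of an assessment plan: an outcome and how it will be evidenced.

    All field names here are illustrative assumptions, not a standard schema.
    """
    outcome: str
    evidence: list[str]
    sources: list[str]
    frequency: str  # simplified to one value, e.g. "semester" or "annual"

# Hypothetical rows mirroring the example table above.
plan = [
    OutcomeAssessment(
        outcome="Function as a contributor to a work team such as a performing ensemble",
        evidence=["ensemble evaluations", "internship reports", "performances"],
        sources=["students", "faculty", "professionals in the field"],
        frequency="semester",
    ),
    OutcomeAssessment(
        outcome="Perform with a principal instrument or voice at a professional level of artistry",
        evidence=["portfolio selections", "video/audio recordings", "recitals"],
        sources=["students"],
        frequency="annual",
    ),
]

def evidence_due(plan: list[OutcomeAssessment], frequency: str) -> list[str]:
    """Collect the evidence items scheduled for a given reporting cycle."""
    return [item for row in plan if row.frequency == frequency for item in row.evidence]

print(evidence_due(plan, "annual"))
```

Keeping the plan in one structured place makes the later review questions (Is the evidence specific? Mostly direct? Manageable?) straightforward to answer per outcome.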

 Is the evidence specific enough in describing the form of the evidence and the venue for collection?
 Does the plan rely mainly on direct evidence?
 Is the evidence meaningful for the particular outcome?
 Is the evidence manageable (reasonable to collect and evaluate)?

Improving Programs
Program Outcomes
Program Criteria
Program Values

 Goal: To use the evidence as a basis for judging the extent to which the program is meeting the members’ values for the program.

 Goal: To apply what has been learned in evaluating the program toward identifying actions to address areas of concern.

 As a result of your assessment, what changes, if any, have you implemented to address areas of concern (in the program or in the assessment of the program)?

 What outcomes are you planning to assess for the next reporting cycle?

 How can I help? Geri Cochran