Assessment Jargon: The who, what, where, why, when, and how of writing end-of-the-year reports
Sarah Todd, Director of Institutional Research and Assessment
Presentation to the School of Business and Liberal Arts, August 24, 2012

Assessment
 The systematic collection, review, and use of information undertaken for the purpose of improving outcomes (e.g., student learning and development)
 Translation: determining whether what we are doing is working, making changes for improvement, determining whether those changes are working, making further changes for improvement, and so on…
 Systematic: organized and planned
 Review: appraise critically, evaluate; a formal examination; practice intended to polish performance or refresh memory
 Use: take or consume
 Assessment is not an event; it is a constant process

Assessment vs. Evaluation
 Timing: assessment is formative, ongoing to foster improvement; evaluation is summative, final to gauge quality and performance
 What is it measuring? Assessment is process-oriented (how is it going?); evaluation is product-oriented (what has been accomplished?)
 Relationship between administrator and recipient: assessment is reflective, based on internally defined goals and criteria; evaluation is prescriptive, based on externally imposed standards
 Use of findings: assessment is diagnostic, identifying areas of strength and weakness; evaluation is judgmental, arriving at an overall grade or score
 Standards of measurement: assessment is flexible, adjustable as challenges change; evaluation is fixed, designed to reward success and punish failure

Cohort
 A group of people sharing a specific characteristic, for example:
   Age
   Student type
   Residential/non-residential
   CSTEP, EOP
   Program of study
   First generation
   Pell eligible
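Where a cohort definition has to be turned into an actual list of students for a report, a minimal sketch of cohort selection follows. The field names (term_entered, student_type, pell_eligible) and the sample records are hypothetical placeholders, not the institution's actual data model.

```python
# Minimal sketch: selecting a cohort as "people sharing a specific characteristic".
# All field names and sample records below are hypothetical.

students = [
    {"id": 1, "term_entered": "Fall 2008", "student_type": "first-time", "pell_eligible": True},
    {"id": 2, "term_entered": "Fall 2008", "student_type": "transfer",   "pell_eligible": False},
    {"id": 3, "term_entered": "Fall 2009", "student_type": "first-time", "pell_eligible": True},
]

def select_cohort(records, **criteria):
    """Return the records that share every characteristic given in criteria."""
    return [r for r in records if all(r.get(k) == v for k, v in criteria.items())]

# The Fall 2008 first-time cohort, one of the groupings named on the slide.
fall_2008_first_time = select_cohort(students, term_entered="Fall 2008", student_type="first-time")
print(len(fall_2008_first_time))  # cohort size
```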

Baseline
 The standard by which things are measured or compared; the "starting line"
 Common baselines:
   The census date
   The previous report date
   A date dictated by a higher power

Benchmark
 A description or example of performance that serves as a standard of comparison for evaluation or judging quality
 Translation: a standard by which something can be measured or judged
 Types of benchmarks:
   Peers (aspirational and actual)
   Where we are now (baseline)
   Where we want to be
   Where others say we should be
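As a worked illustration of putting benchmarks to use, the sketch below compares a single program metric against the benchmark types named above; every figure is invented for illustration.

```python
# Minimal sketch: comparing one program metric against several benchmark types.
# All numbers are made up for illustration.

metric = "six-year graduation rate"
current = 0.32  # where we are now (the baseline)
benchmarks = {
    "peer average (actual)":          0.35,
    "peer average (aspirational)":    0.45,
    "where we want to be":            0.40,
    "where others say we should be":  0.40,  # e.g., an institutional target
}

for name, value in benchmarks.items():
    gap = current - value
    print(f"{metric} vs. {name}: {value:.0%} (gap {gap:+.0%})")
```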

Goals and Objectives
 Goal: a general description of the wider problem your project will address, offering a reason why the work will be performed
 Objective: more detailed than a goal; includes the who, what, where, why, when, and how
   Specific: to the problem you are addressing
   Measurable: changes must be quantifiable; use numbers to address issues of quantity and quality
   Appropriate/attainable: suited to the goals and the environment; must be feasible and within your control or influence
   Realistic: measures outputs and results, not activities
   Timed: identifies a target date for completing the objective and includes interim steps and a monitoring plan
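For authors who want to carry a measurable, timed objective into an end-of-year report, here is a minimal sketch of one way to record it and report progress. The Objective class, its fields, and all numbers are hypothetical, not part of the presentation.

```python
from dataclasses import dataclass
from datetime import date

# Minimal sketch: keeping an objective specific, measurable, and timed.
# The objective text and every number below are hypothetical.

@dataclass
class Objective:
    description: str   # specific: what will change, for whom
    baseline: float    # measurable: starting value
    target: float      # measurable: intended result
    target_date: date  # timed: when the result is due
    current: float     # latest measured value

    def progress(self) -> float:
        """Fraction of the baseline-to-target change achieved so far."""
        return (self.current - self.baseline) / (self.target - self.baseline)

obj = Objective(
    description="Raise first-to-second-year retention for the Fall 2008 cohort",
    baseline=0.70, target=0.78, target_date=date(2012, 9, 1), current=0.74,
)
print(f"{obj.description}: {obj.progress():.0%} of the way to target by {obj.target_date}")
```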

Objectives must be MEASURABLE
 Used to express intended results in precise terms
 Specific as to what needs to be assessed, which helps guide the choice of an appropriate assessment tool

Outcomes
 Observable (documentable!) behaviors or actions that demonstrate that the objective has occurred
 Your objectives, carried over into observed results

Direct and Indirect Measures of Student Learning
 Direct: student learners display knowledge and skills as they respond directly to the instrument itself
   Objective tests
   Essays
   Presentations
   Classroom assignments
 Indirect: student learners reflect on their learning rather than demonstrating it
   Surveys (exit, current and graduating students, alumni, employer, etc.)
   Interviews
   Focus groups

The Program Scorecard
 The goals and objectives you include in your program scorecard depend entirely on the specific goals and direction of your program
 The scorecard can assist you in framing goals and objectives on enrollment, retention rates, graduation rates, admissions, number of graduates, and diversity
 In the next installment, enrollments by student type, average GPA, and survey results will be incorporated

Following a Cohort Through the Scorecard
 Follow a cohort of students through the scorecard over time
 The Fall 2008 cohort's size and retention rates can assist in predicting its graduation rates
 The yield rate and enrollment rate can assist in predicting these as well
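A minimal sketch of that kind of projection follows, using invented Fall 2008 figures: multiplying year-over-year retention rates through the cohort gives a rough ceiling on the eventual graduation rate.

```python
# Minimal sketch: using a cohort's size and year-to-year retention rates to project
# how many students remain on track to graduate. The Fall 2008 figures are invented.

cohort_size = 120                        # Fall 2008 entering cohort (hypothetical)
retention_by_year = [0.74, 0.85, 0.90]   # year 1->2, 2->3, 3->4 retention (hypothetical)

remaining = cohort_size
for year, rate in enumerate(retention_by_year, start=2):
    remaining = remaining * rate
    print(f"Projected students still enrolled entering year {year}: {remaining:.0f}")

# A rough ceiling on the graduation rate: students cannot graduate if they do not persist.
print(f"Upper bound on graduation rate implied by retention: {remaining / cohort_size:.0%}")
```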

Setting Program Goals
 The program is currently at a 32% graduation rate; the institutional goal is 40%
 Set program goals based on the reality of where you are, the interventions and changes you intend to make, and the direction you need to be heading
 The yield rate and enrollment rate can assist in predicting these as well
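To make the gap between 32% and the 40% institutional goal concrete, the sketch below spreads the needed improvement across a planning window; the three-year horizon is an assumed choice for illustration, not something stated in the presentation.

```python
# Minimal sketch: turning the current rate and the institutional goal into interim
# annual targets. The three-year horizon is an assumed planning window.

current_rate = 0.32
institutional_goal = 0.40
years_to_goal = 3  # hypothetical planning horizon

annual_gain_needed = (institutional_goal - current_rate) / years_to_goal
for year in range(1, years_to_goal + 1):
    target = current_rate + annual_gain_needed * year
    print(f"Year {year} interim target: {target:.1%}")
```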

 Let’s perform an assessment of my presentation, focusing on wardrobe:
   What is the cohort?
   What is the baseline?
   What benchmarks are we going to use?
   What are the goals?
   What are the objectives?
   What are some direct and indirect measures of student learning?
   What are the outcomes?