Compiling and Reporting Your Assessment Plan Data
Office of Institutional Assessment & Effectiveness
SUNY Oneonta
Spring 2012

Providing a Context: Oneonta's Process for Assessing Student Learning
- Fall 2009: Approval by President's Cabinet of APAC Guidelines for Academic Program Assessment
- Spring 2009: Submission by programs of Step 1 (Establishing Objectives) of guidelines
- December 2010: Submission by programs of Step 2 (Activities & Strategies) of guidelines
- June 1, 2011: Submission of Steps 3 (Assessment) and 4 (Closing the Loop) [plans only]; all plans approved by academic deans by the end of Fall
- This academic year: First round of data collection, with reports due June 1, 2012 as part of annual reports

Emphases Throughout the Process
- Establishing congruence among institutional goals, programmatic and course objectives, learning opportunities, and assessments
- Linkages to disciplinary (and, as appropriate, accreditation/certification) standards
- Using a variety of assessment measures, both quantitative and qualitative, in search of convergence
- Value of course-embedded assessment
- Course- vs. program-level assessment

Course- vs. Program-Level Assessment
- The focus of SUNY Oneonta assessment planning is programmatic student learning objectives, not individual students or faculty
- For the most part, data collection described in approved assessment plans will take place in the context of the classroom (i.e., course-embedded assessment)
- The focus at this point, however, is how to compile and aggregate data across courses and course sections, as appropriate, and to report these results in a meaningful way

What You've Done So Far
1. Development of programmatic student learning objectives
2. Curriculum mapping
3. Development of a plan for collecting data across a three-year period (APAC guidelines recommend that programs assess one third of their SLOs each year)
4. Development of a plan for reviewing data and using that information to revise courses and programs, as appropriate (i.e., closing the loop)

Follow Your Curriculum Map – It's Your Assessment Database!
Designed correctly, your curriculum map tells you what to assess, when to assess it, and how to assess it! A map kept in machine-readable form can even be queried directly, as sketched below.
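For illustration only, here is a minimal sketch (in Python) of a curriculum map kept as a query-able data structure. The course numbers, SLO labels, levels, and measures are all hypothetical, not taken from any actual program plan.

```python
# Hypothetical curriculum map: for each SLO, the courses that cover it,
# the proficiency level addressed, and the measure used to assess it.
curriculum_map = {
    "SLO 1 (written communication)": [
        {"course": "GER 101", "level": "Beginning",    "measure": "Tests"},
        {"course": "GER 201", "level": "Intermediate", "measure": "Papers"},
        {"course": "GER 401", "level": "Advanced",     "measure": "Portfolio"},
    ],
    "SLO 2 (critical analysis)": [
        {"course": "GER 102", "level": "Beginning", "measure": "Tests"},
        {"course": "GER 301", "level": "Advanced",  "measure": "Papers"},
    ],
}

def assessment_plan(slo):
    """Answer 'what, where, and how' for one SLO straight from the map."""
    for entry in curriculum_map[slo]:
        print(f"{entry['course']}: {entry['level']} level, assessed via {entry['measure']}")

assessment_plan("SLO 1 (written communication)")
```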

Collecting Data: Important Points from the Spring 2011 Workshop
Collaborative work among faculty to assure:
- SLOs are being assessed using direct measures
- Measures are of sufficient quality, either by review process or through the use of a checklist that includes good-practice criteria
- A priori success indicators are in place for each measure, as well as common categories for results (e.g., exceeding, meeting, approaching, not meeting standards)
- Decisions are made on how the issue of multiple course sections will be addressed
A process is in place for assuring:
- Reliable scoring of qualitative measures (one simple consistency check is sketched below)
- Courses (and students) to be included in assessment are representative
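The workshop materials do not prescribe a reliability method, but one simple way to check consistency of scoring on qualitative measures is percent agreement between two raters scoring the same student work. The sketch below uses hypothetical ratings.

```python
# Hypothetical ratings: two faculty raters score the same five papers
# using the program's common result categories.
rater_a = ["Meeting", "Exceeding", "Approaching", "Meeting", "Not Meeting"]
rater_b = ["Meeting", "Meeting",   "Approaching", "Meeting", "Not Meeting"]

agreements = sum(a == b for a, b in zip(rater_a, rater_b))
percent_agreement = 100 * agreements / len(rater_a)
print(f"Rater agreement: {percent_agreement:.0f}%")  # prints 80%
```

A low agreement figure would suggest the raters should recalibrate on the scoring criteria before the full set of student work is scored.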

Value of Assessment Data Will Be Maximized If:
- A variety of assessment measures is used: quantitative and qualitative; course-embedded and stand-alone measures; benchmarking as available
- For each SLO, data are collected from multiple courses/course sections
- Criteria for scoring and for determining what meets standards are consistent across courses
Do measures have to be the SAME? No, but…

And Perhaps Most Important: There's no point in collecting useless data!

Compiling and Reporting Assessment Data Across Courses: Aggregating and Documenting Your Assessment Results

Approaching the Task Systematically
1. Start with your curriculum map, which depicts:
   - The courses in which you're covering – and assessing – SLOs
   - And, depending on the detail of your map, the levels at which SLOs are being addressed and the measures being used to assess them
2. Select courses – and faculty – to be included in the assessment, communicate that expectation clearly, and establish a time frame for administering assessments and submitting results

Approaching the Task Systematically (cont.)
3. Collect and compile data (categorized as agreed upon by the department) on a spreadsheet for review
4. Review results collectively, reach conclusions regarding strengths and weaknesses, and make recommendations for change as appropriate
Be sure to document all of this (i.e., write it down)

A Step-by-Step Example

1. Following Your Curriculum Map
The Simplest Case: Which Courses Cover Which SLOs? To What Extent Do Courses Cover SLOs?

1. Following Your Curriculum Map
At What Level Do Courses Cover SLOs? How Do Courses Assess SLOs?

2. Selecting Courses to Be Included in Assessment
Do all courses covering a particular SLO need to be included?
- Ideally, yes – remember, each SLO is only assessed every 3 years
- Including all courses is especially important if SLOs are being assessed at different levels (in terms of proficiency and/or course levels)
Do all course sections covering a particular SLO need to be included?
- No, but there is little reason not to include them. If not all sections are included, it is important that:
  - An adequate number of students is included in the assessment
  - Sections are selected randomly (one way to do this is sketched below)
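A minimal sketch of drawing sections at random using Python's standard library. The section IDs and sample size are hypothetical; in practice the sample size would be chosen so that enough students are included.

```python
import random

# Hypothetical list of all sections offering a course that covers the SLO.
sections = ["GER 101-01", "GER 101-02", "GER 101-03",
            "GER 101-04", "GER 101-05", "GER 101-06"]

random.seed(2012)                      # fixed seed makes the draw reproducible and documentable
chosen = random.sample(sections, k=3)  # draw 3 of the 6 sections without replacement
print(chosen)
```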

2. Selecting Courses to Be Included in Assessment (cont.)
Once courses are selected, communicate this fact quickly and clearly to faculty, being sure to emphasize:
- The SLOs to be assessed and the categories to be used in reporting results
- The mapping of assignments – and eventually assessment results – to specific SLOs, and the need for a priori criteria in categorizing student performance
- The timeline for submitting results (late enough in the semester for students to demonstrate competence, but early enough for data to be compiled)
Must assignments and a priori criteria be the same? No, but the more comparable they are, the more meaningful the results. Ultimately, it's a programmatic decision.

A Few More Hints About Assignments
- Final grades are NEVER an adequate measure of a specific SLO
- Overall assignment grades are rarely – if ever – an adequate measure of a specific SLO
- Often only minor adjustments are needed in existing assignments to gather information on specific SLOs:
  - Limit assignments to one SLO
  - Break assignments into subparts, each of which addresses a specific SLO
  - Identify test questions that map to specific SLOs (see the sketch below)
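To illustrate the last hint, here is a minimal sketch of mapping individual test questions to SLOs so that a single test yields separate per-SLO results. The question numbers, SLO labels, and answers are hypothetical.

```python
from collections import defaultdict

# Hypothetical mapping: which test question speaks to which SLO.
question_to_slo = {1: "SLO 1", 2: "SLO 1", 3: "SLO 2", 4: "SLO 2", 5: "SLO 2"}

# One student's per-question results (1 = correct, 0 = incorrect).
answers = {1: 1, 2: 1, 3: 0, 4: 1, 5: 1}

earned = defaultdict(int)
possible = defaultdict(int)
for question, slo in question_to_slo.items():
    earned[slo] += answers[question]
    possible[slo] += 1

for slo in sorted(possible):
    print(f"{slo}: {100 * earned[slo] / possible[slo]:.0f}% correct")
# SLO 1: 100% correct
# SLO 2: 67% correct
```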

3. Compiling Data for Review and Discussion
- It is important that all results be submitted in an agreed-upon format, organized, and maintained in a single place (an Excel spreadsheet is perfect)
- There are different ways to organize results for review and deliberation:
  - Best to start with all data for each SLO, showing each course and its overall results for student performance, organized by performance categories
  - The more you can aggregate, the better (i.e., by course, course level, level of proficiency), while being attentive to variations within aggregating categories
- Remove identifiers before results are reviewed

Sample Spreadsheet (for Internal Review by Program Faculty)

Course #               | # Students | Exceeding | Meeting | Approaching | Not Meeting
GER …                  | …          | …         | …       | …           | …
GER …                  | …          | …         | …       | …           | …
Sub-Total, 100-level   | …          | …         | …       | …           | …
GER …                  | …          | …         | …       | …           | …
GER …                  | …          | …         | …       | …           | …
Sub-Total, 200-level   | …          | …         | …       | …           | …
GER …                  | …          | …         | …       | …           | …
GER …                  | …          | …         | …       | …           | …
Sub-Total, 300-level   | …          | …         | …       | …           | …
GER …                  | …          | …         | …       | …           | …
GER …                  | …          | …         | …       | …           | …
Sub-Total, 400-level   | …          | …         | …       | …           | …
Total                  | …          | …         | …       | …           | …
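A minimal sketch of compiling per-course results into a spreadsheet of this shape with pandas. The course numbers and counts below are hypothetical, not the program's actual data.

```python
import pandas as pd

# Hypothetical per-course category counts for one SLO.
rows = [
    {"Course #": "GER 101", "Level": "100-level", "Exceeding": 5, "Meeting": 14, "Approaching": 4, "Not Meeting": 2},
    {"Course #": "GER 102", "Level": "100-level", "Exceeding": 7, "Meeting": 18, "Approaching": 6, "Not Meeting": 1},
    {"Course #": "GER 201", "Level": "200-level", "Exceeding": 4, "Meeting": 11, "Approaching": 5, "Not Meeting": 3},
]
df = pd.DataFrame(rows)
categories = ["Exceeding", "Meeting", "Approaching", "Not Meeting"]
df["# Students"] = df[categories].sum(axis=1)

# Sub-totals by course level, then an overall total.
subtotals = df.groupby("Level")[["# Students"] + categories].sum()
print(subtotals)
print("Total students:", df["# Students"].sum())
```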

4. Reviewing and Discussing Results and Closing the Loop
Important points of discussion:
- How are the data consistent? Do students at different course levels perform similarly? (Eventually, it will be possible to look at this issue over time)
- How are they distinctive? Do students perform better on some objectives than on others?
- Are the assessment results acceptable to the program's faculty?
- What strengths (and weaknesses) are revealed, and what explains them?
- What should (and will) the program do to improve areas of weakness?

Incorporating Assessment Data into the Annual Report
At present, directions for the Annual Report include the following: "Descriptions of student learning outcome assessments conducted for your students during the year. These descriptions should include a summary of the assessment methods that were used and the results, including benchmarking as appropriate. Also, whenever possible, detail course-specific student learning outcomes and their relationship(s) to departmental or programmatic expectations."

Incorporating Assessment Data into the Annual Report (cont.)
APAC's recommendation to the Provost and Deans is that Annual Reports include two distinct components with respect to the assessment of student learning, effective June 2012:
- A chart that summarizes the assessments that were done during the academic year and their results
- A narrative that summarizes the program's discussion of the assessment results, overall conclusions, and planned changes based on the assessment data
Programs may also report other relevant data (e.g., results from senior/alumni surveys)

Sample Outcomes Assessment Chart – Aggregate Data: Simplest Case (and All That's Required)

Courses | Assessment Measure(s) | Performance Criteria                                    | # of Students | % Exceeding | % Meeting | % Approaching | % Not Meeting
All     | Portfolio             | 4-5=Exceeding; 3=Meeting; 2=Approaching; 1=Not Meeting  | 338           | 18%         | 59%       | 13%           | 10%

Sample Outcomes Assessment Charts – More Detailed: By Course Level

Courses   | Assessment Measure(s) | Performance Criteria                                                            | # of Students | % Exceeding | % Meeting | % Approaching | % Not Meeting
100-Level | Tests                 | 90%+=Exceeding; 80-89%=Meeting; 70-79%=Approaching; 69% and below=Not Meeting   | 203           | 18%         | 59%       | 13%           | 10%
200-Level | Papers                | 4=Exceeding; 3=Meeting; 2=Approaching; 1=Not Meeting                            | 78            | 18%         | 59%       | 13%           | 10%
200-Level | Tests                 | 90%+=Exceeding; 80-89%=Meeting; 70-79%=Approaching; 69% and below=Not Meeting   | 112           | 18%         | 59%       | 13%           | 10%
300-Level | Portfolios            | 4-5=Exceeding; 3=Meeting; 2=Approaching; 1=Not Meeting                          | 96            | 18%         | 59%       | 13%           | 10%

Sample Outcomes Assessment Charts – More Detailed: By Competency Level

Courses      | Assessment Measure(s) | Performance Criteria                                                            | # of Students | % Exceeding | % Meeting | % Approaching | % Not Meeting
Beginning    | Tests                 | 90%+=Exceeding; 80-89%=Meeting; 70-79%=Approaching; 69% and below=Not Meeting   | 203           | 18%         | 59%       | 13%           | 10%
Intermediate | Papers                | 4=Exceeding; 3=Meeting; 2=Approaching; 1=Not Meeting                            | 78            | 9%          | 61%       | 20%           | 10%
Intermediate | Tests                 | 90%+=Exceeding; 80-89%=Meeting; 70-79%=Approaching; 69% and below=Not Meeting   | 112           | 11%         | 54%       | 25%           | 10%
Advanced     | Portfolios            | 4-5=Exceeding; 3=Meeting; 2=Approaching; 1=Not Meeting                          | 96            | 15%         | 54%       | 18%           | 13%
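The charts above turn raw scores into the four reporting categories using a priori criteria. A minimal sketch of that conversion, using the test criteria shown in the charts (90%+ = Exceeding, 80-89% = Meeting, 70-79% = Approaching, 69% and below = Not Meeting) and hypothetical scores:

```python
from collections import Counter

def categorize(score):
    """Apply the a priori test criteria from the sample charts."""
    if score >= 90:
        return "Exceeding"
    if score >= 80:
        return "Meeting"
    if score >= 70:
        return "Approaching"
    return "Not Meeting"

# Hypothetical test scores for students in the assessed sections.
scores = [95, 88, 84, 72, 65, 91, 79, 83, 58, 77]
counts = Counter(categorize(s) for s in scores)

for category in ["Exceeding", "Meeting", "Approaching", "Not Meeting"]:
    print(f"% {category}: {100 * counts[category] / len(scores):.0f}%")
```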

Issues to Be Addressed in the Narrative
- Overall conclusions about student performance on each SLO as revealed by the assessment
- Description of strengths and weaknesses revealed by the data
- Planned revisions for improvement, as appropriate
- Planned revisions to the assessment process itself, as appropriate

APAC Members: Paul French, Josh Hammonds, Michael Koch, Julie Licata, Richard Lee, Patrice Macaluso, Anuradhaa Shastri, Bill Wilkerson (Chair), Patty Francis (ex officio)