Introductions Institutional Effectiveness,


1 Introductions
Institutional Effectiveness, System Office of Decision Support
Christopher Combie, Ph.D., Assistant Director for Assessment, Institutional Effectiveness
Programs conduct internal assessments
Plans/reports submitted via SAM database
Reviewed, analyzed, and compiled by IE
Used in reports for Deans, Provost, BOG, SACSCOC
(Speaker note: We are fundamentally ill-prepared to lead, as most of us were and are professors at heart. We know how to generate new knowledge and effectively communicate it to our students, or how to approach complex problem solving through our research, but…)
Academic Assessment Workshop, May 8, 2018

2 Compliance vs. Quality
Compliance: Enforced by IE to ensure compliance with SACSCOC and BOG requirements
Quality: Fostered by ATLE to make assessment more meaningful

3 Academic Assessment Outline
Academic assessment plans and reports
Components and requirements of an assessment plan
System for Assessment Management

4 Academic Assessment Plans & Reports
Annual report for each major and certificate program
Program learning goals and student learning outcomes
Plans for future program improvement
Formative assessment of the major or certificate program, not summative assessment of students
Non-comprehensive
Non-linear
Program-centric
Looking for opportunities for improvement
Continuous

5 Components of an Assessment Plan in SAM
Mission statement of program or department
Program goals (PGs)
Student learning outcomes (1 or more per PG):
Student learning outcome statement
Method of assessment
Performance targets
Results
Use of assessment results for program improvement

6 Components of an Assessment Plan Schema
Mission → Program Goals → Student Learning Outcome → Learning Outcome Statement → Method of Assessment → Performance Targets → Assessment Results → Use of Assessment Results

7 3-Year Academic Program Assessment Rubric* Overview
Year 1: Develop and pilot test the assessment plan. Can assessment results be meaningfully interpreted? If not, revise the assessment plan.
Year 2: Implement the assessment plan, interpret assessment results from years 1 and 2, and develop an action plan for improving curriculum or instruction.
Year 3: Document implementation of the curricular or pedagogical changes outlined in the action plan for improvement; begin development of the next 3-year plan.
SACSCOC requires that we provide evidence of seeking improvement based on analysis of assessment results.
*Please refer to the rubrics in your packet.

8 Certificates
Three possibilities apply to certificates:
1. If ALL of the students in the certificate program are non-degree-seeking, non-financial-aid students, you can state that in SAM and do not have to complete a program assessment.
2. If even one degree-seeking student who receives financial aid is enrolled in the certificate program, you must provide an assessment (some certificates can be taken as electives in a degree-seeking program, which is where this may come into play).
3. If your certificate is a subset of a full degree program (the classes in the certificate are really just a portion of a full degree program), your certificate assessment can be identical to, or a subset of, the assessment reported for the degree program.
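The three certificate cases above reduce to a simple decision rule. A minimal sketch follows; the function name, inputs, and return strings are illustrative only and are not part of SAM:

```python
def certificate_assessment_required(all_nondegree_no_aid: bool,
                                    subset_of_degree: bool) -> str:
    """Return the assessment obligation for a certificate program.

    all_nondegree_no_aid: True if every enrolled student is
        non-degree-seeking and receives no financial aid.
    subset_of_degree: True if the certificate's courses are just a
        portion of a full degree program's curriculum.
    """
    if all_nondegree_no_aid:
        # Case 1: state this in SAM; no program assessment needed.
        return "no assessment required; note status in SAM"
    if subset_of_degree:
        # Case 3: may reuse all or part of the degree program's assessment.
        return "assessment may reuse the degree program's assessment"
    # Case 2: at least one degree-seeking, aid-receiving student enrolled.
    return "full program assessment required"
```

Note that case 1 is checked first: only when at least one degree-seeking, aid-receiving student is enrolled does the subset question matter.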

9 The Phases of the Academic Reporting Cycle (Note: for all colleges except Education and Business)
Important notes:
Changes to your plan can be made at any time.
Reviews by IE are intended to be helpful – this is a conversation.
Adapted from Rodriguez & Frederick, Miami Dade College (2013)

10 Academic Assessment Reporting
Academic Reporting Cycle Dates (Note: for all colleges except Education and Business)

Plan Year | Plan Due Date | Plan Reviewed* | Final Report Date | Report Reviewed*
2018      | August 31     | September 14   | December 15       | January 9, 2019
2019      | January 31    | February 14    |                   | January 9, 2020
2020      |               |                |                   | January 9, 2021

*Only plans submitted prior to the due date

11 How does USF organize academic program assessment plans?
“SAM”: the System for Assessment Management
Database maintained by Institutional Effectiveness
Available online with NetID/USF password

12 Accessing the System for Assessment Management
The pink (or orange) paper provided in your training materials has detailed instructions for logging in.
Recommendation: once you get to the login page, bookmark it.

13

14 How do you know what your plan was rated?

15 Overall Plan Rating Scale*
If a plan receives a yellow or red overall rating, we will mark it as “non-compliant.”
Programs will be notified and asked to make changes to sections that still need clarification or modification.
Once the requested adjustments have been made, the “Plan complete?” setting should be changed to “Yes,” or the plan resubmitted via “Submit Plan.”
*Disclaimer: these changes are in beta testing and may differ in the live production environment.

16 Concluding Points
The academic assessment is formative of the program, not summative of the students.
It is not a comprehensive assessment of all program efforts; make it manageable.
It is not linear; if one area has met its performance target and is doing well, move on to another area.
Brainstorm possible area(s) of improvement to tackle over the next three years.

17 Transition
From a macro-level view of assessment to a micro-level view of assessment

18 Program Goals vs. Student Learning Outcomes
Program Goals
Not always measurable
Broad statements
Skills, content, knowledge, or tasks students will have
Example: “Development of critical thinking skills”
Undergraduate BOG requirements (ALC): content-specific knowledge, critical thinking skills, communication skills
Graduate degrees & certificates: no ALC requirement
Student Learning Outcomes
A demonstrable skill; a measurable change in behavior
Specific statements
What students will know and demonstrate after the course(s)
Example: “Majors in this program will be able to conduct original research in this discipline using appropriate methods.”
Includes five detailed parts

19 Method of Assessment Section
Should be the most detailed section under the student learning outcome
Less flexible, with several requirements for what must be included
Examples of student work to assess:
Written student work
Student presentation/performance
Portfolio of student work
Part of a laboratory report
Embedded test questions
Part of a thesis or dissertation
Standardized tests
Internship/practicum evaluation

20 Method of Assessment: Requirements
A clear description of the assessment method
A statement detailing how the assessment specifically measures the task, information, or competency stated in the student learning outcome
The context of the assessment
How the assessment will be scored
Which students in the program will be assessed

21 Method of Assessment: Requirements (continued)
If a sample of student work will be analyzed in lieu of all students, include information on the sample
If employing a rubric, information is needed on the rubric used and how it was developed and validated:
How inter-rater reliability will be addressed
How the rubric will be calibrated
Who will be reviewing and rating the samples of student work
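One simple way to address the inter-rater reliability requirement above is to have two raters score the same sample of student work and report their percent agreement. A minimal sketch, assuming two equal-length lists of rubric scores (the function and data are illustrative, not part of SAM; more rigorous statistics such as Cohen's kappa also exist):

```python
def percent_agreement(rater_a, rater_b):
    """Fraction of sampled student works on which two raters assign
    the same rubric score -- a rough inter-rater reliability check."""
    if len(rater_a) != len(rater_b) or not rater_a:
        raise ValueError("need two equal-length, non-empty score lists")
    matches = sum(1 for a, b in zip(rater_a, rater_b) if a == b)
    return matches / len(rater_a)

# e.g. two raters scoring six sampled papers on a 1-5 rubric
agreement = percent_agreement([4, 3, 5, 2, 4, 4], [4, 3, 4, 2, 4, 5])
```

Low agreement would suggest the rubric needs further calibration before the scores are used as assessment results.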

22 Method of Assessment: Inappropriate Assessments
Grades CANNOT be used as outcomes (summative/auditive assessment of student work, not formative/educative assessment of program work)
Summative student assessments (passing or failing a culminating exam, course, or program) CANNOT be used (summative vs. formative)
Student/faculty/alumni surveys CAN be supporting evidence, but not the only measure (indirect measures)
Publication in an academic journal CANNOT be used as a student learning outcome (external factor)
Job placement CANNOT be used (external factor)

23 Performance Targets: Target for Measured Program Performance
A SHORT statement of what benchmark(s) the program wants to meet
This is not how we think our students will perform, but what we want our program to achieve
Should be worded in terms of performance on the rubric or rating instrument
Example: “The performance target will be considered met if 75% of students achieve an overall score of 4 out of 5 or higher.”
Example: “The performance target will be considered met if 80% of students assessed receive a final score of ‘Commendable’ or higher.”
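A performance target worded this way is just a proportion check against a rubric cutoff. A minimal sketch (the function name, parameters, and scores are illustrative, not part of SAM):

```python
def target_met(scores, cutoff=4, proportion=0.75):
    """Check a performance target of the form: met if at least
    `proportion` of assessed students score `cutoff` or higher."""
    if not scores:
        raise ValueError("no assessment results to evaluate")
    at_or_above = sum(1 for s in scores if s >= cutoff)
    return at_or_above / len(scores) >= proportion

# 8 of 10 students scored 4/5 or higher: 80% >= 75%, so the target is met
target_met([5, 4, 4, 3, 5, 4, 4, 2, 5, 4], cutoff=4, proportion=0.75)
```

The same check works for a qualitative rubric (e.g. “Commendable or higher”) once the rating levels are mapped to numbers.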

24 Assessment Results
A SHORT statement of the final results of the assessment(s)
Stated in terms of the rubric or assessment instrument
Should NOT include interpretation of the results
Do not be afraid to report that the target was not met (the focus is on finding strengths and weaknesses)

25 Use of Assessment Results: “Dos”
Interpretation and analysis of the results data
Should include specific, actionable “next steps” the program will take to develop or improve on a programmatic level, for example:
Curriculum mapping
Revisiting or revising the assessment method/rubric
Revisions to the plan of study/curricular offerings
Development of new modules/courses
Faculty development

26 Use of Assessment Results: “Don’ts”
This is not an assessment report of student achievement.
Do not include: tutoring efforts, sending students to the writing center, counseling students, providing students feedback, or improved recruitment.
Remember: the responsibility is on the program, not its students. The use of results section should indicate programmatic improvements.

27 Questions? PLEASE CONTACT US ALONG THE WAY! We are here to help!
IE 813/ ATLE 813/

