1 Strategies for Implementing Program-Level Assessment through Outcomes
Jeremy Anderson, Manager, Academic Computing
Dr. Kaitlin Walsh, Instructional Designer/Technologist
American International College

2 About American International College

3 Who are you? Faculty, staff, administration?
What is your experience using Outcomes?
– No experience
– Heard of it
– Investigated it
– Adopted it
Three burning questions?

4 Outcomes Overview


8 Outcome → Course → Assignment → Rubric → Program Assessment (example outcome: Mature Students)

9 The Challenge

10 Program Assessment Needs to Be:
– Systemic
– Sustainable
– And…

11 Systematic

12 Team-Based Approach

13 Assessment at AIC
– President
– Provost
– Institutional Effectiveness
– Deans & Departments
– Senate Assessment Committee
– Faculty Secretary*
– EVP Administration
– IT

14 Working Across the Organization: Assessment Committee → Working Group → Pilot, Pilot

15 Methods to Scale - Preparation

16 Getting the Framework in Place
– External Feedback: Other adopters; Bb Consulting
– Internal Planning: Assessment Plan Template; Program Review Schedule; Assessment Calendar

17 Assessment Plan Template

18 Assessment Calendar
– Fall: Develop outcomes; Assessment Plan & Bb Outcomes
– Spring: Gather Evidence & Evaluate
– Sum/Fall: Analyze & Discuss
– Next AY: Improve Instruction

19 Institutional Support
– Assessment Days (2)
– Assessment Hour (1)
– ½ position to support Outcomes

20 Methods to Scale - Adoption

21 Questions for New Adopters – Readiness Check
1. Do you have an assessment lead?
2. Are your goals ready?
3. Are your assignments ready?
4. Are your rubrics ready?
5. What is your assessment calendar?
   a. Frequency of each goal
   b. Frequency of full cycle

22 Questions for New Adopters – Operational Decisions
1. What is your collection period?
2. Who will complete evaluation sessions? How many must complete evaluations?
3. What sampling level will you require? (a sampling sketch follows below)
4. Will you need to keep samples of student work with your reports?
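Where sampling is used, the sampling level often comes down to a percentage of collected submissions drawn at random. A minimal sketch of that draw in Python; the file name and column names are hypothetical, and Blackboard Outcomes performs its own sampling internally, so this only illustrates the decision itself:

import csv
import random

SAMPLE_RATE = 0.25     # evaluate 25% of the collected evidence
random.seed(2014)      # fixed seed so the same sample can be re-drawn for the report

# submissions.csv is a hypothetical export: one row per collected assignment
with open("submissions.csv", newline="") as f:
    submissions = list(csv.DictReader(f))

# always evaluate at least one submission
k = max(1, round(len(submissions) * SAMPLE_RATE))
for row in random.sample(submissions, k):
    print(row["student_id"], row["assignment"])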

23 Starting Small
– Undergraduate Psych: All outcomes; 1 course; Capstone assignment; Independent evaluator; Department rubric
– MBA: 2 outcomes; 2 courses (1 o/c); Milestone assignments; Independent evaluators; AAC&U VALUE rubric; Department rubric

24 Training
– Introductory session for faculty and chairs in several prospective departments
– One-on-one training with individual assessment coordinators in pilot departments
– Reconvened for follow-up training
– Next step: documentation

25 Roles – What Can They Do?
– System Administrator – Alignments: Add/Remove; Goals: Create/Edit/Delete; Surveys: Create/Edit/Delete; Rubrics: Create/Edit/Delete; Evidence Collection: Create/Edit/Delete
– Assessment Administrator – Alignments: Add; Goals: Create/Edit/Delete; Surveys: Create/Edit/Delete; Rubrics: Create/Edit/Delete; Evidence Collection: Create/Edit/Delete
– Assessment Manager – Alignments: None; Goals: Run Reports; Surveys: None; Rubrics: None; Evidence Collection: Create/Edit/Delete
– Rubric Manager – Alignments: None; Goals: Read-Only; Surveys: None; Rubrics: Create/Edit/Delete; Evidence Collection: None
– Survey Author – Alignments: None; Goals: Read-Only; Surveys: Create/Edit/Delete; Rubrics: None; Evidence Collection: None
– Goals Manager – Alignments: None; Goals: Create/Edit/Delete; Surveys: None; Rubrics: None; Evidence Collection: None
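One way to sanity-check role assignments against the matrix above is to restate it as a lookup table. A sketch of that restatement in Python; this is our own encoding for planning purposes, not a Blackboard API, and the remaining roles follow the same pattern:

# Hypothetical encoding of the permissions matrix above.
# role -> {area: set of permitted actions}
CAN_DO = {
    "System Administrator": {"alignments": {"add", "remove"},
                             "goals":    {"create", "edit", "delete"},
                             "surveys":  {"create", "edit", "delete"},
                             "rubrics":  {"create", "edit", "delete"},
                             "evidence": {"create", "edit", "delete"}},
    "Rubric Manager":       {"goals":   {"read"},
                             "rubrics": {"create", "edit", "delete"}},
    "Goals Manager":        {"goals": {"create", "edit", "delete"}},
    # ... remaining roles transcribe the same way
}

def can(role, area, action):
    """True if the role's row in the matrix grants the action in that area."""
    return action in CAN_DO.get(role, {}).get(area, set())

assert can("Rubric Manager", "rubrics", "edit")
assert not can("Goals Manager", "surveys", "create")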

26 Roles – What Can They See?
– System Administrator – Admin Tab: Yes; Outcomes Tab: Yes; Outcomes Dashboard: All; Surveys: All
– Assessment Administrator – Admin Tab: Yes; Outcomes Tab: Yes; Outcomes Dashboard: All; Surveys: Only their own
– Assessment Manager – Admin Tab: No; Outcomes Tab: Yes; Outcomes Dashboard: Goals & Assessments, Organizations; Surveys: None
– Rubric Manager – Admin Tab: No
– Survey Author – Admin Tab: No; Surveys: Only their own
– Goals Manager – Admin Tab: Yes; Outcomes Tab: No

27 Caveat – Class and Program Size
AIC's class and program sizes are generally very small, so there is no need to limit the sample size when evaluating outcomes.
– There is also no real need to make evaluations anonymous; the instructor is often involved in the evaluation.
Small program sizes also affect outcomes planning:
– They limit the number of outside evaluators and assessment coordinators available.
Source: US News & World Report

28 Things We Considered (But Didn’t Adopt Yet)
– Help Desk ticket to submit Outcomes changes
– An Organization to distribute reports
– Having faculty input their own outcomes

29 General Education Outcomes

30 General Education at AIC
– Largely in flux
– Writing Intensive Courses
– Shared rubric for Title III
– AAC&U’s VALUE Rubrics

31 Pilot
– COM2200 – Information & Technology
– Writing Intensive Course
– 4 sections, 3 faculty, 2 evaluators

32 Next Steps
– Develop anchor set
– Train on the rubric
– Include other WICs

33 Next Steps & Lessons Learned

34 Building on What We Have
– New programs
– Assessment Hours throughout the year
– Expanding within departments
– ½-time position

35 Lessons Learned – Standardize Everything!

36 Lessons Learned – Rubric Design

37 More on Reports
Depending on the type of data you need, you may need to adjust your reports:
– Convert rows to columns (a pandas sketch follows below)
– Clean up for SPSS
– May be no problem for some faculty
– Need different data? Submit an enhancement request
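For the rows-to-columns step, a short pandas sketch; the export file and column names (student_id, outcome, score) are hypothetical, since the actual Outcomes report layout may differ:

import pandas as pd

# Long format: one row per student/outcome evaluation
long_df = pd.read_csv("outcomes_report.csv")

# Wide format for SPSS: one row per student, one column per outcome
wide_df = (long_df
           .pivot_table(index="student_id", columns="outcome",
                        values="score", aggfunc="first")
           .reset_index())

# SPSS variable names cannot contain spaces, so normalize the headers
wide_df.columns = [str(c).strip().replace(" ", "_") for c in wide_df.columns]
wide_df.to_csv("outcomes_wide.csv", index=False)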

38 Lessons Learned – Assignment Submission
Needed to train some faculty on collecting assignments in Bb:
– No direct submit
– No multiple attempts
– Group assignments not recommended

39 Unexpected benefit! Using outcomes rubrics motivated faculty to explore the use of Blackboard’s rubric tool within their own courses.

40 Questions? THANK YOU!
Jeremy Anderson – jeremy.anderson@aic.edu
Kaitlin Walsh – kaitlin.walsh@aic.edu

