Welcome! Session Recording

1 Welcome! Session Recording
Session Recording
This session will be recorded. If you would prefer to remain anonymous during the recording, you may want to change your display name to your department name.
To change your display name:
1. Click [Participants], located at the bottom of the meeting window.
2. Click [Rename].
3. Enter your preferred display name (e.g., your department name).
Audio Set Up
You will need to select the audio device(s) you want to use when listening and/or speaking during the session.
To set up your audio device(s):
1. Click the audio icon in the lower left corner of the meeting window.
2. Select your preferred microphone and speaker. A USB headset is recommended.
3. Mute your microphone when you are not speaking by clicking [Mute]. This will prevent unwanted audio feedback.

2 Academic Outcomes Assessment
Beth Wuest, Ph.D. Associate Vice President for Institutional Effectiveness

3 Workshop Goals
To write thoughtful assessment reports by developing the knowledge and skills to:
Draft Results – assess the extent to which outcomes are achieved and describe findings
Develop Action Plans – develop action plans with strong potential to improve results
Describe Improvements – provide evidence of improvement based on changes in results and action plans from previous years

4 SACSCOC Influence SACSCOC Standard STUDENT ACHIEVEMENT
The institution identifies expected outcomes, assesses the extent to which it achieves these outcomes, and provides evidence of actively seeking improvement based on analysis of the results in the areas below:
a. Student learning outcomes for its educational programs (Student outcomes and assessment: educational programs)
b. Student learning outcomes for collegiate-level general education competencies (Student outcomes and assessment: general education)
c. Academic and student services that support student success (Student outcomes and assessment: academic and student support)

5 General Expectations at Texas State
In fall, programs revisit Mission, Outcomes, and Methods. In spring, programs report Results, Action Plans, and Evidence of Improvement.

6 Results

7 [Assessment cycle diagram: Define/Revise Mission & Goals; Identify/Revise Outcomes; Select/Revise Methods of Assessment; Describe Evidence of Improvement; Create & Implement Action Plan; Assess Student Learning and Report Results; Develop/Revise Curriculum and Instruction]

8 Results
For each method, clearly describe the findings illustrating what students learned, based on valid and reliable data:
Summarize findings related to the outcome and method
Compare findings to the target
Compare this year's findings to last year's findings
Aggregate and disaggregate findings based on location and method of delivery
Provide sufficient detail in the summary of findings to suggest means for improvement

9 Results Example
During spring 2017, 50 undergraduate students in two sections of GEO 43xx were assessed using six embedded test questions to measure their knowledge of quantitative methods for geography. The instructors found that 82% of the undergraduate students met (n=29, or 58%) or exceeded (n=12, or 24%) expectations by demonstrating their knowledge of quantitative methods for geography, which exceeded our 80% target. The 18% of students who failed to meet expectations had the most difficulty with basic arithmetic and algebra skills. The online section enrolled 20 of the 50 students. Of the online students, 85% met (n=11, or 55%) or exceeded (n=6, or 30%) expectations, while of the 30 students enrolled in the face-to-face section, only 80% met (n=18, or 60%) or exceeded (n=6, or 20%) expectations. Given these findings, students should continue to improve their knowledge of basic arithmetic and algebra skills as prompted by the instructor. Instructors will closely examine the third embedded test question because students seemed to have the most difficulty answering it correctly. A related finding shows that students who failed to meet expectations had significant attendance problems. Overall, the results show an improvement over the previous spring's findings, in which 79.2% of GEO 43xx students met the expectation, and the 80% target was met for the first time in both the distance and face-to-face sections.
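The checklist that follows asks, "Do the numbers add up?" A minimal sketch (hypothetical Python, using the counts from the example above) shows one way to verify that the disaggregated section counts reconcile with the aggregate figure:

```python
# Hypothetical check that disaggregated section counts reconcile with the
# aggregate success rate reported in the Results example.
sections = {
    "online": {"enrolled": 20, "met": 11, "exceeded": 6},
    "face_to_face": {"enrolled": 30, "met": 18, "exceeded": 6},
}

total_enrolled = sum(s["enrolled"] for s in sections.values())
total_success = sum(s["met"] + s["exceeded"] for s in sections.values())

assert total_enrolled == 50  # matches the reported enrollment

overall_rate = 100 * total_success / total_enrolled
print(f"overall: {overall_rate:.0f}%")  # 82%, above the 80% target

for name, s in sections.items():
    rate = 100 * (s["met"] + s["exceeded"]) / s["enrolled"]
    print(f"{name}: {rate:.0f}%")  # online 85%, face-to-face 80%
```

A quick reconciliation like this catches transcription errors before the report is submitted.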

10 Checklist on Results
Are the results a reflection upon and discussion of the findings amassed from the corresponding method?
Are the results based on reliable and valid data collection methods? Do the numbers add up?
Are the results more than the reporting of grades? Do the results focus on student accomplishments and success?
Has the data been aggregated and disaggregated?
How do the results compare to those obtained last year?
Do the results address achievement of the outcome?
Do the results provide indicators for improvement?

11 Action Plan

12 [Assessment cycle diagram: Define/Revise Mission & Goals; Identify/Revise Outcomes; Select/Revise Methods of Assessment; Describe Evidence of Improvement; Create & Implement Action Plan; Assess Student Learning and Report Results; Develop/Revise Curriculum and Instruction]

13 Action Plans
For which methods did this year's results of student learning fall short of last year's results and/or not meet the expected target?
Describe actions that could be taken to improve the results (student learning):
What actions will be taken?
How are they likely to improve student learning?
Action plans are intended to:
Enhance the effectiveness of the program
Provide for efficient use of resources

14 Action Plans Example
Although College Algebra is a prerequisite for our quantitative methods course, many students continue to have difficulty with basic arithmetic and algebra. GEO 43XX instructors plan to spend significant time reviewing basic mathematical operations and techniques, and to continue doing so in the future as required. Instructors also plan to elaborate on a primary spatial problem using a sample data set that requires statistical analysis in either Excel or SPSS. With this added instruction and experience working through a sample problem in both the face-to-face and distance-delivered courses, students should more readily apply the quantitative methods targeted in Outcome 3, Method 1 and consistently meet the 80% target.

15 Checklist for Action Plans
Do results suggest actions? How can results be improved?
Are action plans based on the findings reported in the results?
Is it likely that the actions will lead to an improvement in student learning?
Are action plans feasible/realistic considering available time and resources?

16 Evidence of Improvement

17 [Assessment cycle diagram: Define/Revise Mission & Goals; Identify/Revise Outcomes; Select/Revise Methods of Assessment; Describe Evidence of Improvement; Create & Implement Action Plan; Assess Student Learning and Report Results; Develop/Revise Curriculum and Instruction]

18 Evidence of Improvement
For which methods did this year's results of student learning show measurable improvement over last year's results?
Describe those that showed the most significant improvement, as evidenced in comparative results:
Last year vs. this year
Last semester vs. this semester
Provide data to support statements

19 Evidence of Improvement Example
Assessment results of this year's six embedded questions for Outcome 3, Method 1 show that 82% of the students met or exceeded expectations, compared to 79.2% during the previous academic year – an improvement of 2.8 percentage points (about 3.5% in relative terms), due at least in part to a review-session activity created by the instructor as described in last year's action plan. Although both sections met the target, students in the online section performed better (85% met or exceeded expectations) than those in the face-to-face version of the course (80% met or exceeded expectations).
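When reporting a gain like the one above, it helps to distinguish a percentage-point change from a relative percentage change, since the two are easy to conflate. A small sketch (hypothetical Python, using the rates from the example) makes the difference explicit:

```python
# Hypothetical comparison of this year's vs. last year's success rates,
# separating percentage-point change from relative change.
last_year = 79.2   # percent meeting or exceeding expectations last year
this_year = 82.0   # percent meeting or exceeding expectations this year

point_change = this_year - last_year              # 2.8 percentage points
relative_change = 100 * point_change / last_year  # ~3.5% relative gain

print(f"{point_change:.1f} percentage points, {relative_change:.1f}% relative")
```

Stating which measure is meant keeps comparative results verifiable by a reader checking the arithmetic.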

20 Checklist on Evidence of Improvement
Were improvements noted?
Do improvements focus on student learning?
Can the evidence of improvement be verified in the results?
How do improvements relate to the previous action plan?

21 General Tips on Writing Results

22 Are information and data presented accurately? Do the numbers make sense?
Have the sections been aggregated and disaggregated for location (San Marcos vs. Round Rock) and method of delivery (face-to-face vs. distance education)?
Would the report make sense to someone who is not in your program or field of study?
Has the report been proofread?
Will changes in outcomes or methods be needed for future assessment cycles?

23 Negative Examples
Targets were met in most cases.
Student learning improved.
More students responded positively than in the previous year.
Lowering targets resulted in better achievement levels.
Positive Example
Low success rates on Outcome #3 last year (67% answered embedded test questions correctly) led to doubling the class instructional time devoted to orbital mechanics in selected class sections. The percentage answering correctly rose to 86% this year in those classes and remained about the same (68%) in unchanged classes. Extended instructional time will continue next year to determine whether the results are repeatable.

24 Outcomes Assessment Resources
Outcomes Assessment Website with Process Information
Outcomes Assessment Data Repository for Entering or Reviewing Reports

