Slide 1: Going Beyond Grades in Assessing Student Learning
Slides: www.zogotech.com/innovations
Dr. Blaine Bennett, Richard (Dick) Whipple, Michael Taft
Slide 2: Outline
- About SWTJC
- Review of traditional analysis (completion, grades, placement scores, etc.)
- Learning outcomes: Find It, Assess It, Improve It (close the loop)
- Questions
Speaker note (Hector, BB): Welcome and explain who we are, our recent growth, our SACS issues, and ATD (Achieving the Dream).
Slide 3: About SWTJC
- Large, rural service area: 11 counties (16,769 square miles)
- Hispanic-Serving Institution (HSI); ~6,000 enrollment
- 70+% of FTIC (first-time-in-college) students need remediation
- Achieving the Dream: developmental education, gateway courses, culture of evidence
(BB)
Slide 4: Background: SWTJC Student Data Warehouse
Speaker note (BB → DW): Lots of data; give the audience an opportunity to discuss other data elements (quantitative and qualitative).
Slide 5: Traditional Learning Analysis
Grades + a 2.0 GPA = "learning." "Those were the days!"
Speaker note (DW): Discuss the old way of determining whether students are learning, and new state mandates (access to success).
Slide 6: Previous Work: Course Completion (DW)
Slide 7: Previous Work: Gateway Course Completion
The chart shows the correlation between gateway course completion and graduation: within 4 terms of completing college-level math (green), 30% of students have graduated, almost double the rate for the reading and writing gateway courses. (DW)
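The 4-term graduation metric behind the chart above can be sketched as follows. This is a hypothetical illustration, assuming terms are stored as sequential integers; the actual warehouse schema and query will differ.

```python
# Sketch: for students who completed a gateway course, what fraction
# graduated within 4 terms of that completion? Record shape is
# illustrative, not the real SWTJC warehouse schema.

TERM_WINDOW = 4

def grad_rate_within_window(records, window=TERM_WINDOW):
    """records: list of (gateway_completion_term, graduation_term or None),
    with terms as sequential integers.
    Returns the fraction of completers who graduated within `window` terms."""
    completed = [r for r in records if r[0] is not None]
    if not completed:
        return 0.0
    graduated = sum(
        1 for done, grad in completed
        if grad is not None and grad - done <= window
    )
    return graduated / len(completed)
```

Running this per gateway subject (math vs. reading vs. writing) would reproduce the comparison the slide describes.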
Slide 8: Previous Work: Test Scores
Looking at class and test history alone, it is impossible to see where a student is in the dev-ed sequence. (DW)
Slide 9: Previous Work: Test Scores + Grades = Subject Levels
Incorporate test scores and grades into a single "level" that is easy to understand and analyze. Test history + class history → bottom line. (DW, BB)
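One way to collapse placement scores and course grades into a single subject level is sketched below. The level bands, score cutoffs, and the promotion rule are illustrative assumptions, not SWTJC's actual dev-ed sequence.

```python
# Sketch: normalize a placement-test score and completed coursework
# into one subject "level". Cutoffs are hypothetical.

def level_from_test(score, cutoffs=(350, 336, 310)):
    """Map a placement score to a level: 3 = college ready,
    2 and 1 = developmental levels, 0 = lowest."""
    for level, cutoff in zip((3, 2, 1), cutoffs):
        if score >= cutoff:
            return level
    return 0

def subject_level(test_score, passed_course_level=None):
    """Assumed rule: passing a course at level N promotes the student
    to level N+1; otherwise use the test-based placement."""
    test_level = level_from_test(test_score)
    if passed_course_level is not None:
        return max(test_level, passed_course_level + 1)
    return test_level
```

The point of the single number is the slide's "easy to understand, analyze": advisors and analysts compare one level per subject instead of raw transcripts plus test histories.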
Slide 10: Learning Indicators
Direct:
- Grades (GPA)
- Test data
- Placement
- Completion (dev-ed/gateway courses, core, certificate/degree)
- Subject levels (normalized placement)
- Academic status (probation, suspension, PTK, near graduation, LHF)
- Outcome assessment data
Indirect (affects learning):
- Demographic and personal data
- Attendance ("99% of success is showing up")
- Student activities and support services (engagement: advisement, tutoring)
- Enrollment patterns
- External factors (financial, job, family, etc.)
(DW)
Slide 11: Find It, Assess It, Improve It
SACS expectations (3.3.1): "The institution identifies expected outcomes, assesses the extent to which it achieves these outcomes, and provides evidence of improvement based on analysis of the results in each of the following areas: educational programs, to include student learning outcomes."
Find It, Assess It, Improve It: close the loop.
Speaker note: Dick starts here. (DW)
Slide 12: Outline
- Find It: identify expected outcomes
- Assess It: integrate multiple assessment methods
- Improve It / Close the Loop: use detailed student information to improve learning outcomes
(DW)
Slide 13: Identify Outcomes (Find It)
(C.S ) Identifies expected outcomes.
- Academic departments developed program outcomes, e.g. General Studies: "Reason quantitatively as well as verbally."
- Identify course outcomes as they relate to program outcomes.
Speaker note: Dick presents. (DW)
Slide 14: Outcomes: College Algebra
- Define and interpret functions, including linear, quadratic, higher-degree polynomial, absolute value, radical, exponential, and logarithmic
- Use the graph and other methods to determine the domain and range of a function
- Solve equations and inequalities of various types, with particular emphasis on linear and quadratic polynomials
- Perform basic operations on functions, including arithmetic combinations, composition/inverse, and transformations of their graphs
- Use properties of logarithmic and exponential expressions in graphing and solving
- Apply exponential functions to model exponential growth and decay
- Use graphing-calculator features to determine algebraic and graphical results
- Use the fundamental theorem of algebra
- Use algebraic concepts to solve real-world applications
Note: outcomes developed by faculty. (DW)
Slide 15: Assess Outcomes (Assess It)
Assesses expected outcomes:
- Acquired Prosper (Scantron test-management system) to collect and report outcome-assessment data
- Developed custom forms using objective and subjective answer responses
- Used the forms to create pre- and post-assessments for dev-ed and gateway courses
- Created assessment-based "grades" to better reflect true student learning
(DW)
Slide 16: Develop Outcome Assessment
- Develop questions for each outcome (e.g. 4 questions per outcome)
- Enter questions and answers into Prosper
- Link questions to outcomes
- Create assessments from the question pool
- Establish an outcome mastery level (e.g. 75%: 3 of 4 questions for the related outcome answered correctly)
- Score assessments and store results in the central Prosper database for immediate access
- Upload results to Estudias (student data warehouse)
Speaker note (DW, MT): More often than not, grades and assessment do not line up.
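The mastery rule above (75%, i.e. 3 of 4 linked questions correct) can be sketched as a small scoring function. The question-to-outcome mapping and result format below are illustrative, not the actual Prosper schema.

```python
# Sketch: score per-outcome mastery from per-question results.
# A student masters an outcome when at least 75% of the questions
# linked to that outcome are answered correctly (e.g. 3 of 4).

MASTERY_THRESHOLD = 0.75

def outcome_mastery(results, question_outcomes):
    """results: {question_id: True/False correct}
    question_outcomes: {question_id: outcome_id}
    Returns {outcome_id: mastered (bool)}."""
    totals, correct = {}, {}
    for qid, outcome in question_outcomes.items():
        totals[outcome] = totals.get(outcome, 0) + 1
        if results.get(qid, False):
            correct[outcome] = correct.get(outcome, 0) + 1
    return {o: correct.get(o, 0) / totals[o] >= MASTERY_THRESHOLD
            for o in totals}
```

Storing the per-outcome booleans rather than only a total score is what makes the later outcome-by-outcome and faculty-by-faculty comparisons possible.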
Slide 17: Data Flow
Speaker note (MT): Key points: automatic downloads, no additional data entry. More often than not, grades and assessment do not line up.
Slide 18: Prosper Answer Form
Objective and subjective (e.g. writing samples) responses captured. (MT)
Slide 19: Assess It: Captured Images, Writing for Academic Success
[Objective and subjective captured images shown]
Speaker note: CGarabedian discusses the Prosper form and WAS.
Slide 20: Outcome Analysis (Demo)
MATH 1314 (College Algebra) outcomes. (MT)
Slide 21: Outcome Analysis (Demo): One Outcome Across Faculty Members
We can see great differences in how students did on one particular outcome across faculty members. (MT)
Slide 22: Improved Programs & Learning (Institution)
College Algebra final grade vs. post-test assessment (all sections). Ideally, the peaks should line up. (DW)
Slide 23: Improved Programs & Learning (Faculty)
College Algebra final grade vs. post-test assessment (single section). Realistic result: assessment lags the grade; the goal is better alignment between grades and assessment. (DW)
Slide 24: Identify Outliers
Once assessment scores are in the data warehouse, we can compare how students did on the post-assessment test with the final letter grade they received in the class. One would expect them to be similar, but:
- In theory, most students should fall on the two diagonals shown.
- Students off the diagonals (outliers) offer possibilities for interventions.
- 21% of the students who got a C on the post-test received an A in the class. Students who work hard but are poor test takers?
- 4% of the students who got an A on the assessment got a C in the class. Students who know the material but are bored with the class?
(DW)
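The off-diagonal check described above can be sketched by mapping both measures onto one letter-grade scale and flagging large gaps. The grade bands, score cutoffs, and record fields are illustrative assumptions, not the warehouse's actual fields.

```python
# Sketch: flag students whose final grade and post-test band differ
# by more than one letter step (the "off-diagonal" outliers).

GRADE_POINTS = {"A": 4, "B": 3, "C": 2, "D": 1, "F": 0}

def letter_from_score(pct):
    """Convert a post-test percentage to a letter-grade band
    (hypothetical 90/80/70/60 cutoffs)."""
    for cutoff, letter in [(90, "A"), (80, "B"), (70, "C"), (60, "D")]:
        if pct >= cutoff:
            return letter
    return "F"

def find_outliers(students, tolerance=1):
    """students: list of (name, final_grade, post_test_pct).
    Returns (name, direction) for students more than `tolerance`
    letter steps off the diagonal."""
    outliers = []
    for name, grade, pct in students:
        gap = GRADE_POINTS[grade] - GRADE_POINTS[letter_from_score(pct)]
        if abs(gap) > tolerance:
            direction = "grade>assessment" if gap > 0 else "assessment>grade"
            outliers.append((name, direction))
    return outliers
```

Each direction suggests a different intervention, matching the slide: "grade>assessment" may be a hard worker who tests poorly; "assessment>grade" may be a student who knows the material but disengaged from the class.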
Slide 25: So what does all of this have to do with Aaron & Suzanna?
Aaron Abel, Suzanna Alvarez. (BB)
Slide 26: Student Dashboard (Improve It / Close the Loop)
(BB)
Slide 27: Demo
Speaker note: Get groups and flip through students. Good group: close to graduation. Bad: not attending. Show a good outlier and a bad outlier. Groups are based on assessment.
Slide 28: Conclusion
- Don't forget about individual students in institutional analysis (student information should drive institutional information).
- Multiple data sources are needed to analyze beyond grades (e.g. outcomes assessment, attendance, pre/post testing, NSC).
- A data warehouse (e.g. ZogoTech) can help.
Slide 29: Questions
Southwest Texas Junior College:
- Blaine Bennett, (830)
- Dick Whipple, (830)
- Hector Gonzales
ZogoTech:
- Michael Taft, (214) x801