1
General Education Assessment of University Goal 2: Knowledge of the Physical and Natural World
John Jaszczak, Department of Physics and Goal 2 Committee Chair
Friday, May
2
Goal 2: Knowledge of the Physical and Natural World
2.1 Scientific Knowledge
2.2 Quantitative Problem Solving
2.3 Interpretation of Mathematical Representations
2.4 Assumptions (Quantitative Literacy)
2.5 Data Analysis
2.6 Proposes Solutions/Models/Hypotheses
3
Challenges: Broad Participation Across Many Departments
Mathematics (as well as Psychology and Business)
Physical Sciences (Biology, Chemistry, Kinesiology, Physics, Social Science)
"within current discipline-specific frameworks"
Most participating programs were not used to formally conducting assessment of student learning outcomes.
AAC&U had no science-related VALUE* rubric.
Assessment methodology?
*Valid Assessment of Learning in Undergraduate Education
4
Goals
Work closely with faculty:
Think about the rubric
Thoughtfully develop assignments for assessment
Take advantage of what they are already doing
Let data drive changes to the assessment process, assignments, and instruction
5
Process
Formulate Initial Rubric
Pilot Assessment of Student Work in Committee
Host a "Coffee Chat" Workshop with Goal 2 Instructors
Large-Scale Pilot, Spring 2016: Communicate & Solicit Student Work
Team of Assessors Evaluated Student Work (Norming/Assessing/Debrief)
Report Findings to Assessment & Gen Ed Councils & Instructors
6
What Outcomes to Assess? http://www.mtu
Goal 2: Knowledge of the Physical and Natural World
7
What Did We Collect? Sampled Student Work
Final Exams
Homework Assignments
Laboratory Reports
Scantron Data from Large-Enrollment Courses (raw data; instructors supply mapping of questions to rubric criteria)
8
What Did We Learn?
Student work needs to be graded (with rubric & comments).
Instructors need to identify relevant Goal 2 rubric criteria.
Entire exams/homework/laboratory reports are better replaced by select (intentionally designed) questions/sections.
Small sample sizes may limit utility.
Instructor- or department-level assessment and reporting may be most valuable. (Mathematical Sciences is experimenting now…)
The process needs champions and university-level support: motivating, reminding, explaining, handling data, coordinating meetings…
9
Two Examples
Scantron Assessment for University Physics 1: Mechanics (Spring 2015 – 638 students; Spring 2016 – 617 students)
Instructor-Level Assessment and Reporting for College Physics 1
10
Scantron Assessment of Univ. Physics I
Comprehensive 40-question multiple-choice final exam
Not intentionally designed for Gen Ed assessment
Same primary instructor over several years
No sampling needed
Excel spreadsheet tool
13
PH2100: University Physics 1
First-semester, calculus-based physics covering Newtonian mechanics
Primarily for Engineering and Science majors
1-credit laboratory is a separate pre/co-requisite
Spring enrollment is typically 550 to over 700 students in 2 sections.
All exams are Scantron-graded, multiple-choice.
14
Raw Data: Individual Student Answers to Each Question
15
Answers Compared to Key
Each answer cell is scored against the key row: =IF(Answers!AL3=Answers!AL$2,1,0) returns 1 if the student's answer in cell AL3 matches the key held in row 2 of the same column, and 0 otherwise.
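Below is a minimal sketch, in Python, of the same scoring step outside the spreadsheet; the CSV layout (header row, key row, then one row per student) is an assumption for illustration, not the actual Excel tool.

import csv

def score_answers(path):
    # Row 0 is assumed to be a header, row 1 the answer key,
    # and each remaining row one student's answers (e.g., A-E) per question.
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    key, students = rows[1], rows[2:]
    scored = []
    for row in students:
        student_id, answers = row[0], row[1:]
        # 1 if the answer matches the key for that question, else 0
        scored.append((student_id, [1 if a == k else 0 for a, k in zip(answers, key[1:])]))
    return scored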
16
Criteria Matching
Instructors assign relevant goal criteria to each question.
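A minimal sketch of what such a mapping can look like in code, assuming each question is tagged with zero or more Goal 2 criteria and then expanded into a 0/1 weight row per criterion; the question numbers and tags below are illustrative, not the actual mapping.

# Turn a question-to-criteria mapping into 0/1 weight rows:
# one row per criterion, one column per exam question (illustrative tags only).
QUESTION_CRITERIA = {1: ["2.1"], 2: ["2.1", "2.2"], 3: ["2.1", "2.3"], 4: ["2.1", "2.2"]}
CRITERIA = ["2.1", "2.2", "2.3", "2.4", "2.5", "2.6"]

def weight_rows(question_criteria, criteria):
    questions = sorted(question_criteria)
    return {c: [1 if c in question_criteria[q] else 0 for q in questions] for c in criteria}

print(weight_rows(QUESTION_CRITERIA, CRITERIA)["2.2"])  # [0, 1, 0, 1]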
17
Outcomes Per Student
We are not interested in class averages but in the number of students achieving each goal criterion. EVERY student is evaluated on a % basis for every goal category.
=SUMPRODUCT(B3:AO3,'Criteria Weights'!$C$3:$AP$3)/'Criteria Weights'!$AR$3*100
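A minimal sketch of the per-student calculation that the SUMPRODUCT formula performs, assuming the 0/1 question scores and criterion weight rows from the sketches above; function and variable names are illustrative.

def category_percent(question_scores, criterion_weights):
    # question_scores: list of 0/1 values, one per exam question
    # criterion_weights: list of weights (e.g., 1 if the question maps to the criterion, else 0)
    total_weight = sum(criterion_weights)
    if total_weight == 0:
        return None  # criterion not covered by this exam
    weighted = sum(s * w for s, w in zip(question_scores, criterion_weights))
    return 100 * weighted / total_weight

# Example: a student who answered 3 of the 4 questions mapped to a criterion correctly
print(category_percent([1, 0, 1, 1], [1, 1, 1, 1]))  # 75.0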
18
2.1 Scientific Knowledge (38 questions)
Most questions deal with scientific knowledge, so 2.1 correlates nearly perfectly with the overall exam score.
Data and graphs shown are for illustrative purposes only.
19
2.1 Scientific Knowledge: Proficiency Levels (Levels 1–4; 38 questions)
In consultation with a Goal 2 committee representative, instructors assign proficiency levels to the % score for each goal category.
Data and graphs shown are for illustrative purposes only.
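A minimal sketch of how a goal-category percentage might be mapped to a proficiency level; the cut-off values below are placeholders, since the actual cut-offs are chosen by instructors in consultation with a Goal 2 committee representative.

# Map a goal-category percentage to a proficiency level using cut-offs.
CUTOFFS = [(80, 4), (60, 3), (40, 2), (0, 1)]  # (minimum percent, level); placeholder values

def proficiency_level(percent):
    for minimum, level in CUTOFFS:
        if percent >= minimum:
            return level
    return None

print(proficiency_level(72.5))  # 3 under these placeholder cut-offs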
20
2.2 Quantitative Problem Solving
20 questions. Data and graphs shown are for illustrative purposes only.
21
2.2 Quantitative Problem Solving: Proficiency Levels (Levels 1–4; 20 questions)
Data and graphs shown are for illustrative purposes only.
22
Topics for Further Discussion
Test robustness relative to the subjective judgements
Discernment of proficiency-level cut-offs: instructor-level expectations vs. general university-level expectations
Encourage more backwards design
Effecting improvement
Future considerations? Major, year, repeat status, math proficiency, drop rate, etc.
23
A Model of Instructor-Level Assessment: PH1110, Fall 2016
Mike Meyer, Director, William G. Jackson Center for Teaching and Learning
Friday, May
24
Class Description
Introductory, first-term, algebra-based physics course
56 students (one did not take the final exam)
Survey of mechanics, sound, fluids, and thermodynamics
25
Plan: Final Exam
Q1-Q8: Multiple choice (definitions, units, etc.; drawn from pools, so variable)
Q9-Q12 and Q29: Targeted subgoal questions (same questions for all students)
Numerical problems: drawn from pools by topic; values vary by student; initially graded as 100% right or wrong by computer; students then meet to discuss their work and receive partial credit
26
Scientific Knowledge
Goal 2.1:
Level 1: Score ≥ 50% (4/8) on the multiple-choice section of the final exam (Q1-Q8). 100% of students met this criterion.
Level 2: Score ≥ 75% (6/8) on the multiple-choice section of the final exam (Q1-Q8). 94% of students met this criterion.
27
Problem Solving/Modeling
Goals 2.2 and 2.6:
Level 1: Score ≥ 40% (160/400) on the problem portion of the exam after partial-credit follow-up. 96% (53/55) of students met this criterion.
Level 2: Score ≥ 60% (240/400) on the problem portion of the exam after partial-credit follow-up. 89% (49/55) of students met this criterion.
Level 3: Score ≥ 80% (320/400) on the problem portion of the exam after partial-credit follow-up. 51% (28/55) of students met this criterion.
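A minimal sketch of the tally behind percentages like those above; the thresholds mirror the levels listed (40/60/80% of 400 points), but the point totals are placeholders, not the actual class data.

# Count the share of students meeting each proficiency threshold on a scored section.
def students_meeting(scores, threshold):
    met = sum(1 for s in scores if s >= threshold)
    return met, len(scores), 100 * met / len(scores)

problem_scores = [390, 310, 250, 180, 120]  # placeholder point totals out of 400
for level, threshold in [(1, 160), (2, 240), (3, 320)]:
    met, total, pct = students_meeting(problem_scores, threshold)
    print(f"Level {level}: {met}/{total} students ({pct:.0f}%)")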
28
Problem #11 (2.3 and 2.5) Problem #12 (2.3 and 2.5)
29
Problem #29 (2.3)
30
Mathematical Representations
Goal 2.3:
Level 2: Score ≥ 60% on the graphing problems (#11, #12, and #29). 51% (28/55) of students met this criterion.
Level 3: Score ≥ 80% on the graphing problems (#11, #12, and #29). 25% (14/55) of students met this criterion.
31
Assumptions
Goal 2.4:
Level 1: Score ≥ 50% on the final exam. 95% of students met this criterion.
Level 2: Level 1 plus at least one of the two assumption problems (#9 and #10 on the final exam) correct. 53% of students met this criterion.
32
Data Analysis
Goal 2.5:
Level 2: At least one of the two graphing problems (#9, #10) correct. 98% of students met this criterion.
Level 3: Both graphing problems (#9, #10) correct. 51% of students met this criterion.
33
Analysis/Summary
The course seems to be accomplishing the Level 2 "Developing" goal for: 2.1 – Scientific Knowledge; 2.2 – Quantitative Problem Solving; 2.6 – Models/Hypotheses
Assessment might need work for: 2.4 – Assumptions; 2.5 – Data Analysis
Content area that likely needs more focus: 2.3 – Graphing
34
Instructor-Level Assessment Model
Intentionally designed student work meets the needs of both the course/instructor and the assessment process.
Subjective decisions are made at the root level.
Results are readily available for reflection and action.
Opportunities for discussion need to be fostered.
35
Contact information: Mike Meyer (CTL); John Jaszczak. Image credit: Johanne Bouchard.