General Education Assessment of University Goal 2: Knowledge of the Physical and Natural World
John Jaszczak, Department of Physics and Goal 2 Committee Chair
Friday, May 5, 2017

Goal 2: Knowledge of the Physical and Natural World
2.1 Scientific Knowledge
2.2 Quantitative Problem Solving
2.3 Interpretation of Mathematical Representations
2.4 Assumptions (Quantitative Literacy)
2.5 Data Analysis
2.6 Proposes Solutions/Models/Hypotheses

Challenges
- Broad participation across many departments "within current discipline-specific frameworks": Mathematics (as well as Psychology and Business) and the physical sciences (Biology, Chemistry, Kinesiology, Physics, Social Science)
- Most participating programs were not used to formally conducting assessment of student learning outcomes
- AAC&U had no science-related VALUE* rubric
- Assessment methodology?
*Valid Assessment of Learning in Undergraduate Education

Goals
- Work closely with faculty: think about the rubric, thoughtfully develop assignments for assessment, and take advantage of what they are already doing
- Let data drive changes to the assessment process, assignments, and instruction

Process
1. Formulate initial rubric
2. Pilot assessment of student work in committee
3. Host a "Coffee Chat" workshop with Goal 2 instructors
4. Large-scale pilot, Spring 2016: communicate and solicit student work; a team of assessors evaluated the work (norming, assessing, debrief)
5. Report findings to the Assessment and Gen Ed Councils and to instructors

What Outcomes to Assess? (http://www.mtu…)
Goal 2: Knowledge of the Physical and Natural World

What Did We Collect?
- Sampled student work: final exams, homework assignments, laboratory reports
- Scantron data from large-enrollment courses: raw data, with instructors supplying a mapping of questions to rubric criteria

What Did We Learn?
- Student work needs to be graded (with rubric and comments).
- Instructors need to identify the relevant Goal 2 rubric criteria.
- Entire exams, homework sets, and laboratory reports are better replaced by select, intentionally designed questions or sections.
- Small sample sizes may limit utility.
- Instructor- or department-level assessment and reporting may be most valuable. (Mathematical Sciences is experimenting now…)
- The process needs champions and university-level support: motivating, reminding, explaining, handling data, coordinating meetings…

Two Examples
1. Scantron assessment for University Physics 1: Mechanics (Spring 2015: 638 students; Spring 2016: 617 students)
2. Instructor-level assessment and reporting for College Physics 1

Scantron Assessment of Univ. Physics I
- Comprehensive 40-question multiple-choice final exam, not intentionally designed for Gen Ed assessment
- Same primary instructor over several years
- No sampling needed
- Excel spreadsheet tool

PH2100: University Physics 1
- First-semester, calculus-based physics covering Newtonian mechanics
- Primarily for engineering and science majors
- 1-credit laboratory is a separate pre/co-requisite
- Spring enrollment is typically 550 to over 700 students in 2 sections
- All exams are Scantron-graded, multiple-choice

Raw Data: Individual Student Answers to Each Question

Answers Compared to Key =(IF(Answers!AL3=Answers!AL$2,1,0))
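For readers working outside Excel, a minimal sketch of this key-comparison step in Python/pandas; the sheet layout (one row per student, one column per question, with the answer key in row 2 of the Answers sheet) is inferred from the formula above and is an assumption.

import pandas as pd

# Minimal sketch, assuming `answers` holds one row per student and one
# column per question, and `key` is the answer-key row.
def score_answers(answers: pd.DataFrame, key: pd.Series) -> pd.DataFrame:
    """1 where a student's answer matches the key, else 0 --
    the pandas analogue of =IF(Answers!AL3=Answers!AL$2,1,0)."""
    return (answers == key).astype(int)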

Criteria Matching Instructors assign relevant goal criteria to each question
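Continuing the sketch, the mapping can be as simple as a 0/1 weight per question for each goal category; the question labels and weights below are hypothetical, not the real exam's.

# Hypothetical criteria-matching fragment: for each Goal 2 category, a
# 0/1 weight per exam question (labels and values are illustrative).
criteria_weights = {
    "2.1": pd.Series({"Q1": 1, "Q2": 1, "Q3": 0}),  # Scientific Knowledge
    "2.2": pd.Series({"Q1": 0, "Q2": 1, "Q3": 1}),  # Quantitative Problem Solving
}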

Outcomes Per Student
We are interested not in class averages but in the number of students achieving each goal criterion. EVERY student is evaluated on a percentage basis for every goal category.
=SUMPRODUCT(B3:AO3,'Criteria Weights'!$C$3:$AP$3)/'Criteria Weights'!$AR$3*100
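In the same sketch, the SUMPRODUCT step amounts to a weighted per-student percentage; the 'Criteria Weights' layout (one weight per question per category, as in criteria_weights above) is an assumption based on the formula.

# Weight each question by its relevance to one goal category, sum each
# student's correct answers, and normalize by the total category weight.
def category_percent(scored: pd.DataFrame, weights: pd.Series) -> pd.Series:
    """scored: 0/1 matrix from score_answers();
    weights: one weight per question, e.g. criteria_weights["2.1"]."""
    return scored.mul(weights, axis=1).sum(axis=1) / weights.sum() * 100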

2.1 Scientific Knowledge (38 questions)
Most questions deal with scientific knowledge, so 2.1 correlates nearly perfectly with the overall exam score. (Data and graphs shown are for illustrative purposes only.)

2.1 Scientific Knowledge: Proficiency Levels 1-4 (38 questions)
In consultation with a Goal 2 committee representative, instructors assign proficiency levels to the % score for each goal category. (Data and graphs shown are for illustrative purposes only.)
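A sketch of that binning step; the cut-off values below are placeholders, since the actual boundaries are assigned per category as described above.

# Hypothetical cut-offs -- the real boundaries are set per goal category
# by the instructor with a Goal 2 committee representative.
CUTOFFS = [0, 40, 60, 80, 100]   # percent-score boundaries (placeholders)
LEVELS = [1, 2, 3, 4]            # proficiency levels

def proficiency_levels(percents: pd.Series) -> pd.Series:
    """Bin each student's percent score into a proficiency level."""
    return pd.cut(percents, bins=CUTOFFS, labels=LEVELS, include_lowest=True)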

2.2 Quantitative Problem Solving (20 questions)
(Data and graphs shown are for illustrative purposes only.)

2.2 Quantitative Problem Solving: Proficiency Levels 1-4 (20 questions)
(Data and graphs shown are for illustrative purposes only.)

Topics for Further Discussion
- Test robustness relative to the subjective judgments
- Discernment of proficiency-level cut-offs: instructor-level expectations vs. general university-level expectations
- Encourage more backwards design
- Effecting improvement
- Future considerations? Major, year, repeat status, math proficiency, drop rate, etc.

A Model of Instructor-Level Assessment: PH1110, Fall 2016
Mike Meyer, Director, William G. Jackson Center for Teaching and Learning
Friday, May 5, 2017

Class Description
- Introductory, first-term algebra-based physics course
- 56 students (one did not take the final exam)
- Survey of mechanics, sound, fluids, and thermodynamics

Plan: Final Exam
- Q1-Q8: multiple choice (definitions, units, etc.); drawn from pools, so variable
- Q9-Q12 and Q29: targeted subgoal questions; same questions for all students
- Q13-Q28: numerical problems, drawn from pools by topic with values varying by student; initially graded as 100% right or wrong by computer, then students meet to discuss their work and receive partial credit

Scientific Knowledge (Goal 2.1)
- Level 1: scored ≥ 50% (4/8) on the multiple-choice section of the final exam (Q1-Q8); 100% of students met this criterion
- Level 2: scored ≥ 75% (6/8) on the multiple-choice section of the final exam (Q1-Q8); 94% of students met this criterion

Problem Solving/Modeling (Goals 2.2 and 2.6)
- Level 1: scored ≥ 40% (160/400) on the problem portion of the exam after the partial-credit follow-up; 96% (53/55) of students met this criterion
- Level 2: scored ≥ 60% (240/400) on the problem portion after the partial-credit follow-up; 89% (49/55) of students met this criterion
- Level 3: scored ≥ 80% (320/400) on the problem portion after the partial-credit follow-up; 51% (28/55) of students met this criterion
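Each Level criterion on this and the surrounding slides follows the same pattern: the share of students scoring at or above a threshold on some exam portion. A minimal sketch (the function and variable names are illustrative):

def percent_meeting(scores, threshold_fraction, max_points):
    """Share of students at or above a threshold, e.g. Level 3 for
    Goals 2.2/2.6: percent_meeting(problem_scores, 0.80, 400)."""
    met = sum(1 for s in scores if s / max_points >= threshold_fraction)
    return 100 * met / len(scores)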

Problem #11 (2.3 and 2.5) Problem #12 (2.3 and 2.5)

Problem #29 (2.3)

Mathematical Representations (Goal 2.3)
- Level 2: scored ≥ 60% on the graphing problems (#11, #12, and #29); 51% (28/55) of students met this criterion
- Level 3: scored ≥ 80% on the graphing problems (#11, #12, and #29); 25% (14/55) of students met this criterion

Assumptions (Goal 2.4)
- Level 1: scored ≥ 50% on the final exam; 95% of students met this criterion
- Level 2: Level 1 plus at least one of the two assumption problems (#9 and #10 on the final exam) correct; 53% of students met this criterion

Data Analysis (Goal 2.5)
- Level 2: at least one of the two problems (#9, #10) correct; 98% of students met this criterion
- Level 3: both problems (#9, #10) correct; 51% of students met this criterion

Analysis/Summary
- The course seems to be accomplishing the Level 2 "Developing" goal for: 2.1 Scientific Knowledge, 2.2 Quantitative Problem Solving, 2.6 Models/Hypotheses
- Assessment might need work for: 2.4 Assumptions, 2.5 Data Analysis
- Content area that likely needs more focus: 2.3 Graphing

Instructor-Level Assessment Model
- Intentionally designed student work meets the needs of both the course/instructor and the assessment process
- Subjective decisions are made at the root level
- Results are readily available for reflection and action
- Opportunities for discussion need to be fostered

Contact Information
Mike Meyer, mrmeyer@mtu.edu, CTL: 487-3000
John Jaszczak, jaszczak@mtu.edu, 487-2255
Image credit: Johanne Bouchard, https://www.linkedin.com/pulse/20141125161326-18341-gratitude-and-our-ability-to-humbly-say-thank-you