Assessment in General Education: A Case Study in Scientific and Quantitative Reasoning B.J. Miller & Donna L. Sundre Center for Assessment and Research Studies James Madison University

The First Habit: Be Proactive

A Proactive Assessment Model
Provides for program improvement
Fulfills accountability requirements

Preview
JMU assessment model: background, five stages
Assessment in Science & Math: methods, results, discussion

James Madison University
Public university in Harrisonburg, VA
Enrollment: 15,000 undergraduate, 700 graduate
Center for Assessment and Research Studies (CARS)

General Education at JMU
Cluster 1: Skills for the 21st Century (9 credits). Critical Thinking, Written and Oral Communication, Information Literacy
Cluster 2: Arts and Humanities (9 credits). Culture, Philosophy, Fine Arts and Literature
Cluster 3: The Natural World (10 credits). Problem-solving Skills in Science and Mathematics
Cluster 4: Social and Cultural Processes (7 credits). Global and American: History, Government, Economics, Anthropology
Cluster 5: Individuals in the Human Community (6 credits). Wellness, Psychology, and Sociology

Example from Cluster Three (choose one)
MATH 103. The Nature of Mathematics
MATH 107. Fundamentals of Mathematics I
MATH 205. Introductory Calculus I
MATH 220. Elementary Statistics
MATH 231. Calculus with Functions I
MATH 235. Calculus I
GSCI 101. Physics, Chemistry, & the Human Experience
GSCI 102. Environment Earth
GSCI 103. Discovering Life
GSCI 104. Scientific Perspectives

Stages of the Assessment Process (a continuous cycle)
1. Establishing objectives
2. Selecting/designing instruments
3. Collecting data
4. Analyzing data
5. Using results

Definition of Assessment “…the systematic basis for making inferences about student learning and development” (Erwin, 1991, p. 28).

Establishing Objectives
Expected/intended student outcomes
Good objectives are student-oriented, observable, measurable, and reasonable
Objectives drive the assessment process

Examples from Cluster Three
Use graphical, symbolic, and numerical methods to analyze, organize, and interpret natural phenomena.
Discriminate between association and causation, and identify the types of evidence used to establish causation.
Formulate hypotheses, identify relevant variables, and design experiments to test hypotheses.

Selecting/Designing Instruments
Selecting an instrument: look to other universities and to resources such as the MMY (Mental Measurements Yearbook) and TIP (Tests in Print)
Considerations: Does it match my objectives? How much will it cost? How do I administer it? Are the scores reliable and valid?

Selecting/Designing Instruments
Designing an instrument: table of specifications, item pool, pilot test, item analysis (see the sketch after this slide), item revision, second pilot test, reliability and validity evidence
Table of specifications (learning objective, number and % of items):
Use graphical, symbolic, and numerical methods to analyze, organize, and interpret natural phenomena: %
Discriminate between association and causation, and identify the types of evidence used to establish causation: 9 items (11%)
Formulate hypotheses, identify relevant variables, and design experiments to test hypotheses: %
Total test: 85 items
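
A minimal sketch (not part of the original slides) of the item-analysis step named above: item difficulty (proportion correct) and corrected item-total discrimination computed from pilot-test data. The function name and the 0/1 scoring are illustrative assumptions.

```python
import numpy as np

def item_analysis(scored: np.ndarray):
    """scored: examinees x items matrix of 0/1 pilot-test item scores."""
    difficulty = scored.mean(axis=0)                 # proportion correct per item
    total = scored.sum(axis=1)
    discrimination = np.empty(scored.shape[1])
    for j in range(scored.shape[1]):
        rest = total - scored[:, j]                  # total score excluding item j
        discrimination[j] = np.corrcoef(scored[:, j], rest)[0, 1]
    return difficulty, discrimination

# Items with extreme difficulty or near-zero discrimination are candidates
# for revision before the second pilot test.
```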

Collecting Data
Who? When? How? Why?
Assessment Days at JMU: all students, in August and February, paper-and-pencil and computer-based

Assessment Day Data Collection Plan
Repeated measures: students in each cohort are tested twice on the same instrument, once as entering freshmen and again in the second semester of the sophomore year.
Three cohorts span Fall 2002 through Spring 2006.

Analyzing Data
Research questions: Are there group differences? Do scores change over time? Are there expected relationships? Are students meeting expectations?

Maintaining Data
Organize and archive results
Trends emerge and inform future assessment

Using Results: Accountability
Allocating resources, enhancing reputation, compliance

Using Results: Program Improvement
Curriculum, teaching, sequence of course offerings

Using Results: Benefits
For faculty, students, and the institution

Assessment in Cluster Three

Measures: NAW-5
Created by Cluster Three faculty
50-item multiple-choice test
Covers 12 of 17 learning objectives
Reliability = .75
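
The slide reports a reliability of .75 but does not name the coefficient; coefficient (Cronbach's) alpha is a common choice for a multiple-choice test. A minimal sketch under that assumption, with an illustrative 0/1-scored item matrix:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: examinees x items matrix of scored (e.g., 0/1) responses."""
    k = items.shape[1]                          # number of items (50 for the NAW-5)
    item_var = items.var(axis=0, ddof=1).sum()  # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1 - item_var / total_var)
```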

Measures: SOS (Student Opinion Scale)
Measures examinee motivation
10-item rating scale
Two dimensions: effort and importance

Participants
August 2001: 746 freshmen
February 2003: 316 sophomores

Research Questions and Data Analysis
Do scores change over time? Paired-samples t-test
Does the number of courses impact scores? ANOVA
Does motivation impact scores? Multiple regression

Results: Question 1
Do NAW-5 scores change over time?
Mean difference = 2.6, t(315) = 3.96, p < .001, d = .23
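
A minimal sketch (not from the original slides) of the paired-samples analysis reported above, assuming arrays of freshman and sophomore NAW-5 totals for the same 316 students; variable names are illustrative, and several effect-size formulas exist for repeated measures, so the d shown here (based on the SD of the difference scores) is one common choice rather than necessarily the one the authors used.

```python
import numpy as np
from scipy import stats

def paired_change(pre: np.ndarray, post: np.ndarray):
    """Paired-samples t-test plus an effect size for the pre-post gain."""
    diff = post - pre
    t, p = stats.ttest_rel(post, pre)        # paired-samples t-test
    d = diff.mean() / diff.std(ddof=1)       # d based on the SD of difference scores
    return diff.mean(), t, p, d
```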

Results: Question 2
Does the number of courses completed impact NAW-5 scores?
F(4, 315) = 1.145, p = .335
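
A minimal sketch of the one-way ANOVA reported above, assuming a table with one row per sophomore, an NAW-5 score column, and a column for the number of Cluster Three courses completed; the column names are illustrative assumptions.

```python
import pandas as pd
from scipy import stats

def courses_anova(df: pd.DataFrame):
    """One-way ANOVA of NAW-5 scores across course-count groups."""
    groups = [g["naw5"].to_numpy() for _, g in df.groupby("n_courses")]
    return stats.f_oneway(*groups)           # returns the F statistic and p value
```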

Results: Question 3
Does examinee motivation have an impact on NAW-5 scores?
Pre-test scores explain 22% of the variance
Adding motivation scores significantly improves predictive utility: R² change = .11, F change(2, 312) = 25.70, p < .001
Squared semi-partial correlations: pre-test = .24, effort = .07, importance = .01
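
A minimal sketch of the hierarchical regression reported above, using statsmodels: pre-test scores enter first, then the two motivation scores; the R² change is tested with an F-change test, and a squared semi-partial correlation is the drop in R² when one predictor is removed from the full model. The column names (posttest, pretest, effort, importance) are illustrative assumptions.

```python
import pandas as pd
import statsmodels.formula.api as smf

def motivation_regression(df: pd.DataFrame):
    """Hierarchical regression: pre-test in step 1, motivation scores in step 2."""
    step1 = smf.ols("posttest ~ pretest", data=df).fit()
    step2 = smf.ols("posttest ~ pretest + effort + importance", data=df).fit()
    r2_change = step2.rsquared - step1.rsquared
    f_change, p_change, _ = step2.compare_f_test(step1)   # F test for the increment
    return r2_change, f_change, p_change

def squared_semipartial(df: pd.DataFrame, predictor: str) -> float:
    """Drop in R-squared when one predictor is removed from the full model."""
    preds = ["pretest", "effort", "importance"]
    full = smf.ols("posttest ~ " + " + ".join(preds), data=df).fit()
    reduced = smf.ols("posttest ~ " + " + ".join(p for p in preds if p != predictor),
                      data=df).fit()
    return full.rsquared - reduced.rsquared
```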

Discussion
Scores increased from the freshman to the sophomore year
Number of courses completed had no impact
More effort was associated with higher NAW-5 scores
Using results for program improvement and accountability

Using Results for Program Improvement
Cluster Three learning objectives: better alignment of the curriculum with the objectives
NAW-8: a test with full and balanced coverage and better reliability

Using Results for Accountability
Scientific and Quantitative Reasoning: assessment of a core competency, a marketable instrument

Conclusion
Proactive assessment works for JMU:
1. Establish objectives
2. Design instruments
3. Collect data
4. Analyze data
5. Use results
Assessment in Cluster Three illustrates the full cycle

Final Thoughts
Assessment can be stimulating, meaningful, challenging, and FUN!
If you're not having fun, you're doing it wrong!