Assessment in General Education: A Case Study in Scientific and Quantitative Reasoning. B.J. Miller & Donna L. Sundre, Center for Assessment and Research Studies, James Madison University. http://www.jmu.edu/assessment/
The First Habit: Be Proactive
A Proactive Assessment Model: provides for program improvement and fulfills accountability requirements.
Preview: the JMU assessment model (background and five stages); assessment in science and mathematics (methods, results, discussion).
James Madison University: a public university in Harrisonburg, VA; enrollment of 15,000 undergraduate and 700 graduate students; home of the Center for Assessment and Research Studies (CARS).
General Education at JMU:
Cluster 1: Skills for the 21st Century (9 credits): critical thinking, written and oral communication, information literacy
Cluster 2: Arts and Humanities (9 credits): culture, philosophy, fine arts and literature
Cluster 3: The Natural World (10 credits): problem-solving skills in science and mathematics
Cluster 4: Social and Cultural Processes (7 credits): global and American history, government, economics, anthropology
Cluster 5: Individuals in the Human Community (6 credits): wellness, psychology, and sociology
Example from Cluster Three (choose one):
MATH 103. The Nature of Mathematics
MATH 107. Fundamentals of Mathematics I
MATH 205. Introductory Calculus I
MATH 220. Elementary Statistics
MATH 231. Calculus with Functions I
MATH 235. Calculus I
GSCI 101. Physics, Chemistry, & the Human Experience
GSCI 102. Environment Earth
GSCI 103. Discovering Life
GSCI 104. Scientific Perspectives
Stages of the Assessment Process (a continuous cycle): establishing objectives, selecting/designing instruments, collecting data, analyzing data, using results.
Definition of Assessment: “…the systematic basis for making inferences about student learning and development” (Erwin, 1991, p. 28).
Establishing Objectives: expected/intended student outcomes. Good objectives are student-oriented, observable, measurable, and reasonable, and they drive the assessment process.
Examples from Cluster Three:
Use graphical, symbolic, and numerical methods to analyze, organize, and interpret natural phenomena.
Discriminate between association and causation, and identify the types of evidence used to establish causation.
Formulate hypotheses, identify relevant variables, and design experiments to test hypotheses.
Selecting/Designing Instruments. Selecting an instrument: look to other universities and to resources such as the MMY (Mental Measurements Yearbook) and TIP (Tests in Print). Considerations: Does it match my objectives? How much will it cost? How do I administer it? Are the scores reliable and valid?
Selecting/Designing Instruments. Designing an instrument: build a table of specifications, write an item pool, pilot test, conduct item analysis, revise items, pilot test again, and evaluate reliability and validity. Excerpt from the table of specifications:
Learning Objective | # of Items | % of Items
Use graphical, symbolic, and numerical methods to analyze, organize, and interpret natural phenomena. | 18 | 21%
Discriminate between association and causation, and identify the types of evidence used to establish causation. | 9 | 11%
Formulate hypotheses, identify relevant variables, and design experiments to test hypotheses. | 12 | 14%
… | … | …
Total test: 85 items
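A table of specifications is essentially bookkeeping: each objective gets a planned share of the item pool, and the shares must account for the whole test. A minimal sketch of that bookkeeping in Python; only the three objectives shown above come from the slides, so this blueprint is an illustrative subset, not the full 17-objective table:

```python
# Planned number of items per learning objective (subset from the slide).
blueprint = {
    "Graphical/symbolic/numerical methods": 18,
    "Association vs. causation": 9,
    "Hypotheses and experimental design": 12,
}
TOTAL_ITEMS = 85  # total test length from the slide

for objective, n_items in blueprint.items():
    pct = 100 * n_items / TOTAL_ITEMS
    print(f"{objective}: {n_items} items ({pct:.0f}%)")

# Sanity check: a complete blueprint's counts would sum to exactly 85;
# this subset should not exceed it.
assert sum(blueprint.values()) <= TOTAL_ITEMS
```

Running this reproduces the percentages in the excerpt (18/85 rounds to 21%, 9/85 to 11%, 12/85 to 14%).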
Collecting Data: Who? When? How? Why? Assessment Days at JMU: all students, August and February, paper-and-pencil and computer-based administration.
Assessment Day Data Collection Plan. Repeated measures: students in each cohort are tested twice on the same instrument, once as entering freshmen and again in the second semester of the sophomore year. [Timeline figure: Cohorts 1-3, Fall 2002 through Spring 2006.]
Analyzing Data: research questions. Are there group differences? Do scores change over time? Are there expected relationships? Are students meeting expectations?
Maintaining Data: organize and archive the data; trends emerge, which informs future assessment.
Using Results for accountability: allocating resources, enhancing reputation, compliance.
Using Results for program improvement: curriculum, teaching, sequence of course offerings.
Using Results benefits faculty, students, and the institution.
Assessment in Cluster Three
Measures: NAW-5. Created by Cluster Three faculty; a 50-item multiple-choice test covering 12 of 17 learning objectives; reliability = .75.
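The slides do not say how the .75 reliability was estimated; for a multiple-choice test like the NAW-5, an internal-consistency coefficient such as Cronbach's alpha is the usual choice. A minimal sketch, assuming item responses scored 0/1; the `responses` matrix here is simulated, not NAW-5 data:

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (examinees x items) matrix of item scores."""
    k = scores.shape[1]                         # number of items (50 for the NAW-5)
    item_vars = scores.var(axis=0, ddof=1)      # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of examinee total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 0/1 (incorrect/correct) responses: 316 examinees x 50 items,
# with item correlations induced by a shared latent ability.
rng = np.random.default_rng(0)
ability = rng.normal(size=(316, 1))
responses = (rng.random((316, 50)) < 1 / (1 + np.exp(-ability))).astype(float)
print(f"alpha = {cronbach_alpha(responses):.2f}")
```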
Measures: SOS (Student Opinion Scale), a measure of examinee motivation; a 10-item rating scale with two dimensions, effort and importance.
Participants: 746 freshmen in August 2001; 316 sophomores in February 2003.
Research Questions and Data Analysis. Do scores change over time? (paired-samples t-test). Does the number of courses impact scores? (ANOVA). Does motivation impact scores? (multiple regression).
Results, Question 1: Do NAW-5 scores change over time? Mean difference = 2.6, t(315) = 3.96, p < .001, d = .23.
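A minimal sketch of that analysis in Python. The score arrays are simulated stand-ins for the matched freshman/sophomore NAW-5 totals (the real data are not shown), and the slide does not say which Cohen's d variant was used; this sketch uses the difference-score standard deviation, one common choice for paired designs:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 316
freshman = rng.normal(30, 6, n)                # hypothetical pre-test totals
sophomore = freshman + rng.normal(2.6, 11, n)  # hypothetical post-test totals

t, p = stats.ttest_rel(sophomore, freshman)    # paired-samples t-test
diff = sophomore - freshman
d = diff.mean() / diff.std(ddof=1)             # Cohen's d from difference scores
print(f"mean diff = {diff.mean():.1f}, t({n - 1}) = {t:.2f}, p = {p:.3g}, d = {d:.2f}")
```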
Results, Question 2: Does the number of courses completed impact NAW-5 scores? F(4, 315) = 1.145, p = .335.
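A sketch of that comparison. F(4, 315) implies five groups, presumably students who had completed different numbers of Cluster Three courses; the grouping, group sizes, and scores below are all hypothetical:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Hypothetical sophomore NAW-5 totals grouped by number of Cluster Three
# courses completed; five groups, as implied by F(4, 315). Group sizes
# and score distributions are made up for illustration.
groups = [rng.normal(32, 6, size) for size in (40, 70, 90, 76, 40)]

F, p = stats.f_oneway(*groups)  # one-way ANOVA across the five groups
print(f"F = {F:.3f}, p = {p:.3f}")
```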
Results, Question 3: Does examinee motivation have an impact on NAW-5 scores? Pre-test scores explain 22% of the variance; adding motivation scores significantly improves predictive utility: R² change = .11, F change(2, 312) = 25.70, p < .001. Squared semi-partial correlations: pre-test = .24, effort = .07, importance = .01.
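A sketch of that hierarchical regression using statsmodels. The data frame is simulated (the real pre-test and SOS scores are not shown), but the structure matches the analysis described above: step 1 enters the pre-test, step 2 adds the two SOS subscales, and each predictor's squared semi-partial correlation is the drop in R² when that predictor alone is removed from the full model:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 316
df = pd.DataFrame({
    "pre": rng.normal(30, 6, n),         # hypothetical freshman NAW-5 totals
    "effort": rng.normal(20, 4, n),      # hypothetical SOS effort scores
    "importance": rng.normal(15, 4, n),  # hypothetical SOS importance scores
})
df["post"] = 0.5 * df.pre + 0.6 * df.effort + 0.1 * df.importance + rng.normal(0, 6, n)

def fit(cols):
    """OLS of sophomore scores on the given predictor columns."""
    return sm.OLS(df["post"], sm.add_constant(df[cols])).fit()

step1 = fit(["pre"])                          # step 1: pre-test only
step2 = fit(["pre", "effort", "importance"])  # step 2: add motivation

# R-squared change and its F test (2 predictors added, n - 4 error df = 312)
r2_change = step2.rsquared - step1.rsquared
f_change = (r2_change / 2) / ((1 - step2.rsquared) / (n - 4))
print(f"R2 change = {r2_change:.2f}, F change(2, {n - 4}) = {f_change:.2f}")

# Squared semi-partial: drop in R-squared when one predictor is removed
for drop in ["pre", "effort", "importance"]:
    reduced = fit([c for c in ["pre", "effort", "importance"] if c != drop])
    print(f"sr2({drop}) = {step2.rsquared - reduced.rsquared:.2f}")
```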
Discussion: scores increased from the freshman to the sophomore year; the number of courses completed had no impact; more effort was associated with higher NAW-5 scores. Using results: program improvement and accountability.
Using Results for Program Improvement: Cluster Three objectives and the NAW-8; better alignment of the curriculum with learning objectives; a test with full and balanced coverage and better reliability.
Using Results for Accountability: scientific and quantitative reasoning as an assessment of a core competency; a marketable instrument.
Conclusion: Proactive assessment works for JMU, as the Cluster Three case illustrates:
1. Establish objectives
2. Design instruments
3. Collect data
4. Analyze data
5. Use results
Final Thoughts: assessment is stimulating, meaningful, challenging, and FUN! If you're not having fun, you're doing it wrong!