The James Madison University Story. Donna L. Sundre, Professor of Graduate Psychology; Executive Director, Center for Assessment and Research Studies, James Madison University.

The James Madison University Story. Donna L. Sundre, Professor of Graduate Psychology; Executive Director, Center for Assessment and Research Studies, James Madison University, Harrisonburg, Virginia.

Quantitative and Scientific Reasoning Assessment
- 1997: New General Education program
- We piloted our first version of the test in 1996
- A revolution in assessment thinking
- 2006: Our NSF grant was funded
- 2007: Our 9th version of the tests

How Do We Get People to Play? We Want to Work Smarter, not Harder!
- We have an infrastructure for assessment
  - Center for Assessment and Research Studies
  - Uses an assessment liaison system
- Dean of Undergraduate Studies
  - Oversees General Education, Advising, and other offices
- Five Cluster Areas of Study
  - Cluster Steering Committees

These Are Working Committees
- Chaired by the Cluster Coordinator
- Coordinators have release time
- Representatives from every department with a course that counts toward General Education completion
- A CARS assessment liaison is assigned to each committee
- Committee work is recognized as service to the department and the university
- Quality assessment is a form of scholarship

Inferences About General Education
- It can take years to define meaningful learning objectives
- Most institutions have not really completed this step successfully
- It is very difficult to write good items:
  - Items cannot privilege one course over another
  - No factoids or trivial-pursuit items
  - Item writing takes you back to the construct again and again

We Modified Our Objectives
- At one time, we had 17 QR/SR objectives!
- We finally reduced them to 8; content alignment exercises helped us
- QR now has 2 objectives
- We still cannot make inferences at the objective level; that would need higher reliability estimates
- We feel pretty confident talking about QR and SR now

Construct Test

Student Learning Objectives
- This is the engine that drives assessment
- QR Learning Objectives:
  1. Use graphical, symbolic, and numerical methods to analyze, organize, and interpret natural phenomena. (21 items)
  2. Discriminate between association and causation, and identify the types of evidence used to establish causation. (10 items)
- The QR test has 25 items (a few serve both objectives)

Compelling Student Data  Since Fall 2000, we have tested over  8,000 Entering 1 st -Year Students  7,000 Sophomore and Junior Students  We have improved our test 5 times  Version 6—now on Version 9  Reliability has improved:  QR: from.50 to.66  SR: from.54 to.74  We user fewer but better items

Compelling Student Data  Our faculty are getting better at writing items  We are guided by the work of George Cobb at Mount Holyoke on how to write items  Our faculty write better items outside of their teaching area  That’s when they are using General Education  Our students like this test

Compelling Student Data  Questions We Can Answer:  Do Grades Matter?  Does Taking More Classes Improve Test Scores?  Do AP Courses Predict Performance?  Do JMU Courses Predict Performance?  Do Transfer Credits Predict Performance?  Do Students Change Over Time?  Do Students Perform at the Level Our Faculty Would Like to See?

Mapping Our Items to Objectives
- Our test items map to the objectives of other institutions:
  - Truman State University*: 100%
  - Michigan State University*: 98%
  - Virginia State University*: 97%
  - St. Mary's University (TX)*: 92%
  - Virginia Tech: 84%
  - Virginia Community College System: 78%
- *Our NSF grant helped us to advance assessment of QR and SR with these 4 partner institutions
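Alignment percentages like those above come from a content-mapping exercise: judges record which of a partner institution's objectives each item addresses, then report the share of items that map to at least one. A sketch of that tally, with invented item and objective names:

```python
# Hypothetical alignment table: for each of our items, the set of a
# partner institution's objectives it was judged to address.
alignment = {
    "QR-01": {"Partner-Obj1"},
    "QR-02": {"Partner-Obj1", "Partner-Obj3"},
    "QR-03": set(),            # judged not to map to any partner objective
    "QR-04": {"Partner-Obj2"},
}

def percent_mapped(alignment):
    """Share of items that map to at least one partner objective."""
    mapped = sum(1 for objectives in alignment.values() if objectives)
    return 100 * mapped / len(alignment)

print(f"{percent_mapped(alignment):.0f}%")  # 3 of 4 items → 75%
```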

Other Institutions Like to Use Our Tests
- Other Scientific Reasoning Test users:
  - Georgia Tech
  - Liberty University
  - Radford University
  - University of Mary Washington
  - University of Miami
  - Virginia Tech

Plans for the Future
- Work on using faculty expectations for student performance
- Develop a system, using eCampus, for providing students with feedback on their performance:
  - Norm-referenced interpretations
  - Criterion-referenced interpretations
- Conduct our Self-Study this year
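The two interpretation frameworks above differ in the comparison they make: norm-referenced feedback locates a student within a comparison group, while criterion-referenced feedback compares the score to a faculty-set cut score. A small illustration with hypothetical scores and a hypothetical cut:

```python
def percentile_rank(score, norm_group):
    """Norm-referenced: percent of a comparison group scoring below `score`."""
    below = sum(s < score for s in norm_group)
    return 100 * below / len(norm_group)

def meets_standard(score, cut_score):
    """Criterion-referenced: compare a score to a faculty-set standard."""
    return score >= cut_score

norm_group = [12, 14, 15, 17, 18, 19, 20, 21, 22, 24]  # hypothetical QR scores
print(percentile_rank(19, norm_group))   # 5 of 10 score below 19 -> 50.0
print(meets_standard(19, cut_score=18))  # True under this hypothetical cut
```

The same raw score can thus yield two different messages to a student, which is why the slide lists both interpretations as separate deliverables.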

Questions?
Contact information:
Donna L. Sundre
Center for Assessment and Research Studies
James Madison University