Sally K. Murphy, Director of General Education and Freshman Year Programs, California State University, East Bay; Julie Stein, Instructor, General Studies.

Presentation transcript:

Sally K. Murphy, Director of General Education and Freshman Year Programs, California State University, East Bay; Julie Stein, Instructor, General Studies and General Education Advisor

Agenda
Workshop Context: High Impact Practices and Your Goal
The Story of Freshmen General Education Learning Communities
13 Years of Learning Community Indirect Assessment Exercise
Lessons Learned and the Critical Importance of Data
Some Assessment Data Sources

High Impact Educational Practices
Worksheet columns: High Impact Practice | Relative Cost of Implementation (High/Med/Low) | Value to Your Institution (High/Med/Low) | Possible Assessment Data
Practices to rate:
First Year Seminars & Experiences
Common Intellectual Experiences
Learning Communities
Writing Intensive Courses
Collaborative Assignments and Projects
Undergraduate Research
Diversity/Global Learning
Service Learning, Community-Based Learning
Internships
Capstone Courses and Projects
*Peer Mentor (not listed in AAC&U, 2008)

Freshmen General Education Learning Communities
Prior to 1998, a WASC report recommended that CSUEB make G.E. more coherent, increase retention, and provide increased academic support to freshmen, such as more lower-division G.E. courses.
The response: general education learning communities, or clusters, in which freshmen in cohorts of about 90 students take GE classes throughout the year that are integrated by a common theme.

Assessing General Education Learning Communities
Standardized assessments
Authentic assessments

Freshman Learning Community Impact of Data Exercise
The goal of this exercise is to see how survey data can tell a story about the impact of a program or practice.
Data Set: Perceived Needs
Data Set: Retention and Remediation
Data Set: Perceived Improvement

Freshman Learning Community Impact of Data Exercise
1. Open Folder #1. In your group, analyze the data for the Freshmen Learning Community program. What conclusions could you draw about the program with the data provided?

Freshman Learning Community Impact of Data Exercise
1. Open Folder #1. In your group, analyze the data for the Freshmen Learning Community program. What conclusions could you draw about the program with the data provided?
2. Next, open Folder #2 and overlay the transparency on the data chart. What other conclusions could you draw about the program now?
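If the same exercise is done with spreadsheets rather than transparencies, the "overlay" step is simply a join on the cohort year. The sketch below is a minimal illustration using pandas; the file names and column names are hypothetical placeholders, not the actual workshop data sets.

```python
# Minimal sketch of the overlay step, assuming two hypothetical CSV exports
# from institutional research (file and column names are invented here).
import pandas as pd

# Data Set: Perceived Needs (e.g., % of freshmen reporting they need writing support)
needs = pd.read_csv("perceived_needs.csv")             # columns: cohort_year, pct_need_writing_help

# Data Set: Retention and Remediation (e.g., one-year retention rate)
retention = pd.read_csv("retention_remediation.csv")   # columns: cohort_year, retention_rate

# "Overlay" the two charts by joining on cohort year, so each row shows
# perceived needs next to the retention outcome for the same cohort.
overlay = needs.merge(retention, on="cohort_year", how="inner")

# Inspect how the two measures move together across cohorts.
print(overlay.sort_values("cohort_year"))
print(overlay[["pct_need_writing_help", "retention_rate"]].corr())
```

The point is the same as in the folder exercise: one data set alone suggests a story, and the conclusions change once a second data set is laid over it.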

Additional Data Still Needed: Where to Deepen Learning / Provide Support?
Data tells some of the story.
We still need a finer-grained picture to measure learning directly: for example, a portfolio assessment with 2nd-4th year writing and other academic markers to identify development over time.
Other relevant data (class size, socioeconomic status).
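As a rough illustration of what "development over time" could look like once portfolio rubric scores exist, the sketch below computes each student's change in writing score between their earliest and latest scored year. The data file, columns, and rubric scale are all assumptions for the sake of the example.

```python
# Hypothetical longitudinal portfolio data: one row per student per scored year.
import pandas as pd

scores = pd.read_csv("portfolio_rubric_scores.csv")
# assumed columns: student_id, year_in_program (2, 3, or 4), writing_rubric_score (1-4 scale)

# Earliest and latest scored year for each student, then the change between them.
ordered = scores.sort_values("year_in_program")
first = ordered.groupby("student_id").first()
last = ordered.groupby("student_id").last()
growth = (last["writing_rubric_score"] - first["writing_rubric_score"]).rename("rubric_growth")

print(growth.describe())     # how much development over time, on average?
print((growth > 0).mean())   # share of students whose scores improved
```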

Lessons Learned
Start now.
Ideally, collect direct measures.
Your program is unlikely to receive long-term support unless its success is measured.
No matter what your program is, you need to collect, analyze, and compare over time whatever meaningful data is available.
Examine interrelationships between data sets.
Avoid the danger of analyzing limited data.
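A minimal sketch of "compare over time" with whatever data is available: year-by-year retention for learning-community participants versus non-participants. The file name and column layout are assumptions, not CSUEB's actual records.

```python
# Hypothetical student-level cohort file; adapt to whatever your campus can export.
import pandas as pd

students = pd.read_csv("freshman_cohorts.csv")
# assumed columns: cohort_year, in_learning_community (True/False), retained_to_year_2 (True/False)

# Retention rate by cohort year and participation, side by side.
trend = (students
         .groupby(["cohort_year", "in_learning_community"])["retained_to_year_2"]
         .mean()
         .unstack("in_learning_community"))

print(trend)         # compare participants and non-participants across years
print(trend.diff())  # how each group's rate changes year over year
```

Even a simple table like this, maintained every year, gives the longitudinal comparison that ad hoc, single-year analyses cannot.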

Some Assessment Data Sources
GENERAL DATA
Institutional Self Reporting
Qualitative Authentic Assessment
STANDARDIZED ASSESSMENT INSTRUMENTS
Entering Student Survey (ESS)
The College Student Experiences Questionnaire (CSEQ)
National Survey of Student Engagement (NSSE)
Faculty Survey of Student Engagement (FSSE)
College Assessment of Academic Proficiency (CAAP)
Collegiate Learning Assessment (CLA)

High Impact Educational Practices
Worksheet columns: High Impact Practice | Relative Cost of Implementation (High/Med/Low) | Value to Your Institution (High/Med/Low) | Possible Assessment Data
Practices to rate:
First Year Seminars & Experiences
Common Intellectual Experiences
Learning Communities
Writing Intensive Courses
Collaborative Assignments and Projects
Undergraduate Research
Diversity/Global Learning
Service Learning, Community-Based Learning
Internships
Capstone Courses and Projects
*Peer Mentor (not listed in AAC&U, 2008)

Sally K. Murphy, Director of General Education and Freshman Year Programs; Julie Stein, Instructor, General Studies