
Student Learning Outcomes: Interpretations, Validity, and Factor Development
Krista Soria and Laura Gorny
This project was funded by the Undergraduate Research Opportunity Program (UROP) at UMNTC

Overview
–Two separate research projects funded by the Undergraduate Research Opportunity Program (UROP) at the University of Minnesota
–Two separate lists of survey items: student learning outcomes (SLOs), institutionally defined, and general development supporting SLOs
–Factor analysis for web reporting

Self-Reported Surveys
Several challenges exist with students' self-reports of their learning:
–Self-reports are subjective and encourage responses based on social desirability
–Students interpret survey items differently
–Survey items can be ambiguous, leading to different responses
–Students' self-reports have been deemed unreliable (Bowman, 2009; Porter, 2011)

Research Questions
1. How do college students interpret SERU survey items assessing their development of key learning outcomes?
2. What evidence do college students provide to support their understanding of their development?

Validity
One type of validity evidence (NSSE's Psychometric Portfolio): response process
–Are items consistently interpreted and understood by respondents the way the researchers intended?
–Cognitive interviews and focus groups
NSSE Source:

SERU Items Measuring Development
Please rate your proficiency in the following areas when you started at this institution and now:
1. Analytical and critical thinking skills
2. Ability to be clear and effective when writing
3. Ability to read and comprehend academic material
4. Understanding a specific field of study
5. Ability to understand international perspectives (economic, political, social, cultural)
6. Ability to appreciate, tolerate, and understand racial and ethnic diversity
7. Ability to appreciate cultural and global diversity

SERU Items Measuring SLOs
To what extent do you feel that your experiences at this campus have contributed to your learning and development in the following areas?
1. Have mastered a body of knowledge and mode of inquiry
2. Can locate, define, and solve problems
3. Can communicate effectively
4. Can locate and critically evaluate information
5. Have acquired the skills for effective citizenship and lifelong learning
6. Understand the role of creativity, innovation, discovery, and expression across disciplines
7. Understand diverse philosophies and cultures within and across societies

Methods
Two separate qualitative studies:
–Cognitive interviews asking students to interpret items related to their development in several areas
–Interviews asking students to interpret seven student learning outcome items

Interviewing
Sample: third-year students who responded to the learning and development items
–14 items were placed into a factor analysis, and four factors emerged
–Students recruited had factor scores +/- 1 SD from the mean
–Interviews lasted approximately 40 minutes
–Open-ended, traditional cognitive interview
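The recruitment rule above (students whose factor scores fall more than one standard deviation from the mean) can be sketched as follows. This is a minimal illustration with simulated responses: the slide does not list the actual four-factor structure or item loadings, so the item-to-factor groupings below are invented, and unit-weighted scale means stand in for true factor scores.

```python
import numpy as np

rng = np.random.default_rng(42)
# Simulated 6-point responses: 500 students x 14 development items
responses = rng.integers(1, 7, size=(500, 14)).astype(float)

# Hypothetical item-to-factor assignment (invented for illustration;
# the presentation does not publish the real loadings)
factors = {
    "factor_1": [0, 1, 2],
    "factor_2": [3, 4, 5, 6],
    "factor_3": [7, 8, 9],
    "factor_4": [10, 11, 12, 13],
}

recruits = {}
for name, items in factors.items():
    score = responses[:, items].mean(axis=1)        # unit-weighted proxy score
    z = (score - score.mean()) / score.std()        # standardize to mean 0, SD 1
    recruits[name] = np.flatnonzero(np.abs(z) > 1)  # +/- 1 SD recruitment rule
```

Sampling at the tails of each factor, rather than near the mean, maximizes the chance that interviewees with unusually high or low self-reported development explain how they read the items.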

Cognitive Interviewing
PART ONE (RQ1): Open-ended, non-leading probes
–What are you thinking now?
–What leads you to say that?
–Could you say more?
PART TWO (RQ2): More direct, yet non-leading probes
–When you rated yourself good and then very good, what were you thinking?
–How did that [experience] come about?

Results: Cognitive Interviewing
Tendency to avoid extremes
–Modesty
–Human nature

Results: Cognitive Interviewing
Notice "excellent"

Results: Cognitive Interviewing
Making sense of items
–Subjective terminology
–Complicated questions: "How can I answer this?"
–Loaded words

Results: Interviews
Wide spectrum of responses along a theme: "Have mastered a body of knowledge and a mode of inquiry"
–Liberal education, general education requirements
–Mastering every single class
–Being completely confident in a specific academic major

Results: Interviews
Some consistency in broad ideas using different terminology, but often missing alternative ideas:
–"Being able to speak clearly to people"
–"I think it is like public speaking"
–"To be able to talk to other students or people you're working with…"
–"Interact with another person and get your ideas across so you can tell each other about your results…"

Results: Interviews
Long pauses and "I don't knows"
–"What does effective citizenship even mean?"
–"Maybe I should actually take the time to read survey items!"
Consistently referencing the same limited framework
–Solve problems: "made me think of math and science"
–Critically evaluate info: "makes me think of graphs or charts you see in math or science"
–Lifelong learning: "developing critical thinking skills but not by taking another math class"

Recommendations
–"As a result of your experiences at [University X]…"
–"When you started"… your first semester at this institution
–"Current ability"… as of this current semester
–Frame questions based on where students might have learned skills (e.g., from coursework, did you learn…)
–Avoid double-barreled questions

Recommendations
–In the case of SLOs, increase visibility across campus to enhance familiarity
–Consider offering these items on a separate survey so that students can focus on answering questions related to their learning/development

Recommendations
–Offer examples of items (e.g., "understand the importance of research to advance society" describes "understand the role of discovery")
–Offer several items to describe one idea (e.g., "participate in the electoral process" and "engage in community service" to address "effective citizenship")
–Pilot items and interview students about the items before administration

Factor Analysis for SLOs
Combined core and wildcard items to create the seven student learning outcome measures
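Combining two item pools into seven reportable SLO measures can be sketched as below. This is a hedged illustration with simulated data: the counts of core and wildcard items and the item-to-SLO mapping are assumptions for the sketch, since the slide does not specify which SERU items feed each outcome.

```python
import numpy as np

rng = np.random.default_rng(7)
# Simulated responses: 5 core items plus 9 wildcard-module items (assumed counts)
core = rng.integers(1, 7, size=(300, 5)).astype(float)
wildcard = rng.integers(1, 7, size=(300, 9)).astype(float)
items = np.hstack([core, wildcard])  # columns 0-4 core, 5-13 wildcard

# Hypothetical mapping of the combined items onto the seven SLO measures
slo_items = {
    "body_of_knowledge": [0, 5],
    "solve_problems":    [1, 6],
    "communicate":       [2, 7],
    "evaluate_info":     [3, 8, 9],
    "citizenship":       [4, 10],
    "creativity":        [11, 12],
    "diverse_cultures":  [13],
}

# One composite score per SLO per student, suitable for web reporting
slo_scores = {name: items[:, idx].mean(axis=1) for name, idx in slo_items.items()}
```

Averaging across both pools lets a campus-specific "wildcard" item sharpen a core SERU measure without changing the seven-outcome reporting structure.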

Thank you! Any questions?