An Empirical Model of Culture of Assessment in Student Affairs


An Empirical Model of Culture of Assessment in Student Affairs Matthew B. Fuller, Ph.D. and Forrest C. Lane, Ph.D. Sam Houston State University Tuesday, March 14, 2017 Marriott RiverCenter, Salon A

Background and Literature Assessment has emerged as a defining facet of higher education. Many institutions devote significant resources to assessment and accreditation activities. Approximately $660 million is spent on direct costs associated with assessment and accreditation (Woolston, 2012).

Background and Literature Assessment in student affairs is a relatively new topic in the field. Much of the prior scholarship on student affairs assessment has focused on applications of assessment techniques. There is very little research on assessment culture, and few theories or models of assessment culture have emerged because very little empirical evidence exists.

Literature Recent publications, pressures, and opportunities have heightened the need to examine cultures of assessment in student affairs (Bingham & Bureau, 2015; Henning & Roberts, 2016; Schuh, Biddix, Dean, & Kinzie, 2016). As higher education faces increased scrutiny to demonstrate its worth, so does student affairs. Many in student affairs are driven by a quest to improve their practices and services as well as an ethical commitment to students.

Prior Studies Factors of assessment culture based on attitudes of administrators (Fuller & Skidmore, 2014). Prior factors from the Assessment and IR Administrators’ Study: Clear Commitment, Connection to Change, Vital to Institution, and Faculty Perceptions.

Prior Studies Factors of assessment culture based on attitudes of faculty (Fuller et al., 2016). Prior factors from the Faculty Study: Faculty Perceptions, Use of Data, Sharing, Compliance or Fear Motivators, and Normative Purposes for Assessment.

Our Work Examines assessment culture in the context of student affairs. Draws upon the Student Affairs Survey of Assessment Culture. Uses exploratory factor analysis to examine a model. Prior studies with other samples (assessment administrators, faculty) offer an opportunity to compare groups.

Instrument The Student Affairs Survey of Assessment Culture: forty-eight items, five factors (Fuller et al., 2016): Faculty Perceptions, Use of Data, Sharing, Compliance or Fear Motivators, and Normative Purposes for Assessment. Internal consistency (α) of the factors is reported to be above 0.70 (Fuller & Skidmore, 2014; Fuller et al., 2016).

Sample Contacted institutions’ Divisions of Student Affairs. Ensured at least 1 participating institution in each cell of a stratification matrix with 3 levels (institutional size, region, Primarily Associate’s/Primarily Bachelor’s). Invited mid-manager or higher level practitioners. 1,624 practitioners from 59 institutions were invited to participate in the study; 771 practitioners responded to the survey (47.5% response rate).

Methods Factors were extracted using principal components analysis (PCA). Components were obliquely rotated using the Promax criterion. Multiple criteria were used for component retention: the Kaiser-Guttman rule (i.e., eigenvalues > 1), the scree test, and parallel analysis. Both the factor pattern matrix and the structure matrix were considered in the interpretation of components (Henson & Roberts, 2006).
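As a sketch of the retention criteria described above, the following Python/numpy snippet applies both the Kaiser-Guttman rule and parallel analysis. It uses randomly generated stand-in data of the study's dimensions (771 respondents x 48 items), since the survey responses themselves are not public.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the 48-item survey responses
# (771 respondents x 48 items); the real data are not public.
X = rng.normal(size=(771, 48))

def eigenvalues(data):
    """Eigenvalues of the item correlation matrix, sorted descending
    (PCA on standardized items)."""
    corr = np.corrcoef(data, rowvar=False)
    return np.sort(np.linalg.eigvalsh(corr))[::-1]

obs = eigenvalues(X)

# Kaiser-Guttman rule: retain components with eigenvalue > 1.
kaiser_k = int(np.sum(obs > 1))

# Parallel analysis: retain components, from the top down, whose
# eigenvalues exceed the 95th percentile of eigenvalues obtained
# from random data of the same shape.
n_sims = 100
sim = np.array([eigenvalues(rng.normal(size=X.shape)) for _ in range(n_sims)])
threshold = np.percentile(sim, 95, axis=0)

pa_k = 0
for observed, cutoff in zip(obs, threshold):
    if observed > cutoff:
        pa_k += 1
    else:
        break

print(kaiser_k, pa_k)
```

With purely random data, the Kaiser-Guttman rule retains many spurious components while parallel analysis retains almost none, which mirrors why the authors' parallel analysis pointed to a more parsimonious model than the initial extraction.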

Results The initial model included 9 components explaining 62% of the variance. Parallel analysis and the scree plot suggested a more parsimonious model. The final model included 4 components explaining 55% of the variance across items. Reliability coefficients ranged from 0.65 to 0.78.
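The reliability coefficients reported above are presumably Cronbach's alpha, the usual internal-consistency measure for survey scales. A minimal numpy implementation, checked against a toy, hypothetical data matrix, might look like:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x k_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the scale total
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Toy check: three perfectly correlated items yield alpha = 1.0.
perfect = np.array([[1, 1, 1],
                    [2, 2, 2],
                    [3, 3, 3],
                    [4, 4, 4]])
print(round(cronbach_alpha(perfect), 3))  # prints 1.0
```

In practice, alpha would be computed separately for the items loading on each of the four retained components, giving the 0.65 to 0.78 range reported.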

Factors Clear Commitment to Assessment: “Upper Student Affairs Administrators have made clear their expectations regarding assessment.” Assessment Communication: “Communication of assessment results has been effective.” Connection to Change: “Change occurs more readily when supported by assessment results.” Fear of Assessment: “Assessment results are used to scare student affairs staff into compliance with what the administration wants.”

Discussion Assessment culture may be viewed differently among student affairs practitioners. There is a need to refit the data to all three groups. Practices, such as communication strategies or planned use of data, are associated with each factor. Professional development is needed for senior and mid-level student affairs leaders. Consider revisions to the instrument.

Implications for Practice Reflective practice opportunities (see handout). Engage student affairs leaders in their reasons for education, growth, and development. Clearly depict how assessment supports those reasons. Clarify roles and responsibilities. Advise strong yet responsible methods.

Implications for Practice for SAA Leaders Surveys can serve as “pulse checks” or “read outs” of your division’s culture of assessment across time or across planned developments. Not a comparative tool, because we believe culture should not be compared. Items can help you lay out a plan for improvements in your division.

Implications for Professional Development (PD) Where are your division’s staff and leaders on the technical-cultural scale? Match their needs with sessions, communications, and success stories. Who is the best person to deliver X, Y, or Z topics in PD sessions? Remember the “peopled” side of assessment: staff and leaders often need tactics for interacting with a variety of people in the “language of assessment” or research. Who is their “assessment shepherd”? Do you have one? (Fuller & Letizia, 2015). This work is not “one off” and is “highly relational.”

References
Bingham, R. P., & Bureau, D. (2015). Tenet one: Understand the "why" of assessment. In R. P. Bingham, D. Bureau, & A. G. Duncan (Eds.), Leading assessment for student success: Ten tenets that change culture and practice in student affairs (pp. 9-21). Sterling, VA: Stylus Publishing, LLC.
Fuller, M. B., & Letizia, A. (2015). Examining inequalities in faculty and administrators' perceptions of assessment as servant leadership. Presented Nov. 7, 2015 at the Association for the Study of Higher Education, Washington, D.C.
Fuller, M. B., & Skidmore, S. (2014). An exploration of factors influencing institutional cultures of assessment. International Journal of Educational Research, 65(1), 9-21.
Fuller, M., Skidmore, S., Bustamante, R., & Holzweiss, P. (2016). Empirically exploring higher education cultures of assessment. The Review of Higher Education, 39(3), 395-429. doi:10.1353/rhe.2016.0022
Henning, G., & Roberts, D. (2016). Student affairs assessment: Theory to practice. Sterling, VA: Stylus Publishing.
Schuh, J. H., Biddix, J. P., Dean, L. A., & Kinzie, J. (2016). Assessment in student affairs. San Francisco, CA: Jossey-Bass.
Woolston, P. J. (2012, August). The costs of institutional accreditation: A study of direct and indirect costs. Los Angeles, CA. Retrieved from http://cpedinitiative.org/files/Woolston.pdf

Questions? Dialogue? Matthew Fuller, Ph.D. Forrest Lane, Ph.D. assessmentculture@shsu.edu forrest.lane@shsu.edu 936.294.3399 936.294.1147 @assessculture

Thank you for joining us today! Please remember to complete your customized online evaluation following the conference. See you in Philly in 2018!