Exploring the Influence of External Standards of Institutional Effectiveness on Program Assessment in Student Affairs
Archie P. Cubarrubia, Ed.D., The George Washington University
Overview
- Purpose of the Study
- Context
- Problem of Practice
- Problem of Research
- Literature
- Methodology
- Results
- Themes and Implications
- Discussion
Purpose
The purpose of this study was to explore the influence of external standards of institutional effectiveness on program assessment in student affairs.
Context
- Era of assessment and accountability in higher education
- Pressure to provide more and better evidence of program effectiveness in student affairs
- “Without assessment, student affairs is left only to logic, intuition, moral imperatives, goodwill, or serendipity in justifying its existence” (Upcraft & Schuh, 1996, p. 12).
Problem of Practice
- There is not a direct line of sight between assessment activities in student affairs and the critical work functions of the institution, and vice versa.
- There is a lack of alignment between how program effectiveness in student affairs is demonstrated and how institutional effectiveness is evaluated.
Problem of Research
- There is limited research on student affairs assessment in general (Doyle, 2004; Green, Jones, & Aloi, 2008).
- Although accountability is frequently cited as an impetus for conducting assessment in higher education, there is limited empirical research on the impact of external accountability standards on how student affairs practitioners assess program effectiveness.
Literature Review
- Assessment as part of the growing public policy agenda of higher education accountability
  - Pressure at the federal level
  - Pressure at the state level
  - Pressure from the general public
  - Responses from the higher education community
- Standards of institutional effectiveness
  - Accrediting agencies
  - State higher education agencies
  - Professional organizations
- Assessment in higher education
  - Challenges
  - Dualisms in assessment
- Assessment in student affairs
Research Questions
1. In what ways do student affairs functional areas assess program effectiveness? Is there a relationship between incidence of program assessment and institution region, type, or size?
2. What is the primary purpose of program assessment in student affairs? Are there differences by institution region, type, or size?
3. What are the influences on program assessment in student affairs?
4. What is the relative influence of accreditation standards, state higher education standards, and professional standards on program assessment in student affairs?
Research Design
- Exploratory study
- Web-based data collection
- 22-item survey developed by the researcher and adapted from two instruments:
  - Woodard, Hyman, von Destinon, and Jamison (1991)
  - Bridges and Cubarrubia (2006)
- Love and Estanek’s (2004) dualisms of assessment served as the conceptual framework for the study
  - Dualism in purpose: assessment can be conducted for accountability or for improvement.
- Statistical analyses included descriptive statistics, cross-tabulations, chi-square tests of independence, and ANOVAs, as appropriate.
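To illustrate the kind of analysis listed above, here is a minimal sketch of a chi-square test of independence on a 2x2 cross-tabulation, of the sort used to check for associations such as institution type versus stated purpose of assessment. The counts below are invented for illustration and are not the study's data.

```python
# Sketch of a chi-square test of independence on a hypothetical 2x2 table.
# Rows = institution type; columns = stated primary purpose of assessment.
# These counts are made up for illustration only.
import math

observed = [[120, 80],   # public four-year: improvement, accountability
            [130, 40]]   # private four-year: improvement, accountability

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
grand_total = sum(row_totals)

# Expected counts under independence, accumulated into the chi-square statistic
chi2 = 0.0
for i, row in enumerate(observed):
    for j, obs in enumerate(row):
        expected = row_totals[i] * col_totals[j] / grand_total
        chi2 += (obs - expected) ** 2 / expected

# For a 2x2 table df = 1, and the p-value has a closed form via erfc
p_value = math.erfc(math.sqrt(chi2 / 2))
print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}")
```

A p-value below the conventional .05 threshold would indicate that purpose of assessment is associated with institution type in the hypothetical table.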
Population and Sample
- Population: director-level student affairs administrators at institutions of higher education in the United States
- Sampling frame: student affairs administrators who were members of NASPA with “director” in their title
Sample Information
Demographics of the final sample generally reflected characteristics of the entire NASPA membership.
- Initial sample: 1,538 individuals
- Final sample: 1,472 individuals
- Initial response rate: 30.9% (n = 455)
- Final response rate: 29.8% (n = 438)
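The reported rates can be reproduced by dividing the response counts by the final sample of 1,472 (an inference from the numbers above, presumably the initial sample of 1,538 minus undeliverable or ineligible contacts):

```python
# Reproducing the reported response rates; the choice of 1,472 as the
# denominator is an inference from the figures above, not stated in the slides.
final_sample = 1472
initial_responses = 455
final_responses = 438

initial_rate = 100 * initial_responses / final_sample
final_rate = 100 * final_responses / final_sample
print(f"Initial response rate: {initial_rate:.1f}%")  # 30.9%
print(f"Final response rate: {final_rate:.1f}%")      # 29.8%
```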
Sample Information, continued
Institution Type
Note. N = 432. Includes two-year institutions.

Institution Type       n      %
Public four-year     230   53.2
Private four-year    189   43.8
Public two-year       13    3.0
Private two-year       -      -
Sample Information, continued
Institution Size
Note. N = 417.

Total Undergraduate Student Enrollment     n      %
Under 1,000                               18    4.3
1,000-2,499                               70   16.8
2,500-4,999                               66   15.8
5,000-9,999                               63   16.5
10,000-19,999                             77   18.5
20,000-29,999                             74   17.7
30,000-40,000                             24    5.8
Over 40,000                               19    4.6
Sample Information, continued
Region
Note. N = 415.

Regional Accrediting Agency                            n      %
Middle States Association of Colleges and Schools     77   18.6
New England Association of Schools and Colleges       45   10.8
North Central Association of Colleges and Schools    133   32.0
Northwest Commission on Colleges and Universities     26    6.3
Southern Association of Colleges and Schools          91   21.9
Western Association of Schools and Colleges           43   10.0
Sample Information, continued
Functional Areas Represented

Functional Area                                     n      %
Student Activities                                110   27.6
Leadership Programs                               105   26.4
Orientation and New Student Programs              101   25.4
Residence Life and Housing                         98   24.6
Multicultural Student Services                     60   15.1
Judicial Affairs                                   56   14.1
Greek Affairs                                      55   13.8
Community Service and Service Learning Programs    49   12.3
College or Student Unions                          43   10.8
Career Development                                 40   10.1
RQ1 Findings: In what ways do student affairs functional areas assess program effectiveness?
INCIDENCE
- 95.9% of respondents indicated that they conduct some form of program assessment or evaluation.
- No association was found between incidence of program assessment and institution region, type, or size.
RQ1 Findings, continued
COMPONENTS OF ASSESSMENT
The majority of respondents were either likely or very likely to assess all components of Schuh, Upcraft, and Associates’ (2001) comprehensive assessment model.
- 66.1% were very likely to assess student satisfaction.
- 64.3% were very likely to assess program use.
- 56.4% were very likely to assess program outcomes/student learning.
- 53.7% were very likely to assess student and other stakeholder needs.
RQ1 Findings, continued
INSTRUMENTS AND MEASURES USED

Instrument or Measure                Not Likely at All   Somewhat Likely   Likely   Very Likely     n
Nationally normed survey                        31.0%             17.5%    18.5%         32.9%    416
Locally developed student survey                 9.4%             12.1%    27.1%         51.3%    413
Student interviews or focus groups              10.4%             17.9%    35.3%         36.5%    414
Program participation rates                      2.9%             13.6%    31.1%         52.4%    412
External benchmarking                           21.6%             31.3%    23.3%         23.8%      -
RQ1 Findings, continued
CAMPUS ATTITUDES TOWARD ASSESSMENT

Attitude                              Institution (n = 434)   Division (n = 433)
Views assessment with disdain                          0.5%                 0.7%
Not supportive and uninterested                        1.4%                 2.6%
Barely supportive and interested                       9.8%                 9.6%
Somewhat supportive and interested                    46.3%                39.2%
Very supportive and interested                        42.0%                47.8%
RQ1 Findings, continued
CAMPUS SUPPORT FOR ASSESSMENT

Is there a dedicated office for student affairs program assessment…     Yes      No      n
… at the institution level?                                           48.2%   51.8%    409
… at the division level?                                              33.2%   66.8%    413
… at the department level?                                            24.3%   75.7%    408
… at the program level?                                               20.4%   79.6%    406
RQ2 Findings: What is the primary purpose of program assessment in student affairs?
RATIONALES FOR CONDUCTING PROGRAM ASSESSMENT

Rationale                                                Not Likely at All   Somewhat Likely   Likely   Very Likely     n
To justify programs                                                 14.5%             24.8%    37.1%         23.6%    415
To improve program quality                                           0.5%              3.9%    26.3%         69.3%    414
To gauge program affordability and cost effectiveness               13.8%             33.7%    34.6%         17.9%    413
To support the department's strategic planning process               5.5%             14.4%    30.0%         50.0%    416
To inform policy development                                        13.6%             23.5%        -         39.3%    412
To satisfy institutional requirements                               16.4%             29.4%    33.3%         21.0%      -
RQ2 Findings, continued
PURPOSES OF CONDUCTING ASSESSMENT
- 66.7% of respondents indicated that improvement was their primary reason for conducting program assessment.
- Public four-year institutions were more likely than private four-year institutions to indicate accountability as the primary reason for conducting program assessment.
- There were no differences in the stated purposes of assessment by region or by institution size.
RQ2 Findings, continued
AUDIENCES FOR RESULTS OF PROGRAM ASSESSMENT

Audience                           Not Likely at All   Somewhat Likely   Likely   Very Likely     n
Students                                      11.5%             27.8%    33.5%         27.3%    418
Faculty                                       14.8%             30.6%    30.1%         24.4%      -
Staff                                          2.6%             12.7%    34.1%         50.6%    417
Senior Administration                          1.9%              6.7%    26.0%         65.3%    415
Board of Trustees                             31.2%             23.3%    18.2%             -      -
Accrediting Agency                            33.2%             23.1%    21.4%         22.4%    416
State Board of Higher Education               59.3%             19.5%    14.0%          7.2%      -
Professional Organizations                    37.8%             33.0%    19.8%          9.4%      -
General Public                                48.3%             28.1%    17.5%          6.0%      -
RQ3 Findings: What are the influences on program assessment in student affairs?
RESPONDENTS’ FAMILIARITY WITH ASSESSMENT AND EXTERNAL STANDARDS OF INSTITUTIONAL EFFECTIVENESS

Standard                                                         Not at All Familiar   Somewhat Familiar   Familiar   Very Familiar   Extremely Familiar     n
Standards issued by the agency that accredits your institution                 15.0%               30.9%      26.3%           17.9%                 9.9%   414
Standards issued by your state board of higher education                       40.5%               26.2%      20.1%            9.7%                 3.4%   412
Standards issued by professional organizations                                  2.9%               17.1%      26.0%           32.3%                21.7%   415
RQ3 Findings, continued
CONDITIONS UNDER WHICH PROGRAM ASSESSMENT WAS INITIATED
Note. N = 416.

Condition                                   n      %
Desire to improve programs                346   83.2
Student retention issues                  127   30.5
External demands for accountability       123   29.6
Concerns about departmental performance    72   17.3
Institutional reorganization               45   10.8
Financial difficulty                       35    8.4
Don't know                                 10    2.4
RQ3 Findings, continued
INTERNAL AND EXTERNAL INFLUENCES ON PROGRAM ASSESSMENT

Influence                                                        Not Influential at All   Somewhat Influential   Influential   Very Influential     n
Departmental mission                                                              1.9%                  12.8%         31.3%              54.0%    415
Divisional mission                                                                7.0%                  17.6%         39.4%              36.0%    414
Institutional mission                                                             5.3%                  21.5%             -              36.6%    413
Departmental strategic plan                                                       8.5%                   9.9%         36.8%              44.8%      -
Divisional strategic plan                                                        13.0%                  18.6%         39.1%              29.2%      -
Institutional strategic plan                                                     12.0%                  22.5%         36.2%              29.3%    409
Standards issued by the agency that accredits your institution                   25.2%                  32.0%         27.1%              15.6%    409
Standards issued by your state board of higher education                         42.3%                  31.8%         19.1%               6.8%      -
Standards issued by professional organizations                                   11.9%                  33.4%         36.6%              18.2%    413
RQ4 Findings: What is the relative influence of accreditation standards, state higher education standards, and professional standards on program assessment in student affairs?
PERCEIVED INFLUENCE OF EXTERNAL STANDARDS

Standard                                                         Not Influential at All   Somewhat Influential   Influential   Very Influential     n
Standards issued by the agency that accredits your institution                   25.2%                  32.0%         27.1%              15.6%    409
Standards issued by your state board of higher education                         42.3%                  31.8%         19.1%               6.8%      -
Standards issued by professional organizations                                   11.9%                  33.4%         36.6%              18.2%    413

USE OF EXTERNAL STANDARDS

Standard                                                         Do Not Use at All   Seldom Use   Usually Use   Regularly Use     n
Standards issued by the agency that accredits your institution               31.7%        31.0%         25.4%           11.9%    413
Standards issued by your state board of higher education                     47.2%        30.4%         17.0%            5.4%    411
Standards issued by professional organizations                                8.5%        25.1%         45.9%           20.5%    414
RQ4 Findings, continued
- There were differences in the perceived influence and use of external standards by region. Institutions under the jurisdiction of SACS were more likely to consider external standards influential to their program assessment efforts, and were also more likely to use external standards in those efforts.
- There were also differences in the perceived influence and use of external standards by institution type. Public four-year institutions were more likely to consider external standards influential to their program assessment efforts.
- Institution size did not appear to be a factor in either the perceived influence or the use of external standards.
Themes
Theme 1: The incidence of program assessment in student affairs has increased but has remained focused on program improvement.
Implications
Implications for student affairs professional organizations:
Student affairs professional organizations must establish a coordinated, comprehensive student affairs assessment agenda that:
- Is aligned with other accountability activities at the state and national levels (e.g., ICSSIA, VSA, U-CAN)
- Will serve as the framework for accountability activities across the entire profession
  - Enhance and increase the use of CAS standards in accreditation and state reviews
- Will be the basis for a robust research agenda
  - Empirical research is still limited; increasing research will allow practitioners to develop a common language around assessment.
Themes
Theme 2: Assessment efforts in student affairs are not connected to institutional processes that ensure effectiveness.
Implications
Implications for senior administrators:
Senior administrators must create a culture of accountability across the entire institution:
- Increase capacity and create a systems infrastructure that provides opportunities for cross-functional collaboration and that rewards assessment
- Require all units to assess program effectiveness within an established institutional framework
- Report the results of assessment to a broader group of stakeholders to gain credibility and external support for assessment activities
Implications
Implications for director-level administrators:
Directors must create a culture of accountability by adopting what Love and Estanek (2004) call an “assessment mindset” and expanding the scope of assessment activities beyond program improvement:
- Integrate assessment into all aspects of the program cycle
- Actively use external standards to develop program activities
- Increase personal expertise in assessment
Themes
Theme 3: External standards do not promote a direct line of sight between critical work activities within student affairs functional areas and outcomes of institutional effectiveness.
Implications
Implications for accrediting agencies:
- Agencies should ensure that their standards address the critical work functions of student affairs functional areas more explicitly.
- Accreditation standards should include other outcome measures.
- Accrediting agencies should include student affairs staff in accreditation review teams more intentionally.
Implications
Implications for state higher education agencies:
- State agencies should reevaluate their standards and ensure that measures appropriately consider outcomes resulting from the critical work functions of student affairs.
- State agencies should partner with accrediting agencies and professional organizations to better coordinate evaluation activities.
- State agencies should communicate more effectively with institutions regarding state priorities for higher education and how the outcomes and measures used relate to those priorities.
Ancillary Findings
- Familiarity and level of expertise of director-level administrators regarding assessment
- Functional areas that are considered part of student affairs
Design Issues and Limitations
Limitations in sampling:
- Representativeness of the sampling frame in relation to the intended population
- Inclusion of several types of institutions
- Assumption that the population of interest is the most appropriate audience
Limitations in research design:
- Low response rate
- Response bias
- Reliance on self-reported data from a sample whose levels of expertise with assessment were not controlled
Future Research
- How standards are used in student affairs
- Details of program assessment practice by functional area: What can student affairs functional areas learn from each other?
- Analysis by accrediting agency and state: Which accrediting agency or state standards seem to have the most impact on assessment practice in student affairs?
- How external standards can promote a more direct line of sight between program assessment in student affairs and evaluations of institutional effectiveness: What types of policies best facilitate the alignment of standards with actual performance?
Discussion
- What measures of success do you use in your programs? How did you develop these measures?
- Are your measures connected to standards used by accrediting agencies and state higher education agencies?
- To whom do you report the results of your program assessment?
- What measures of success should we use?
- How can we better align our assessment activities in student affairs with what’s expected by our stakeholders?
- How can we better advocate for improving external standards to take into account what’s already happening at the program level?
Contact