Responding to Calls for Greater Accountability

Presentation transcript:

The VSA Project: Responding to Calls for Greater Accountability
Christine M. Keller, NASULGC
SHEEO-NCES Network Conference, April 17, 2008

Good afternoon, I am Christine Keller. Who has heard of the VSA? Are any of you from institutions that are participating in the VSA? Today I will give an overview of the background, development, and goals of the VSA, walk through the sections of the College Portrait, and leave some time for questions.

Commission on the Future of Higher Education
“Lack of useful data and accountability hinder policymakers and the public from making informed decisions and prevents higher education from demonstrating its contribution to the public good.” (Final report, September 2006)

Background and context: this quote from the final report of the national Commission on the Future of Higher Education (the Spellings Commission) summarizes the environment in which the VSA was developed. Those of us with an institutional research background might take exception to the claim about a lack of data: what about IPEDS? The key point is USEFUL data; there may be too much complicated data for people to sort through. And for associations made up of public institutions, funded in part by public monies, the final phrase was particularly unsettling.

The Voluntary System of Accountability (VSA℠)
Demonstrate accountability and stewardship to the public
Assemble information that is transparent, comparable, and understandable
Measure educational outcomes to identify and enhance effective educational practices

Within that environment, the VSA, or Voluntary System of Accountability, was developed as our response. These are the three primary goals of the VSA project. Note the third goal: the VSA is not just about measurement and reporting but about using the information to improve the education of students.

AASCU & NASULGC Partnership
Represent over 525 public, 4-year institutions
Enroll 7.5 million students annually
Award 70% of bachelor's degrees in the U.S. each year

The VSA is a joint project between two higher education associations whose members are four-year public institutions: the American Association of State Colleges and Universities (AASCU) and the National Association of State Universities and Land-Grant Colleges (NASULGC). Together they represent a large segment of undergraduate education in the U.S.

Development of VSA
Lumina Foundation grants
Higher education community
7 committees, 20 meetings, 70 public institutions, 82 committee members

The VSA framework was outlined by AASCU and NASULGC, but the details of the system were created through the hard work of 82 exceptional committee members (presidents, provosts, student affairs officers, and IR directors) and refined through feedback and dialogue with the broader higher education community. The work was supported by two grants from the Lumina Foundation ($267K and $317K).

College Portrait
5-page web reporting template
Standard, comparable format
Most elements compiled from currently available data sources
Data elements selected based on focus groups, input from the higher education community, and research

Audience
Students and families
Public
Policymakers, legislators
Accreditors, institution and state boards
Campus faculty and staff

The College Portrait serves several different audiences, which sometimes made the selection and presentation of the data elements challenging. However, the primary audience is the first one listed: prospective students and their families.

College Portrait Components
Consumer Information
Student Experiences and Perceptions
Student Learning Outcomes

We will walk through the three main sections of the College Portrait. Institutions can customize the template with MORE links and text boxes on pages 1, 3, and 5.

Consumer Information
Student and institutional characteristics
Success and Progress rate
Cost of Attendance and Financial Aid
College Cost Calculator
Post-graduation plans
Timeline: 3 months to 1 year

Student and institutional characteristics include the size and demographics of the student body, class size, campus safety, student life activities, and the most popular degree areas, scattered throughout the first three pages. Cost of attendance and financial aid appear primarily on page 2, along with the post-graduation plans of seniors: employment, graduate school, military, and so on.

The College Cost Calculator (based on the calculator used by the University of Texas system) is an online tool that lets prospective students and their families enter information about their financial circumstances and get back an estimate of the financial aid they would likely receive at a particular institution. The calculator can be installed and customized by each participating VSA institution and is optional. The public overestimates the cost of attending college, by as much as 30% for public institutions according to some studies, and lower-income students often don't even consider college as an option.

The Student Success and Progress rate draws on data available through the National Student Clearinghouse to show completion and enrollment rates for students both at the original institution and at other institutions they may attend and graduate from. This is intended to shift the focus away from the single graduation-rate number, which is often fairly low for public institutions (it seems to hover around 50 or 60%), toward a more complete picture of student progress and success. That matters because the majority of students today attend more than one institution before they graduate.
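The Success and Progress rate described above can be illustrated with a small sketch. This is a hypothetical simplification, not the VSA's or the Clearinghouse's actual methodology; the record fields and category names are assumptions for illustration only.

```python
from collections import Counter

def success_and_progress(records):
    """Classify each student record into an outcome category and
    return the share of the cohort in each category.

    Each record is a dict with hypothetical boolean fields:
    'graduated_here', 'graduated_elsewhere',
    'enrolled_here', 'enrolled_elsewhere'.
    """
    def classify(r):
        if r.get("graduated_here"):
            return "graduated at original institution"
        if r.get("graduated_elsewhere"):
            return "graduated at another institution"
        if r.get("enrolled_here"):
            return "still enrolled at original institution"
        if r.get("enrolled_elsewhere"):
            return "still enrolled at another institution"
        return "not enrolled, not graduated"

    counts = Counter(classify(r) for r in records)
    total = len(records)
    return {category: n / total for category, n in counts.items()}

# Toy cohort of four students, one per outcome
cohort = [
    {"graduated_here": True},
    {"graduated_elsewhere": True},
    {"enrolled_elsewhere": True},
    {},
]
rates = success_and_progress(cohort)
```

The point of the multi-category breakdown is exactly what the notes describe: a student who transfers and graduates elsewhere counts as a success here, rather than vanishing into a single low graduation-rate number.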

Student Experiences & Perceptions
Institution-specific assessments and national student surveys
Choose from four surveys: CSEQ, CSS (CIRP), NSSE, UCUES
Survey responses reported for VSA-selected questions under six common constructs
Timeline: within 2 years

This section is not intended to show gains but to paint a picture of student experiences on campus that is roughly comparable across campuses. Institutions can link to institution-specific assessments at the top of the page. The remainder of the page reports responses from seniors who completed one of four surveys: the College Student Experiences Questionnaire (CSEQ), out of Indiana University; the College Senior Survey (CSS), part of the Cooperative Institutional Research Program (CIRP) housed at the Higher Education Research Institute at UCLA; the National Survey of Student Engagement (NSSE), also housed at Indiana University; and the University of California Undergraduate Experience Survey (UCUES).

Each VSA institution reports responses to the same questions from one of the four surveys. The questions were selected and grouped within six areas that research tells us are correlated with student learning and success: group learning, active learning, interaction with faculty, experiences with diversity, institutional commitment to student success, and student satisfaction. The six standard areas allow for at least rough comparisons across institutions and surveys.

Student Learning Outcomes
Institution-specific assessments and educational outcomes
Pilot project to measure learning gains in critical thinking, analytic reasoning, and written communication at the institution level
Choose from three instruments: CAAP, CLA, MAPP
Timeline: within 4 years

The third section of the College Portrait template reports evidence of student learning in two ways. First, at the top of the page, institutions provide a description of how they evaluate student learning, with links to institution-specific outcomes data such as pass rates for professional licensure exams, assessments in the major or within general education, or whatever learning outcomes the institution thinks are appropriate. Second, the bottom half of the page reports results from a VSA pilot project designed to measure student learning gains in critical thinking, analytic reasoning, and written communication. An institution can select one of three instruments to measure these broad cognitive skills: CAAP, CLA, or MAPP.

Two important points about this section: the skills are measured at the institution level across all academic disciplines, and the results are intended to be comparable across institution types. It is a pilot project, with a four-year trial period, because many public institutions have not previously measured these broad cognitive skills at the institutional level and then analyzed the results to report learning outcomes in this manner.

Results are described on the College Portrait template in two ways: (1) learning gains between the freshman and senior years, the value-added component; and (2) the actual average test scores for freshmen and seniors. Learning gains, or value-added scores, appear on the left-hand side and reflect the difference between the actual and expected scores of graduating and entering students, taking into account the academic ability of the students. Each of the three testing organizations will use the same method to compute and characterize value-added scores for VSA purposes: Well Above Expected, Above Expected, At Expected, Below Expected, and Well Below Expected. The comparison is cross-sectional. The reporting of actual average scores shows whether the average score of seniors is higher than the average score of freshmen. Since the range of scores varies across the three instruments, the results do not allow direct comparisons between instruments.
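The value-added reporting described above can be sketched as a mapping from the gap between actual and expected senior scores into the five VSA bands. The standardization step and the cut points below are assumptions made for illustration; the slides do not specify the testing organizations' exact formula.

```python
def value_added_category(actual, expected, std_error):
    """Map the difference between actual and expected average senior
    scores into one of the five VSA reporting bands.

    The two-standard-error thresholds here are hypothetical; in
    practice each testing organization applies agreed-upon cut points.
    """
    gap = (actual - expected) / std_error  # standardized difference
    if gap > 2:
        return "Well Above Expected"
    if gap > 1:
        return "Above Expected"
    if gap >= -1:
        return "At Expected"
    if gap >= -2:
        return "Below Expected"
    return "Well Below Expected"

# e.g. seniors averaged 1190 where 1150 was expected, std error 25
category = value_added_category(1190, 1150, 25)  # gap = 1.6
```

Whatever the exact thresholds, the design choice is the same one the slides describe: reporting a coarse category rather than a raw score difference, so that results from three instruments with different score ranges can be characterized in a common vocabulary.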

VSA Update
234 universities registered from 45 states/territories (4/14/08)
Public launch in 3 to 4 months
Accessible version under development
VSA Oversight Board established to guide future direction

The 234 registrants represent 44% of the 525 NASULGC/AASCU members. Participating systems include the University of Wisconsin, Cal State, the University of North Carolina, the University of Texas, Texas A&M, the University of Louisiana, and PASSHE. From the Big 12 (75% participation): University of Kansas, University of Missouri, Iowa State, Oklahoma State, UT Austin, Texas Tech, Texas A&M, and University of Nebraska. From the Big Ten (50%): University of Iowa, Purdue, Michigan State, University of Minnesota, University of Illinois, and University of Wisconsin. About 50% of AAU publics are participating.

Cal State Northridge is developing an accessible web-based version; the current PDF files are not as user friendly as we would like, and it will probably be ready by fall.

VSA Oversight Board members:
Larry Abele, Provost and Vice President for Academic Affairs, Florida State University
Linda Bennett, Provost & Vice President for Academic Affairs, University of Southern Indiana
Daniel Bradley, President, Fairmont State University
William (Brit) Kirwan, Chancellor, University System of Maryland
Jolene Koester, President, California State University-Northridge
Mitchel Livingston, Vice President, University of Cincinnati
Linda Mannering, Director, Institutional Research, University of Nebraska at Omaha
Keith Yehle, Director of Government Relations, University of Kansas

FIPSE Grant
$2.4 million grant shared by NASULGC, AASCU, and AAC&U
Research on student learning assessment
Three areas of focus: comparison of VSA measurement tools, measurement of student growth, development of an e-portfolio framework

NASULGC will coordinate efforts by educational researchers and leaders from the Educational Testing Service (ETS), the Council for Aid to Education (CAE), and the American College Testing Program, Inc. (ACT) to examine the extent to which the disparate measurement tools recommended as part of the VSA can be used interchangeably, whether these tools measure similar or dissimilar outcomes or levels of achievement, and the role that test format (e.g., multiple choice vs. open-ended/constructed-response measures) plays in the correlation among measures. This is a question of content validity.

AAC&U will lead an effort to develop an e-portfolio framework for assessing a wider array of learning outcomes than those measured by these tests. This part of the project will foreground practices that base assessments on authentic examples of student work collected over time in an e-portfolio. Titled VALUE, this research and development effort will collect and synthesize best practices in faculty-developed rubrics to highlight commonalities of outcomes and expectations of achievement levels across institutions. AAC&U also will develop models and templates through which e-portfolios can be used to demonstrate, share, and assess student accomplishment of advanced and integrative learning outcomes.

AASCU will lead the third part of the initiative: developing a validated survey instrument to measure changes in student growth, especially the development of competence in workplace skills and skills related to civic engagement, building on the work of the VSA student growth task forces.

More Information?
Website: www.voluntarysystem.org
Christine Keller, VSA Executive Director, ckeller@nasulgc.org