Measuring Up on College-Level Learning
Margaret Miller, Project Director
September 2003
Measuring Up 2000
Learning in the States: Incomplete
[Add state map on incomplete]
State Efforts to Measure Learning (taxonomy: Peter Ewell, Change magazine)
Certification of individual students
– E.g., Texas's TASP, Florida's CLAST
Institutional assessment for improvement
– E.g., Tennessee's performance measures
– Missouri's accountability program
– Campus-based assessment
Institutional assessment for accountability
– E.g., South Dakota and Arkansas
National Attention to College-Level Learning
Pew's Quality of Undergraduate Education and writing assessment projects
Association of American Colleges and Universities' general education assessment project
Council for Higher Education Accreditation's project on institutional effectiveness
Secretary's Commission on Achieving Necessary Skills (SCANS) skills
Equipped for the Future
National Skills Standards Board
Key Questions
What do the state's college-educated citizens know, and what can they do that contributes to the social good? What kind of educational capital do they represent?
Key Questions (cont.)
How well do the state's public and private, two- and four-year colleges and universities collectively contribute to that capital? What do those whom they educate know, and what can they do?
Key Decisions
Whose learning will we measure?
What learning will we measure?
How will we use the information?
What strategies will we pursue?
Whose Learning
The college-educated in the states and college students
What Learning
National Education Goal 6: "By the year 2000, every adult American will be literate and will possess the knowledge and skills necessary to compete in a global economy and exercise the rights and responsibilities of citizenship."
What Learning (cont.)
National Goal 6, objective for college education: "The proportion of college graduates who demonstrate an advanced ability to think critically, communicate effectively, and solve problems will increase substantially."
Policy Purposes
Higher education policy and K-12 education + economic development + adult literacy policy
Direct Strategies
National Assessment of Adult Literacy
Graduate-admissions and licensing exams
General intellectual skills tests
National Assessment of Adult Literacy (NAAL), concludes 12/03
Disadvantages:
– Labor-intensive, expensive
– Decadal federal survey (timing)
– National sample only, except in 6 states
– Not what colleges think they teach
Advantages:
– Advanced literacy levels are a good measure of educational capital
– Assesses the general population
– Comparison group of non-college-educated
– Household survey, so respondent motivation is high
Existing Exams
Graduate-admissions exams
– Dental
– Graduate Management
– Graduate Record
– Law School
– Medical College
– Optometry
– Pharmacy
Licensing exams
– Clinical Pathology
– Dental Hygiene
– Occupational Therapy
– Physical Therapy
– Physician Assistant
– Nursing
– Respiratory Therapy
– Teaching
Existing Exams, data gathered by 03/04
Disadvantages:
– Selection bias
– Uneven coverage by discipline
– Variable (and sometimes small) numbers of test-takers in each state
– Most in health professions
Advantages:
– Established, credible instruments
– Highly motivated test-takers
– Admissions tests assess general intellectual abilities
– Availability
– Low cost
General Intellectual Skills Tests, administered fall 03
WorkKeys to a sample of two-year students in each state
– Applied Math
– Locating Information
– Reading for Information
– Business Writing
Collegiate Learning Assessment (CLA) to a sample of four-year students in each state
WorkKeys and CLA
Disadvantages:
– Institutional motivation
– Test-taker motivation
– Expense
Advantages:
– Excellent tests of general and functional intellectual skills
– Can impart useful information to student and school
Indirect Measures (NSSE/CCSSE co-administered with tests; CRS summer through fall 03)
National Survey of Student Engagement (NSSE)
Community College Survey of Student Engagement (CCSSE)
College Results Survey (CRS)
Surveys
Disadvantages:
– Not direct learning measures
– Not yet cross-correlated with direct measures
Advantages:
– Excellent and recently developed instruments
– Process measures could lead to improvement
– Both have face validity
– Respondent motivation good
Challenges
Political instability in states: gubernatorial, SHEEO
Personnel changes among key players
Institutional skepticism
Faculty resistance
Data-collection hurdles
Test-taker motivation
General Timeline
Measuring Up 2002: model tested with incomplete data from Kentucky
Five-state pilot to test assessment model: IL, KY, NV, OK, SC
Measuring Up 2004: publish the results of the pilot
Measuring Up 2006: if enough states adopt the model, grade states on learning
Reasons to Act
It is the right thing to do.
We can determine how to do it right.
This initiative will generate information useful to states, institutions, and students.
State-level analysis can promote collaborations to serve underachieving subpopulations or regions of the state.
State resources can be effectively targeted.