Can Institutions Really Be Compared Using Standardised Tests? Presented at the 2008 Meeting of the European Association for Institutional Research, Copenhagen, Denmark, August 2008


Can Institutions Really Be Compared Using Standardised Tests?
Presented at the 2008 Meeting of the European Association for Institutional Research, Copenhagen, Denmark, August 2008
By Trudy W. Banta, Professor of Higher Education and Senior Advisor to the Chancellor for Academic Planning and Evaluation, Indiana University-Purdue University Indianapolis, 355 N. Lansing St., AO 140, Indianapolis, Indiana
iupui.edu

Standardized tests of generic skills cannot provide valid comparisons of institutional quality. © TWBANTA-IUPUI

 Standardized tests do have a place in assessing learning.
 Higher education must be accountable.

My History
 Educational psychology
 Program evaluation & measurement
 Performance funding in Tennessee
 1990 USDOE effort to build a national test
 1992 Initiated evidence-based culture at IUPUI

Group Assessment Has Failed to Demonstrate Institutional Accountability
Focus on improvement at unit level
Rare aggregation of data centrally
Too few faculty involved
HE scholars focused on K-12 assessment

Now We Have the Press to Assess with a Test

2006 Commission on the Future of Higher Education
 We need a simple way to compare institutions
 The results of student learning assessment, including value-added measurements (showing skill improvement over time), should be... reported in the aggregate publicly.

OECD’s AHELO for 10 HEIs from 3-4 countries
1. Generic skills (CLA)
2. Disciplines (Engineering and Economics)
3. Value added
4. Contextual information indicators

Two-Pronged Strategy in Washington
1. Pressure accreditors
2. Voluntary System of Accountability
- NASULGC
- AASCU

Voluntary System of Accountability
Report scores in critical thinking, written communication, and analytic reasoning using CAAP, MAPP, or CLA

Collegiate Assessment of Academic Proficiency (6 independent modules)
Reading
Writing Skills
Writing Essay
Mathematics
Science
Critical Thinking

Measure of Academic Proficiency & Progress
Humanities
Social Sciences
Natural Sciences
College-Level Reading
College-Level Writing
Critical Thinking
Mathematics
Total Score

Collegiate Learning Assessment
Performance and Analytic Writing Tasks measuring
Critical Thinking
Analytic Reasoning
Written Communication
Problem Solving

Standardized tests CAN initiate conversation

Advantages of standardized tests of generic skills
 promise of increased reliability & validity
 norms for comparison

Limitations of standardized tests of generic skills
 cannot cover all a student knows
 narrow coverage, need to supplement
 difficult to motivate students to take them!

TN = Most Prescriptive (5.45% of Budget for Instruction)
1. Accredit all accreditable programs (25)
2. Test all seniors in general education (25)
3. Test seniors in 20% of majors (20)
4. Give an alumni survey (15)
5. Demonstrate use of data to improve (15)
Total: 100 points

At the University of Tennessee
CAAP
Academic Profile (now MAPP)
COMP (like CLA, and withdrawn by 1990)
College BASE

In TN We Learned
1) No test measured 30% of gen ed skills
2) Tests of generic skills measure primarily prior learning
3) Reliability of value added = .1
4) Test scores give few clues to guide improvement actions
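The ".1" figure tracks a classical test-theory result: when freshman and senior scores on the same trait are highly correlated, their difference is mostly noise. A minimal sketch of the equal-variance gain-score reliability formula, with illustrative reliabilities and correlation assumed (not the Tennessee data):

```python
# Sketch: why value-added (gain) scores can be so unreliable.
# Classical test theory: for equally variable tests X (freshman) and
# Y (senior), the reliability of the gain D = Y - X is
#   rho_D = (mean test reliability - r_xy) / (1 - r_xy)
# The numbers below are illustrative assumptions, not Banta's data.

def gain_score_reliability(rel_x, rel_y, r_xy):
    """Reliability of the difference Y - X, assuming equal variances."""
    return ((rel_x + rel_y) / 2 - r_xy) / (1 - r_xy)

# Two quite reliable tests (.85) whose scores correlate .83 across
# students, as pre/post measures of the same trait often do:
print(round(gain_score_reliability(0.85, 0.85, 0.83), 2))  # → 0.12
```

Even with individually reliable instruments, the gain score's reliability collapses toward the .1 reported above, because the pre/post correlation eats almost all of the true-score variance.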

VSA Instrument Reviews Revealed
Available tests of generic skills –
1) Lack test-retest reliability
2) Lack content validation (except with ACT/SAT)
3) Have no evidence that score gain is due to college experience

An Inconvenient Truth
.9 = the correlation between SAT and CLA scores of institutions; thus 81% of the variance in institutions’ scores is due to prior learning

How Much of the Variance in Senior Scores is Due to College Impact?
Student motivation to attend that institution (mission differences)
Student mix based on age, gender, socioeconomic status, race/ethnicity, transfer status, college major

How Much of the Variance in Senior Scores is Due to College Impact? (continued)
Student motivation to do well
Sampling error
Measurement error
Test anxiety
 College effects: 19%

Threats to Conclusions Based on Test Scores
1. Measurement error
2. Sampling error
3. Different tests yield different results
4. Different ways of presenting results
5. Test bias
6. Pressure to raise scores
- Daniel Koretz, “Measuring Up,” Harvard U. Press

Student Motivation
Samples of students are being tested
Extrinsic motivators (cash, prizes) are used
We have learned: Only a requirement and intrinsic motivation will bring seniors in to do their best

Concerns About Value Added
Student attrition
Proportion of transfer students
Different methods of calculating
Unreliability
Confounding effects of maturation
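The "different methods of calculating" concern can be made concrete: two common approaches, raw gain (senior minus freshman mean) and residual gain (senior mean minus a linear prediction from the freshman mean), can rank the same institutions differently. A sketch on invented numbers (the institutions, scores, and regression coefficients below are all hypothetical):

```python
# Synthetic illustration: raw gain vs. residual gain value-added
# calculations ranking three made-up institutions differently.
inst = {            # (freshman mean, senior mean), CLA-like scale
    "A": (1000, 1150),
    "B": (1200, 1295),
    "C": (1100, 1200),
}

# Method 1: raw gain = senior mean - freshman mean.
raw_gain = {k: s - f for k, (f, s) in inst.items()}

# Method 2: residual gain = senior mean minus a linear prediction
# from the freshman mean (slope and intercept assumed for the sketch).
def predicted_senior(freshman, slope=0.85, intercept=280):
    return slope * freshman + intercept

resid_gain = {k: s - predicted_senior(f) for k, (f, s) in inst.items()}

def rank(gains):
    """Institutions ordered from highest to lowest value added."""
    return sorted(gains, key=gains.get, reverse=True)

print(rank(raw_gain))    # → ['A', 'C', 'B']
print(rank(resid_gain))  # → ['A', 'B', 'C']
```

The same data, honestly computed two ways, swap B and C, which is exactly why a published "value added" number is hard to interpret without knowing the method behind it.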

Recent University of Texas Experience
 30-40% of seniors at flagships earn highest CLA score (ceiling effect)
 flagship campuses have lowest value-added scores

Other Recent Critiques
 Cross-sectional design incorporates too many confounding variables
~ OR ~
 Freshman scores generally are NOT randomly drawn from the same population as senior scores

Word from Measurement Experts
“Given the complexity of educational settings, we may never be satisfied that value added models can be used to appropriately partition the causal effects of teacher, school, and student on measured changes in standardized test scores.”
- Henry Braun & Howard Wainer, Handbook of Statistics, Vol. 26: Psychometrics, Elsevier, 2007

Better Ways to Demonstrate Accountability
1. Performance Indicators
 Access, social mobility
 Diversity
 Workforce development
 Economic development
 Engaging student experience

Better Ways to Demonstrate Accountability
2. Measures of Learning
 Standardized tests in major fields
 Internship performance
 Senior projects
 Electronic portfolios
 External examiners

AAC&U Essential Learning Outcomes
 Knowledge of human cultures and the physical and natural world
 Intellectual and practical skills (writing, thinking, teamwork)
 Personal and social responsibility
 Integrative learning

Expensive Alternatives?
 Agreement on outcomes
 Agreement on standards of achievement
 Peer review

Accompanying Benefits
Teach faculty how to develop better classroom assessments
Involve faculty in using results to improve learning
More collaboration across disciplines and institutions
Closer ties with community