Responding to Academically Adrift: What Colleges Can Do
Richard Arum and Josipa Roksa
Inside Higher Ed Audio Conference, February 18, 2011
*We thank the Carnegie Corporation of New York and the Lumina, Ford and Teagle Foundations for their generous financial support and the Council for Aid to Education for collaboration and assistance with data collection.

Research Questions
Are students improving their critical thinking, complex reasoning, and writing skills during college?
What specific experiences and college contexts are associated with student learning?
How do disadvantaged groups of students fare in college with respect to learning?

Collegiate Learning Assessment (CLA) / Determinants of College Learning Dataset
Longitudinal design: Fall 2005, Spring 2007, Spring 2009, Spring 2010, Spring 2011 (planned)
Large scale: 24 diverse four-year institutions, 2,341 students; 29 diverse four-year institutions, 1,666 students
Breadth of information: family background and high school information, college experiences and contexts, college transcripts

Collegiate Learning Assessment (CLA)
Dimensions of learning assessed: critical thinking, complex reasoning, and written communication
Distinguishing characteristics: direct measures (as opposed to student reports); NOT multiple choice; holistic assessment based on open-ended prompts representing real-world scenarios
Used in other contexts: one of the measures of learning used by the VSA; will be utilized in 2016 by the OECD AHELO project

Performance Task (example)
You are the assistant to Pat Williams, the president of DynaTech, a company that makes precision electronic instruments and navigational equipment. Sally Evans, a member of DynaTech's sales force, recommended that DynaTech buy a small private plane (a SwiftAir 235) that she and other members of the sales force could use to visit customers. Pat was about to approve the purchase when there was an accident involving a SwiftAir 235.

Performance Task (example, cont.)
Students are provided with a set of materials (e.g., newspaper articles, Federal Accident Report, exchanges, description and performance characteristics of the SwiftAir 235 and another model, etc.) and asked to prepare a memo that addresses several questions: what data support or refute the claim that the type of wing on the SwiftAir 235 leads to more in-flight breakups, what other factors may have contributed to the accident and should be taken into account, and their overall recommendation about whether or not DynaTech should purchase the plane.

Course Requirements
Note: Based on Spring 2007 survey.

Students' Time Use
Note: Based on Spring 2007 survey.

Academic Commitment Over Time (source: Philip Babcock and Mindy Marks, forthcoming 2010)
Academic time in time diaries relatively constant (39.2 to 34.1)

CLA Gains (performance task)
First two years: 0.18 standard deviations – a 7 percentile point gain (0.47 sd, 18 percentile points over four years)
No statistically significant gains in critical thinking, complex reasoning, and writing skills for 45 percent of the students in the sample (36 percent over four years)

CLA Gains (performance task)
0.47 standard deviations – an 18 percentile point gain
No statistically significant gains for 36 percent of the students over four years
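The percentile figures on these slides follow from the standard-deviation gains under the usual assumption of normally distributed scores: a student who starts at the mean (50th percentile) and gains k standard deviations moves up to the normal CDF value of k. A quick illustrative sketch (not part of the study's code):

```python
# Convert a standard-deviation gain into percentile points gained by a
# student who started at the mean, assuming normally distributed scores.
import math

def percentile_point_gain(sd_gain):
    """Percentile points gained by a mean student under a normal model."""
    phi = 0.5 * (1.0 + math.erf(sd_gain / math.sqrt(2.0)))  # standard normal CDF
    return (phi - 0.5) * 100.0

print(round(percentile_point_gain(0.18)))  # two-year gain  -> 7
print(round(percentile_point_gain(0.47)))  # four-year gain -> 18
```

Here Φ(0.18) ≈ 0.57 and Φ(0.47) ≈ 0.68, which reproduces the 7- and 18-percentile-point figures cited.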

CLA Performance: Faculty Expectations and Reading/Writing Requirements
Note: Predicting 2007 CLA scores while controlling for 2005 CLA scores, student characteristics, and institutions attended.

CLA Performance: Studying and Fraternities/Sororities
Note: Predicting 2007 CLA scores while controlling for 2005 CLA scores, student characteristics, and institutions attended.

CLA Performance: College Major
Note: Predicting 2007 CLA scores while controlling for 2005 CLA scores.

Inequality in CLA Performance: Parental Education
Note: Based on a 3-level HLM model, controlling for a range of demographic/family characteristics.

Inequality in CLA Performance: African American vs. White
Note: Based on a 3-level HLM model, controlling for a range of demographic/family characteristics.

Institutional Variation
23 percent of CLA growth between 2005 and 2009 occurs across institutions
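"23 percent of CLA growth occurs across institutions" is a variance-decomposition statement: the share of variation in student gains that lies between institutions rather than within them. The study uses a 3-level HLM, which is more involved, but a simple between-group sum-of-squares share conveys the idea (all gain scores below are hypothetical):

```python
# Between-institution share of variance in student gain scores:
# between-group sum of squares as a fraction of the total sum of squares.
# The gain scores here are hypothetical, chosen only to illustrate.
def between_institution_share(groups):
    """SS_between / SS_total for a list of per-institution gain lists."""
    all_gains = [g for group in groups for g in group]
    grand = sum(all_gains) / len(all_gains)
    ss_total = sum((g - grand) ** 2 for g in all_gains)
    ss_between = sum(
        len(group) * ((sum(group) / len(group)) - grand) ** 2
        for group in groups
    )
    return ss_between / ss_total

# Hypothetical CLA gain scores for students at three institutions.
gains = [[10, 12, 14], [20, 22, 24], [30, 32, 34]]
print(f"{between_institution_share(gains):.2f}")  # -> 0.96
```

In this toy data the institutions differ far more than the students within them, so most of the variance is between institutions; the study's 23 percent figure says roughly a quarter of real CLA growth varies at the institution level.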


Summary of Findings
Students are experiencing low (and likely declining) levels of academic rigor.
Gains in student performance are disturbingly low in U.S. higher education.
Learning in U.S. higher education is characterized by persisting and/or growing inequality with respect to individual characteristics.
There is notable inequality in experiences and outcomes across U.S. institutions, associated with college selectivity.

Policy Recommendations
Federally imposed accountability would be counterproductive (existing measurements are imperfect; unintended consequences likely).
Federal resources could provide incentives for institutional improvement, innovation and assessment.
Federal resources are needed to develop the research infrastructure to advance scientific knowledge of learning in higher education.
Accountability should operate at lower levels in the system.

Promote organizational cultures emphasizing student learning – both symbolically and substantively:
Evaluate internal incentive structures
Support ongoing assessment of program quality and student learning outcomes
Develop plans for improvement
Monitor implementation of improvement plans
Align resource allocation decisions with academic goals
Work collaboratively – improving academic rigor and undergraduate learning is an issue that faculty, students and administrators should be able to work on together.

Faculty must assume individual and collective responsibility for ensuring adequate academic rigor across programs and classes – with reviews at the course, department and school level:
course requirements (e.g., levels of reading and writing)
course expectations (e.g., study hours)
grading standards
core curriculum
Faculty should have high expectations for their students and communicate those expectations clearly and consistently.

Internal deliberations are warranted to review the criteria used for decisions related to tenure, promotion and compensation:
Do we have the right balance in our weighting of faculty teaching, research and service?
Are we using multiple indicators to assess teaching quality (e.g., syllabi review, peer observation, samples of student work)?
Are the measures of instructional quality properly aligned with the goal of promoting academic rigor and student learning outcomes (i.e., not simply measures of student satisfaction)?

Institutional research is required for ongoing assessment of student academic experiences and learning outcomes. (Since students move across programs, institution-level mechanisms are required to monitor overall student academic experiences and outcomes.)
Institutional teaching and learning support services are needed for faculty improvement efforts. (Since faculty often are not trained to teach.)
Align student support services with the goal of promoting student academic performance, not just social engagement or student retention, wellbeing and consumer satisfaction.

Communicate clearly and consistently to students the value of academic engagement and the goal of promoting the attitudes, dispositions and higher-order skills (i.e., not just subject-specific knowledge) essential for economic success, civic engagement and adult status.
Communicate high expectations clearly and consistently, and make clear that students ultimately have to take responsibility for their own learning.

Richard Arum Josipa Roksa