Research review and resources to help make your case. Northern California Convening of MMAP Pilot Colleges, October 28, 2015.


Being prepared

Excellent overviews
 MMAP Status Report coming very soon
 In the meantime:
– Burdman, 2012: Where to begin? The evolving role of placement exams for students starting college.
– Bracco et al., 2014: Exploring the use of multiple measures for placement into college-level courses: Seeking alternatives or improvements to the use of a single standardized test.

Know the law KKnow your matriculation handbook, esp. Chap 2: –“–“ Assessment is a holistic process through which each college collects information about students to facilitate their success by ensuring their appropriate placement into math, English, and ESL curricula. Student assessments should reflect a variety of informational sources that create a profile of a student’s academic strengths and weaknesses.” p. 2.3 CColleges must adhere to the following regulations and guidelines when implementing and managing any assessment instrument used for course placement: –…–… –C–Course placement recommendations must be based on multiple measures (sections 55502(i) and 55522(a)). Additional indicators of student readiness for math, English, and ESL course content must be used together with placement test results. p. 2.4

Know the law - II
 Know your Title 5, esp. Division 6 (CCCs), Subchapter 6 (Matriculation programs):
– 55502(i): “Multiple measures” are a required component of a district’s assessment system and refer to the use of more than one assessment measure in order to assess the student. Other measures that may comprise multiple measures include, but are not limited to, interviews, holistic scoring processes, attitude surveys, vocational or career aptitude and interest inventories, high school or college transcripts, specialized certificates or licenses, education and employment histories, and military training and experience. (See also 55522(a).)
– 55502(e): “Disproportionate impact” in broad terms is a condition where access to key resources and supports or academic success may be hampered by inequitable practices, policies, and approaches to student support or instructional practices affecting a specific group. For the purpose of assessment, disproportionate impact is when the percentage of persons from a particular racial, ethnic, gender, age, or disability group, who are directed to a particular service or course placement based on an assessment test or other measure, is significantly different from the representation of that group in the population of persons being assessed, and that discrepancy is not justified by empirical evidence demonstrating that the assessment test or other measure is a valid and reliable predictor of performance in the relevant educational setting.
– (Also (d)(2): Prerequisites or corequisites may be established only for any of the following purposes: (2) the prerequisite will assure, consistent with section 55002, that a student has the skills, concepts, and/or information that is presupposed in terms of the course or program for which it is being established, such that a student who has not met the prerequisite is highly unlikely to receive a satisfactory grade in the course (or at least one course within the program) for which the prerequisite is being established.)

Know the Academic Senate’s position
 The Academic Senate for California Community Colleges (ASCCC) passed a resolution in strong support of using multiple measures for placement (ASCCC, 2013).
 An ASCCC task force concluded that “inclusion of multiple measures in our assessment processes is an important step toward improving the accuracy of placement processes” (Grimes-Hillman, Holcroft, Fulks, Lee, & Smith, 2014, p. 7).

Know what advocates of standardized tests say
 “‘We’ve been advocating for almost everything that’s been indicated in the report [Pamela Burdman’s Where To Begin? The Evolving Role Of Placement Exams For Students Starting College] for quite a few years now,’ said David Parmele, executive director in the ACCUPLACER program for the College Board. … ‘We do not believe that the placement score alone should be the only factor used to decide a student’s placement into college-level classes,’ Parmele said, echoing a key aspect of the report—namely, how some systems are weighing the merits of moving away from the widespread practice of using the test scores as the only basis for assigning students to remedial classes and toward using multiple measures, such as high school grades.
 Mr. Parmele and Mr. Sconing [ACT assistant vice president for applied research] said both Accuplacer and Compass include tools to allow colleges to weigh test results along with other academic indicators, such as high school grades and course credits, and work with colleges to use broader measures of student readiness than just the test. Neither testing representative, however, knew how many of its client colleges actually use those tools.”

Know what advocates of standardized tests say - II  “The College Board agrees that the most successful placement models are those that take a comprehensive approach. This means utilizing the extensive range of tools available within ACCUPLACER to assess multiple variables, including high school GPA, to develop a more robust picture of a student’s preparation for college and careers.”  But it [the US Department of Education] also said that tests should be “just one of multiple measures” of student achievement, and that “no single assessment should ever be the sole factor in making an educational decision about a student, an educator or a school.”

Know the current state of affairs
 >92% of two-year institutions administer high-stakes placement exams (Hughes & Scott-Clayton, 2011)
 Only 21% of two-year institutions use anything other than an admissions or placement test in mathematics, 13% in reading (Fields & Parsad, 2012)
– Wide variability in cut scores, with those at two-year institutions typically higher than at four-year institutions
 68% of students in two-year institutions take at least one developmental education course (Scott-Clayton & Belfield, 2015)
 Placement below transfer level is a significant barrier to completion (Bailey, 2009; Bailey, Jeong, & Cho, 2010)
– <50% complete the sequence, ~30% never attempt a course in the sequence, and ~10% fail to re-enroll after successfully completing at least one course in the sequence
 50-60% of the equity gap in college completions occurs during assessment and matriculation (Stoup, 2015)

Conventional Wisdom  It is a problem with today’s students –Students are simply, vastly unprepared for college –Kids these days ….

That seems awfully familiar

Too familiar (Bye Bye Birdie – 1963)

The conventional wisdom is likely wrong  National Assessment of Educational Progress: at all-time highs in virtually every demographic category: bit.ly/NAEPInfo

The evidence mounts
 Research increasingly questions the effectiveness of current standardized assessment for understanding student capacity
– Little relation to college course outcomes (e.g., Belfield & Crosta, 2012; Edgecombe, 2011; Jaggars & Stacey, 2014; Scott-Clayton, 2012; Scott-Clayton & Rodriguez, 2012): bit.ly/CCRCAssess and bit.ly/DevEdOutcomes
– 20-35% of students in developmental education sequences are severely underplaced (e.g., would likely earn a B or better in the transfer-level course), with many more underplaced (Scott-Clayton & Belfield, 2015): bit.ly/CCRCPlacementAccuracy
– Underestimates the capability of students of color, women, first-generation college students, and low-SES students (Hiss & Franks, 2014): bit.ly/DefiningPromise
– May increasingly be confounded with income (Geiser, 2015): controlling for SES, the utility drops meaningfully

Potential change in placements

Implementing Multiple Measures Placement: Transfer-level Placement Rates LBCC F2012

SDCCD MMAP F2015 Pilot (N = ~1000)

But doesn’t that just flood transfer-level courses with unqualified students? …

Comparison against traditional sequence: Success rates in transfer-level courses. Neither of these differences approaches significance, p > .30
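The claim above rests on a comparison of two success rates. One standard way to test such a difference is a two-proportion z-test; the sketch below uses hypothetical counts, not the actual pilot data:

```python
# Hedged sketch: testing whether success rates differ between
# multiple-measures placements and traditional test-based placements.
# A two-proportion z-test is one standard approach; all counts here are
# hypothetical illustrations, not the MMAP pilot results.
from math import sqrt, erf

def two_prop_z(success_a, n_a, success_b, n_b):
    """Two-sided two-proportion z-test; returns (z statistic, p-value)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)          # pooled success rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))  # standard error
    z = (p_a - p_b) / se
    # Two-sided p via the normal CDF, Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p

# Hypothetical: 70% success among multiple-measures placements (n=200)
# vs 67% among test-based placements (n=400) -- a difference this small
# at these sample sizes is not statistically significant.
z, p = two_prop_z(140, 200, 268, 400)
print(round(z, 2), round(p, 2))
```

With samples of a few hundred students per group, success-rate differences of a few percentage points produce p-values well above conventional thresholds, which is consistent with the "p > .30" summary on the slide.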

Cohort 1 English 1 Success Rates by Placement (vs. 4 year completion)

Cohort 3: Success rates in transfer-level courses. English difference, p < .001

Overall Success Rate in Transfer Level English by Method of Qualification (among students with high school data available)

Success Rate by Method of Qualification in Transfer Level English

Sierra College F2014 Transfer-Level English Success Rates by Placement

… what about grade inflation/social promotion in HS?

Concerns about grade inflation and social promotion do not fit the evidence
 These concerns imply there should be little to no relation between HS grades and college grades, because HS grades would be unrelated to performance
– If everyone got As and Bs, there would be no variation left to predict outcomes
 Yet predictive utility is strongly observed
– Stronger than standardized tests
– Even in the test companies’ own research

Westrick & Allen, 2014: ACT COMPASS validation, median logistic R (Table 4)
[Table comparing predictive power (median logistic R) of the Compass test alone, HSGPA alone, and HSGPA + Compass for each course pairing: English 1/Writing Skills, Arithmetic/Pre-Algebra, Algebra/Pre-Algebra, Intermediate Algebra/Algebra, College Algebra/Algebra. Numeric values were not preserved in this transcript.]
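The validation above fits logistic regression models predicting course success from Compass scores and/or HSGPA. As a hedged sketch of what such a model looks like, here is a minimal pure-Python logistic fit on synthetic data (not the Westrick & Allen data or their code):

```python
# Hedged sketch: a minimal logistic regression of course success on HSGPA,
# the kind of model underlying the Westrick & Allen validation tables.
# Fit by gradient ascent on the log-likelihood; data are synthetic.
from math import exp

def fit_logistic(xs, ys, lr=0.1, epochs=5000):
    """Fit P(success) = 1 / (1 + exp(-(b0 + b1*x))) by gradient ascent."""
    b0, b1 = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1 / (1 + exp(-(b0 + b1 * x)))
            g0 += (y - p)          # gradient of log-likelihood w.r.t. intercept
            g1 += (y - p) * x      # gradient w.r.t. slope
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

# Synthetic data: success (1) tends to rise with HSGPA.
gpas =    [1.8, 2.0, 2.3, 2.5, 2.8, 3.0, 3.2, 3.5, 3.8, 4.0]
success = [0,   0,   0,   1,   0,   1,   1,   1,   1,   1]
b0, b1 = fit_logistic(gpas, success)
p_at_3_0 = 1 / (1 + exp(-(b0 + b1 * 3.0)))
# A positive slope means higher GPA predicts higher success probability.
print(b1 > 0, 0 < p_at_3_0 < 1)
```

The validation studies compare models like this one (GPA alone, test alone, both) by their fit statistics; the slide's point is that the HSGPA predictor carries most of the signal.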

Westrick & Allen, 2014: Conditional success rates for English 1 (Table 6)
[Table of English 1 success rates by Compass score (30 = extremely low to 90 = extremely high) and HSGPA band. Within each HSGPA band, rates vary little across the full Compass range: roughly 26-32% in the lowest GPA band, 47-55% in the middle band, and 69-75% in the highest.]

Evidence for grade inflation is weak at best
 Little evidence of grade inflation over the last decade
 Earlier observations of grade inflation may have been partly artifactual: adjustments to GPA for AP/IB/Honors courses
 Zhang & Sanchez, 2014

… didn’t that work only because Long Beach is special/has special relationship between LBCC and LBUSD?

Not just Long Beach
 LBCC now includes multiple additional districts, including ABC Unified and Los Alamitos Unified
 Long thread of research in the CCCs alone
– 2008: Willett, Hayward, & Dahlstrom
o 11th-grade HS variables as an early alert mechanism for discipline assessment
– 2011: Martinez
o Self-reported HS variables as more powerful predictors of college completion
– 2014: Willett & Karandjeff
o Replication of LBCC research with 12 additional colleges (STEPS)
 Replication of implementation
– Bakersfield College and Sierra College began similar implementations in 2014
 CCRC research
 MMAP statewide research & local replications: bit.ly/MMAP2015
– MMAP pilot colleges

… we’re happy with our placement. Why should we change?

Powerful reasons for change: 1) Basic assessment theory and methods
 Self-reported satisfaction with assessment by instructors and students is the most common measure, and it has grave methodological flaws:
– Selection bias
– Confirmation bias
– Effort justification
– System justification
– Self-fulfilling prophecy effects and stereotype threat
 HSGPA is effectively the gold standard of assessment/measurement theory
– It triangulates capacity across assessment methods, content domains, evaluators, and time, eliminating most sources of systematic and random error

Powerful reasons for change: 2) It’s poorly assessing students
 Substantial evidence of systemic and severe underplacement: placing students in developmental education who could get a B or better in the transfer-level course
– Up to 36% of students placed into developmental English and 25% of students placed into developmental math
 Using multiple measures reduces error and has clear potential to increase success rates and sequence completion

Powerful reasons for change: 3) Transformational impacts for students
 Potential for dramatic improvement in rates of, and time to, completion of:
– Transfer-level course in discipline
– Subsequent courses in discipline
– Other early educational milestones

F2012 Promise Pathways vs. Fall year rates of achievement

Equity impact LBCC: F2011 Baseline Equity Gaps for 2-year rates of achievement

Equity impact LBCC: F year rates of achievement

… what about students for whom high school transcript data aren’t available/easy to get?

Just ask: self-reported HSGPA appears to be a reliable alternative
 College of the Canyons research (Gribbons, 2014)
– Self-report of last course and grade in the fall term was very accurate
– Errors that do occur are in part due to the timing of assessment
 University of California admissions
– Uses self-reported HSGPA but verifies after admission
– 2008: across 9 campuses, no campus had more than 5 discrepancies between reported grades and student transcripts
 Much of the ACT research uses self-reported GPA and finds it to be a more powerful predictor than students’ actual scores on the standardized tests
– ACT, 2013: r(1978) = .84

ACT, 2013: Accuracy of self-reported HSGPA
[Table comparing actual vs. student-reported HSGPA by actual-GPA band; columns: N, mean actual HSGPA, mean reported HSGPA, mean difference, % within 0.25, % within 0.50. Accuracy is high overall and declines at lower GPA levels; in the top band (3.50-4.00) the mean difference is about -0.04 with 87% of reports within 0.25 and 98% within 0.50 of the transcript value, while accuracy falls substantially in the lowest band. Remaining numeric values were not preserved in this transcript.]
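The ACT (2013) comparison above boils down to two statistics: the correlation between self-reported and transcript GPA, and the share of reports within a tolerance of the actual value. A hedged sketch of both computations on hypothetical data (not ACT's):

```python
# Hedged sketch: the two accuracy statistics behind the ACT self-report
# comparison -- Pearson correlation between reported and actual HSGPA,
# and the share of reports within a tolerance band. Data are hypothetical.
from math import sqrt

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def share_within(actual, reported, tol):
    """Fraction of reports within +/- tol of the actual value."""
    hits = sum(1 for a, r in zip(actual, reported) if abs(a - r) <= tol)
    return hits / len(actual)

# Hypothetical transcript vs. self-reported GPAs, with the pattern the
# research describes: slight over-reporting concentrated at the low end.
actual   = [3.9, 3.5, 3.1, 2.8, 2.4, 2.0, 1.7]
reported = [3.9, 3.6, 3.2, 3.0, 2.5, 2.4, 2.0]
print(round(pearson_r(actual, reported), 2))       # high correlation
print(round(share_within(actual, reported, 0.25), 2))
```

Even with the over-reporting at lower GPA levels, the rank order of students is largely preserved, which is why self-reported GPA retains its predictive utility for placement.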

… what about non-traditional students?

Multiple measures continue to have utility for delayed matriculants
 HSGPA continues to be predictively useful up to the point where we have data we can meaningfully connect
– A delay of 9-10 years

How long is High School GPA good for?

Westrick & Allen, 2014: ACT COMPASS validation, standardized logistic regression coefficients (Table 5)
[Table comparing standardized coefficients for the Compass score and HSGPA, separately for traditional and nontraditional students, across the same course pairings as Table 4 (English 1/Writing Skills, Arithmetic/Pre-Algebra, Algebra/Pre-Algebra, Intermediate Algebra/Algebra, College Algebra/Algebra). Numeric values were not preserved in this transcript.]