Current Movement Regarding Assessment for Placement


1 Current Movement Regarding Assessment for Placement
DIANNA CHIABOTTI, NAPA VALLEY COLLEGE
MARK WADE LIEU, CHANCELLOR'S OFFICE

2 Context for Student Success
Assessment as an instrument vs. assessment as a process that includes policies and practices
Variation of assessment instruments and processes across California
The limitations of validation studies
Chronic underfunding for matriculation services, and severe cuts recently

It is important to distinguish the assessment instrument from the assessment process. Each college is responsible for selecting its own assessment instruments and determining its own cutoff scores. Within established state law, each college is also responsible for its own assessment practices and policies. The result is wide variation across the state, but also some common ground, as we will report.

Many validation studies of specific instruments examine the appropriateness of student placement at various course levels, based on students' success in those courses and other factors. There are no studies that I'm aware of that link the use of a particular assessment to student success rates beyond that course. I think we'd all agree that curriculum and instruction, and other factors as well, may be more important for student success than the particular bank of questions asked at entry. However, there are studies that examine a host of other college practices and policies associated with the assessment and placement processes, including counseling and educational planning, that have been shown to be effective in supporting student success. Moreover, good assessment practice requires transparency to students about issues such as the test's purpose, its format and procedures, and methods for studying and preparing.

Yet funding for matriculation services in California is chronically short. As you know, matriculation services have never been fully funded in California. What this means is that matriculation officers, counselors, staff members, and others, including deans and faculty, have been working against the odds to support underprepared students. Over the past years, the CCC experienced substantial cuts, along with authority granted by the Legislature for the community colleges to have more flexibility in the use of state funds, relieving them from adhering to state regulations in the areas of assessment, placement, and counseling.

Part of the genesis of this work came out of the Task Force Report on Assessment done by community college stakeholders for the Board of Governors. One of the areas of interest for that report was the mandatory nature of assessment. Neither assessment nor placement is mandatory in California. The Task Force Report found that, of the students directed to assessment, approximately 9% were not assessed. Of the students directed to orientation, 25% did not receive it. Most disturbing, of the students directed to counseling, 56% did not receive it. Assessment without orientation and counseling does not provide students with the ability to prepare for assessments or to make an informed choice about their course-taking.

3 Two Recent Publications
One-Shot Deal? Students' Perceptions of Assessment and Course Placement in California's Community Colleges
WestEd (2010)
Andrea Venezia, Kathy Reeves Bracco, and Thad Nodine

Assessing Developmental Assessment in Community Colleges
Community College Research Center (CCRC) at Teachers College, Columbia University (February 2011)
Katherine L. Hughes and Judith Scott-Clayton

4 One Shot Deal? Study Purpose Methods and Data
Study of Students' Perceptions of Assessment and Course Placement in the CCC

A 2-year research study by WestEd of students' perceptions of assessment and course placement in the CCC.

Purpose:
Examine and describe policies and practices concerning assessment and placement across the CCC
Hear directly from CCC students about their experiences with assessment and placement
Compare program intent and purposes with student experiences

Methods and data:
Survey of matriculation officers statewide (N=73)
Student focus groups at 5 community colleges (28 focus groups, 257 students)
Community college counselor interviews (N=12)
Review of 5 community colleges' assessment and placement materials and websites

5 Assessing Developmental Assessment
National review of the role of assessment, evidence of its effectiveness, and potential alternatives
Former debate over how strictly to impose assessment and placement procedures has been supplanted by the view of mandatory assessment and placement as a best practice
Ongoing debate as to who should determine the process, the local institution or the state, with a national trend toward standardization

6 Findings: Assessment Instruments
Mathematics (CA): 49% use Accuplacer, 35% use MDTP, 13% use COMPASS, 7% use a locally developed test, 4% use self-assessment
ESL (CA): 48% use CELSA, 21% use COMPASS, 18% use Accuplacer, 16% use a locally developed test
English (CA): 49% use Accuplacer, 21% use CTEP, 17% use COMPASS, 20% use a locally developed test
Nationwide: 92% use an exam for placement, 62% use Accuplacer, 46% use COMPASS
(CA figures: One-Shot Deal | Nationwide figures: CCRC)

This shows some variation from the June 2007 survey conducted for the Consultation Council Assessment Task Force. Math: MDTP was nearly equal with, and slightly ahead of, Accuplacer. English: Accuplacer has increased in usage, and COMPASS moved ahead of CTEP. ESL: remains essentially the same. CCRC found that while Accuplacer and COMPASS were reasonably good predictors where grades are B or higher, validity is much weaker for C or higher.
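The CCRC point about predictive validity can be made concrete with a small, purely illustrative sketch. A local validation study typically compares the share of placed students who go on to meet a chosen success bar in the course. The Python below is not CCRC's or any college's actual methodology; the records, cut score, and success bars are all hypothetical.

# Purely illustrative sketch of a local predictive-validity check.
# The records, cut score, and success bars are hypothetical.

def share_meeting_bar(records, cut_score, success_gpa):
    """Share of students at or above the cut score whose course grade met the bar.

    records: iterable of (placement_score, course_grade_points) pairs for
    students who took the placement test and then completed the course.
    """
    placed = [grade for score, grade in records if score >= cut_score]
    if not placed:
        return None  # no students at or above this cut score
    return sum(1 for grade in placed if grade >= success_gpa) / len(placed)

# Hypothetical (test score, course grade on a 4-point scale) pairs:
records = [(85, 3.3), (72, 2.0), (90, 3.7), (68, 1.7), (78, 2.7), (81, 3.0)]

print(share_meeting_bar(records, cut_score=70, success_gpa=3.0))  # B or higher
print(share_meeting_bar(records, cut_score=70, success_gpa=2.0))  # C or higher

Comparing the two bars is one simple way a college might probe how much the definition of "success" changes the picture a given cut score paints.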

7 Findings: Policy & Practice Variation
Uneven transparency about the tests
Cut score variation
Uneven enforcement of rules
Inconsistent policies across colleges
Lack of clarity about retake policies
Confusion about "multiple measures"

Uneven transparency of messaging: the focus here is on students being unaware of the stakes. Students' words describe this best. One said, "I thought that it was one of those tests that you take just to see what kind of field they were going to recommend. And then I found out it places you in classes." Another said, "The woman at the test center said, 'It doesn't matter how you place. It's just to see where you are.' Looking back, that's not true. It's really important." Students in basic skills often reported that they were bouncing around and not getting out, as did their counselors. Counselors at one college said that they have a flier about practice questions, but only hand it out if students specifically request it.

Cut scores: variation can send mixed signals to high school students and counselors as to what qualifies as college readiness. Some of the students interviewed received different placements at two different colleges based on the same test scores.

Uneven enforcement of rules: students knew that some other students had exemptions and were confused. Why were some things mandatory for some people and not for others?

Inconsistent policies across colleges: some students receive scores immediately; some have to wait two weeks and then get locked out of classes that they want. Some students talked about going to another campus to retake the test and try to enroll, but some then got locked out at campus #2. The problems we're discussing are more challenging for students who attend more than one institution.

Lack of clarity about retake policies: in the survey, the required wait before a math retake started at 0 days; the average was 160 days and the median was 73. It's often just easier and less risky for students to accept their placement and not challenge it. If they challenge, they might end up losing a semester and getting placed at the same level as the first time they took the test. Retakes are also costly for colleges; one college we studied was considering dropping the retake option to save $5K per year.

Confusion about multiple measures: students and counselors were confused about what they are and why and how they are used. 65% of the colleges surveyed have questions embedded on their placement tests. We became concerned that some of those questions might pose test threat or anxiety concerns. As one student said, "I was surprised that, when you're starting the test, the test asks, 'How prepared are you?' And then it says, 'Really good,' 'Bad.' I put 'Really bad' because I was not at all prepared for the math and I did score very low." CCRC: many states rely on single test scores for placement. Researchers are demonstrating the need to factor in noncognitive measures, which are shown to have strong correlations to student success. These include self-appraisal, long-term planning, and a support network. They may also be useful for directing students to support services.

Student frustration and impact on aspirations: many students in basic skills and ESL reported feeling stuck and/or considering dropping out. Counselors often discussed their fears that their colleges' basic skills courses churn students around but do not help students succeed.

8 Findings: Common Ground
Counseling
High student-counselor ratios
Long wait lines and limited attention
More satisfaction with dedicated programs

High student-counselor ratios: there is little individual contact. 41% of matriculation officers reported that fewer than half of the students at their college had individual access to a counselor regarding assessment and placement. Students thought that contact with counselors was fragmented. As one student said, "They don't go over test results with you. They just give the score to you." Another said, "They should at least guide me through some classes I want to take. The counselor just asked me, 'What classes do you want? Sign this paper.' I was out of there in 10 minutes." And yet another said, "You have a question and the counselors just give you a website. You're like, 'Well then what are you here for?'" There needs to be some determination of what needs to be high touch and what can be low or no touch. Students, by and large, do not want to use technology for their counseling needs. They want to connect with a person, to be heard and to discuss their aspirations and needs.

Long wait lines and limited attention: students at one college we visited reported that an 8-hour wait is common if you go in with no appointment, and appointments are always scheduled at least two weeks in advance (the computer system does not allow appointments to be made earlier than two weeks out). At another college, students reported that they commonly made appointments the term prior, especially if they wanted to talk with a counselor about selecting courses. Most students who talked about this said that there is no counseling post-assessment.

There is some good news. More satisfaction with dedicated programs: with Puente, Umoja, athletics, and learning communities, students were much happier with their counseling experiences. They thought counseling was more personalized and responsive, easier to schedule, that the counselor "knows me," and that they were with a group of supportive and similar students. As one student said, "I used to have different counselors. Every single one was giving me different direction and I was so lost. I didn't understand so many things. Then I got into the Puente program. The counselor I have is the best and I now know my direction. I understand it and it's really good."

One of the ironies is that community colleges, which serve the most at-risk students of the higher education systems, have the worst student-counselor ratios due to budget constraints. The students most in need of comprehensive counseling are often least likely to get it. CCRC points to the use of noncognitive assessment to direct students to college services, such as educational plans, but these are also impacted by the lack of availability of counseling services.

9 Findings: Common Ground
Student success courses
Summer prep courses
Counselors, faculty, and matriculation officers working against the odds with large numbers of students in need

To end on a positive note: 97% of colleges surveyed said they offered student success courses to provide students with college information and teach them study skills and habits of mind associated with college success. 81% reported offering summer preparatory courses for incoming students. According to matriculation officers, an average of about 22 percent of incoming students participate in such programs. Research indicates that these kinds of courses do have positive effects on students' chances of earning a credential.

Note also the dedication of counselors, faculty, and matriculation officers. The budget crisis creates challenges, but it also makes it more important to ensure that community college processes are easy for students to navigate.

10 Two Additional Recent Publications
MAPPING THE TERRAIN: Language Testing and Placement for US-Educated Language Minority Students in California's Community Colleges
University of California, Santa Cruz, Education Department (January 2011)
George Bunch, Ann Endris, Dora Panayotova, Michelle Romero, and Lorena Llosa

WHAT'S IN A TEST? ESL and English Placement Tests in California's Community Colleges and Implications for US-Educated Language Minority Students
George Bunch and Lorena Llosa

11 The English/ESL Issue
Should the student take the English placement test or the ESL placement test?
Information | Curriculum | Multiple Measures
George Bunch et al.

The focus on whether students should take the English or the ESL test may be the wrong question. The answer may more appropriately rest in the curriculum that addresses students' needs. (This was echoed in a discussion I had with the System Matriculation Advisory Committee earlier in the month.) Bunch also emphasizes that developmental/ESL curriculum must be "part of the larger context of engagement in language and literacy for authentic academic, social, and professional purposes." Breaking up language development into small chunks goes against this necessity (contextualization, integration, communities of learning).

For the time being, Bunch suggests that US-educated language minority (US-LM) students need more information about the advantages and DISADVANTAGES of each test. Whether or not the student sees the "ESL" label as a stigma, providing more information about what the different sequences can provide and what the consequences of choosing one over the other are is crucial. At one college surveyed, there were NINE levels of ESL before Freshman Composition, while the English sequence had only TWO pre-collegiate courses. There are almost always more ESL levels than English levels (p. 25). (In some ways, this connects with the issue of messaging to high school students being done with the EAP; however, it goes beyond the current focus, which concentrates on why students want to be prepared to take the EAP.) There is also a perception that ESL courses are only for immigrants or international students.

Bunch also cites statistics showing that students do much better when they are retested. This supports input we have gotten from the field regarding the need for preparatory test questions and review modules with assessment tests (CCCAssess). The use of K-12 data might also help here, something that is currently not very accessible to community colleges. This includes California English Language Development Test (CELDT) scores and language proficiency designations (5 English Language Development levels + Fluent).

12 Common Core
Common Core State Standards adopted by 41 states
Federal funding to develop K12 assessments aligned with Common Core
Implications for community colleges

The standards stem from the work of the National Governors Association. There are two assessment consortia: PARCC (Partnership for Assessment of Readiness for College and Careers), which is summative, and the Smarter Balanced Assessment Consortium, which is summative/formative. CA has signed on with both for the time being.

Implications: the K12 assessment will supplant the EAP (in four years) and will provide more information, but it is still not a placement instrument. With the strengthening of the standards over current CA K12 standards, there is a need to investigate alignment with higher education standards (e.g., the ICAS competency standards for mathematics and academic literacy). CCCs objected to "dumbing down" to meet current K12 standards; these standards are rising to meet us. How do we respond and work with K12 to coordinate and inform, including on assessment?

A major decision facing the state is whether to become a "governing" state, which means aligning with only one of the consortia and contributing some money (which is in short supply); however, it also means that the higher education segment has input into the development of the assessment.

13 CCCAssess
What is CCCAssess | Feasibility Study | Next Steps | Budget | ESL/English Issue | Common Core State Standards

What is CCCAssess: a Centralized Assessment Delivery Project.
Provide and deliver testing instruments centrally
Let the community college system select the instruments
Collect and store scores centrally
Web portal access to a statewide data warehouse of scores (a hypothetical sketch of such a record follows after these notes)
Leverage the purchasing power of the system (buy as a group, not as individuals)
Don't mandate; provide a great service (participation is optional)
Centrally funded

Feasibility study: Gates and Hewlett grants funded the feasibility study. What went into the feasibility study (submitted to the Legislature on March 28): faculty/matriculation input into desired features of a statewide assessment test; faculty/matriculation review of Request for Interest applications, with follow-up questions; faculty/matriculation vendor interviews to learn what options and features are currently available; and revision of the list of requirements and cost estimates.

Future phases: a Cut Score Optimization Tool (using other colleges' data to help refine local cut scores) and a Multiple Measures Optimization Tool (ditto).

Next steps: persuading the Legislature to fund the project; deciding how to engage in the ESL/English issue; and deciding how to engage in the work on Common Core, even though this project is separate from that work, including messaging regarding preparation, use of Common Core like the EAP, and use of Common Core assessment results as a multiple measure stored in the data warehouse.
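To make the "collect and store scores centrally" idea concrete, here is a minimal, purely hypothetical Python sketch of the kind of record such a statewide warehouse might hold and the kind of lookup a college portal might perform. None of this reflects the actual CCCAssess design; every field name and function below is an assumption for illustration only.

# Hypothetical sketch only; not the actual CCCAssess schema or API.
from dataclasses import dataclass, field
from datetime import date
from typing import Dict, List

@dataclass
class PlacementScoreRecord:
    student_id: str          # assumed statewide student identifier
    college_code: str        # college where the student tested
    instrument: str          # e.g. "Accuplacer", "MDTP", "CELSA" (college-selected)
    subject: str             # "math", "english", or "esl"
    score: float             # score as reported by the instrument
    test_date: date
    multiple_measures: Dict[str, str] = field(default_factory=dict)  # embedded responses

def scores_for_student(warehouse: List[PlacementScoreRecord],
                       student_id: str) -> List[PlacementScoreRecord]:
    """A portal-style lookup: return all of a student's stored scores so a
    second college could reuse them rather than retest."""
    return [r for r in warehouse if r.student_id == student_id]

The sketch simply mirrors the design choice on the slide: delivery and storage are centralized while instrument selection and cut scores stay local, so the warehouse only needs to record which instrument produced a score rather than impose a common test.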

14 A Big Question
What are the implications of moving to a common set of assessment tests for:
Local matriculation policies and practices?
Curriculum?

There is a lot of legislative pressure to move to some definition of a "common assessment." The WestEd SLO study suggests that Freshman Composition is pretty consistent, and the work of C-ID, LDTP, and SB1440 shows that commonalities in curriculum are not hard to find.

Common goal rather than common curriculum: aim for the same ends (e.g., the same cut score for "college-level" classes, a consistent readiness signal), but give colleges the flexibility to decide how best to get students to that goal. This is very much what is being discussed under SB1143 (student success metrics).

The faculty (i.e., the ASCCC) needs to tackle the more difficult question of the curriculum that students enter after taking the assessment; by itself, the assessment does very little. Curricular issues include the ESL/English question; whether diagnostics can be used to target instruction; review of lengthy sequences that present a challenge for completion; and interventions such as learning communities and contextualization.

