+ General Education Assessment Spring 2014 Quantitative Literacy

+ GE/ILO Assessment A total of 31 randomly selected courses participated in the assessment of Quantitative Literacy. Assessors were trained on the process and discussed the rubric. After scoring the student work, the assessors submitted the results to the college’s Institutional Research and Planning office and completed an online survey. Participating faculty earned $250 for the training and assessment. A workgroup of 14 full-time and part-time faculty and administrators met in June to discuss the findings.

+ Age The youngest age group’s mean scores were statistically significantly lower than those of the older age groups. Our youngest students did not perform as well as older students.
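
The slides do not say which statistical test produced these group comparisons. As an illustration only, the minimal Python sketch below shows one common way to test whether one group’s mean rubric score differs from the others’, using Welch’s t-test, which tolerates unequal group sizes and variances; the data file and the column names "age_group" and "mean_score" are assumptions, not the study’s actual data.

```python
# Minimal sketch, not the study's actual analysis.
# The file name and the columns "age_group" and "mean_score" are assumptions.
import pandas as pd
from scipy import stats

scores = pd.read_csv("ql_assessment_scores.csv")  # hypothetical scored-rubric data

youngest = scores.loc[scores["age_group"] == "youngest", "mean_score"]
older = scores.loc[scores["age_group"] != "youngest", "mean_score"]

# Welch's t-test does not assume equal variances or equal group sizes.
t_stat, p_value = stats.ttest_ind(youngest, older, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

With several age bands, a one-way ANOVA followed by pairwise comparisons would be the more typical choice; the two-group test above is only the simplest version of the idea.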

+ Gender No difference in performance by gender

+ Ethnicity White, Non-Hispanic mean scores were statistically significantly higher than those of the Hispanic and Asian groups.

+ Units Earned Students in the two highest unit groups (including those with 60+ units) performed better than all other groups. Possible interpretation: students who have completed more units perform better than students who have completed fewer. (There was some concern about one of the unit groups.)

+ Course Level Students enrolled in the AA courses that were assessed performed better than the Basic Skills group. Students enrolled in the Transfer courses that were assessed performed better than students enrolled in the AA and Basic Skills courses that were assessed. This does not necessarily mean that transfer-level students performed better than basic skills students, because basic skills students may be enrolled in the AA and transfer-level courses that were assessed. Caution: group sizes were uneven (Basic Skills N = 38, AA Degree N = 95, Transfer N = 500).
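
As a hypothetical illustration of how a by-course-level summary like this might be produced (the file and column names are assumptions, not the study’s data), a simple groupby keeps the uneven group sizes visible alongside the means:

```python
# Minimal sketch, assuming one row per assessed student with a course-level
# label and a mean rubric score; file and column names are hypothetical.
import pandas as pd

scores = pd.read_csv("ql_assessment_scores.csv")  # hypothetical scored-rubric data

summary = (scores.groupby("course_level")["mean_score"]
                 .agg(["count", "mean", "std"]))
print(summary)  # "count" exposes the uneven group sizes (e.g., 38 vs. 500)
```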

+ Overall Results More students (61%) were rated as proficient on the first factor, Numerical and Calculation/Computational Skills, than on the other factors. About 50% were rated as proficient on each of the remaining four factors: Representation, Analysis/Synthesis, Interpretation, and Communication. About half (47%) of the sample earned a mean score in the emerging-to-close-to-proficient range.
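
The slides do not state how the proficiency percentages were computed. A minimal sketch, assuming one 0-3 rating per factor per student and that the top rating counts as “proficient” (both assumptions), might look like this:

```python
# Hypothetical sketch: percent of students rated proficient on each rubric factor.
# The file, the column names, and the cut point (score of 3 = proficient) are assumptions.
import pandas as pd

factors = ["Numerical/Computational Skills", "Representation",
           "Analysis/Synthesis", "Interpretation", "Communication"]

ratings = pd.read_csv("ql_ratings.csv")  # hypothetical file: one column per factor

percent_proficient = (ratings[factors] >= 3).mean() * 100
print(percent_proficient.round(1))
```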

+ Overall Results There was no difference in performance by gender. The White, Non-Hispanic student group performed better on average than the Hispanic student group. The youngest students did not perform as well on average as older students. Units completed possibly impact performance. Students in the Transfer courses assessed performed better than students in the AA and Basic Skills courses assessed.

+ Feedback from Summer Workgroup Many of the workgroup members who also participated in the assessment had concerns about how the rubric was used (for example, with multiple-choice tests and problems). Most members thought that having only a 0-3 scale was a problem and suggested adding an exemplary category. Group members said our students are having problems with reading graphs, causation, and critical thinking.

+ Overall suggestions Put a group together to restructure the rubric this fall. Improve rubric training. Put together learning modules that would help students with some basic aspects of quantitative literacy; for example, a module might have a unit on reading and analyzing data from multiple representations, such as tables, bar graphs, circle graphs, and formulas. These could be delivered through Dashboard and/or Blackboard.

+ Process changes & suggestions New selection process: strata and quotas were used to make sure we had representation across the disciplines while still maintaining the 40% enrollment/section share in Math; therefore, more departments and programs were represented in the sample. It is still difficult to get faculty to participate. We need to utilize discipline-specific specialists to help with rubric training and norming.
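
A minimal sketch of the kind of stratified selection with a quota described above, assuming a hypothetical list of course sections, a target of 31 sections, and roughly 40% of the sample drawn from Math (all names and numbers below are assumptions, not the college’s actual procedure):

```python
# Hypothetical sketch of stratified course selection with a Math quota.
# File name, column names, and sample size are assumptions.
import pandas as pd

sections = pd.read_csv("course_sections.csv")  # hypothetical list of sections
total_to_sample = 31
math_quota = round(0.40 * total_to_sample)  # ~40% of the sample from Math

math_sample = sections[sections["discipline"] == "Math"].sample(
    n=math_quota, random_state=1)

other = sections[sections["discipline"] != "Math"]
frac_other = (total_to_sample - math_quota) / len(other)

# Spread the remaining slots across the other disciplines in proportion
# to how many sections each one offers.
other_sample = (other.groupby("discipline", group_keys=False)
                     .apply(lambda g: g.sample(frac=frac_other, random_state=1)))

selected = pd.concat([math_sample, other_sample])
print(selected["discipline"].value_counts())
```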