Assessing General Education: Options, Choices and Lessons Learned

Presentation transcript:

Assessing General Education: Options, Choices and Lessons Learned
Jo-Ellen Asbury, Ph.D., and Rebecca Kruse
Office of Institutional Research and Assessment, Stevenson University

Assessing General Education
We don't have all the answers; we invite audience input and insights.
We are not cheerleaders for national tests; using one was simply the decision we made at the time. (No, we get no kickback from ETS!)
We are not here to advocate use of the MAPP / ETS Proficiency Profile or any other specific test or assessment. We want to share our experience and generate a conversation.

SU Core Curriculum Requirements (Bachelor's Degree) (General "Cafeteria" Style)
Minimum of 16 academic courses in the liberal arts and sciences plus 1 course in physical education. All students must complete the following:
Skills courses: three writing courses; one communication course; one physical education course; a computer literacy requirement.
Distribution courses: one fine arts course; two social science courses; three math and science courses (at least one with a lab); four humanities courses.
Core electives: 2 courses (6 credits).
Foreign language (Bachelor of Arts only): 2 courses.

The problem: How to assess the General Education program
Unlike a major (psychology, math, etc.), general education does not have:
A firm, fairly prescribed list of requirements.
A faculty member (or group of faculty members) who takes sole responsibility for oversight.
A capstone project, paper, or experience that could be used to assess student learning outcomes.
In addition, student learning outcomes for gen ed were still evolving, and there is currently no centralized oversight.

Possible General Education Assessment Approaches
Individual course-based approach: information is collected about learning in individual courses, and faculty demonstrate that students are acquiring the knowledge, skills, and values associated with one or more gen ed goals. Evidence: assignments, exams, portfolios, etc.
Multicourse (theme-based) approach: focuses on faculty from a number of disciplines rather than on individual courses. Evidence: review of syllabi, focus groups.
Noncourse-based approach: campuswide, focusing on individual students or groups of students rather than on courses; a gen ed assessment is given to all students or to a sample. Evidence: standardized testing, student and alumni surveys, transcript analysis.
Source: Palomba, C. A., & Banta, T. W. (1999). Assessing general education. In Assessment essentials: Planning, implementing, improving (pp. 239-268). San Francisco: Jossey-Bass.

Selecting a Gen Ed Assessment Method
The method(s) used need to match the learning goals. Because gen ed programs include a broad range of goals and objectives (critical thinking, communication, values, attitudes, ...), we need to be careful that the methods chosen address all of them; more than one method may be needed.
We settled on some type of nationally normed instrument.

Use of Published Tests / Assessments
From the paper "The Role of Published Tests and Assessments in Higher Education" (Linda Suskie, MSCHE Vice President, March 2006).
Pros:
Developed by testing professionals (better test design and question quality).
Can provide comparison data.
Provide detailed, diagnostic feedback.
A variety of published tests exists, reflecting the diversity among schools and programs.
Confidence in longitudinal data.

Use of Published Tests / Assessments
A published test should reflect the distinct set of knowledge, skills, and competencies your institution seeks to instill, and should be used in combination with other evidence of student learning.

ETS Measure of Academic Proficiency & Progress (MAPP)
Examples of tested writing skills: discriminate between appropriate and inappropriate use of parallelism; recognize redundancy.
Examples of tested critical thinking skills: evaluate competing causal explanations; determine the relevance of information for evaluating an argument or conclusion.

ACT Collegiate Assessment of Academic Proficiency (CAAP)
Examples of tested writing skills: formulate an assertion about a given issue; organize and connect major ideas.
Examples of tested critical thinking skills: generalize and apply information beyond the immediate context; make appropriate comparisons.

Council for Aid to Education Collegiate Learning Assessment (CLA)
Examples of tested writing skills: support ideas with relevant reasons and examples; sustain a coherent discussion.
Examples of tested critical thinking skills: deal with inadequate, ambiguous, and/or conflicting information; spot deceptions and holes in the arguments made by others.

From "The Role of Published Tests and Assessments in Higher Education," Linda Suskie, Middle States Commission on Higher Education, March 25, 2006.

Use of Published Tests / Assessments
Cons:
If there is no compelling incentive, students may not give their best effort; it is a challenge both to get students to take the test and to get them to try their hardest.
Published tests for higher education have less evidence of quality than K-12 tests (smaller numbers of students, samples that may not be representative, less funding, etc.).
Certain published tests may not yield enough useful feedback.
From "The Role of Published Tests and Assessments in Higher Education," Linda Suskie, Middle States Commission on Higher Education, March 25, 2006.

Use of Published Tests / Assessments
The chosen assessment should:
Match the goals for student learning set by the institution; the specific content must correspond with the institution's concepts (how does the institution define critical thinking, for example?).
Provide rich, detailed feedback that can be used to identify areas for improvement.
Have evidence of validity and reliability.
Provide some incentive for students to do their best.
From "The Role of Published Tests and Assessments in Higher Education," Linda Suskie, Middle States Commission on Higher Education, March 25, 2006.

ETS Proficiency Profile / MAPP
Test selected: MAPP by ETS, the Measure of Academic Proficiency and Progress (now called the ETS Proficiency Profile).
Corresponds well with the university core and measures what we want to measure.
Several different formats to choose from (online, standard, abbreviated).
Can add up to 50 of our own supplemental questions.
Rich reporting features, including comparative data and diagnostic feedback, norm-referenced scores, and criterion-referenced scores.
SU has changed so rapidly, and is still changing, that it is important for us to be able to do comparisons, benchmark, and see differences between cohorts.

More on the ETS Proficiency Profile
The Measure of Academic Proficiency and Progress (now called the ETS Proficiency Profile):
Assesses four core skill areas (critical thinking, reading, writing, and mathematics) at three proficiency levels.
Measures academic skills developed in general education courses, as opposed to subject knowledge taught in them.

Results and Reporting
A multitude of reporting options is available:
Comparison between cohorts/subgroups: separate out specific groups (majors, schools within the university, commuters vs. noncommuters, etc.); different cohorts can be asked different supplemental questions (see the sketch below).
Identification of the specific proficiency level (1-3) of core skill deficiencies (ETS has specific definitions at each level).
External and internal benchmarking.
Value-added analysis: compare against other metrics such as GPA, SAT, etc.
Pattern identification (e.g., do students do better in certain areas if certain courses are taken in a certain order?).
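To make the cohort/subgroup comparison concrete, here is a minimal sketch of tallying exported scores by cohort. The file name and column names (profile_scores.csv, cohort, total_score) are hypothetical stand-ins for whatever an actual score export contains, not an ETS format.

```python
import csv
from collections import defaultdict

# Tally exported total scores by cohort so that subgroup means can be
# compared side by side (majors, commuter status, etc. work the same way).
scores = defaultdict(list)
with open("profile_scores.csv", newline="") as f:   # hypothetical export file
    for row in csv.DictReader(f):                   # assumed columns: cohort, total_score
        scores[row["cohort"]].append(float(row["total_score"]))

for cohort, values in sorted(scores.items()):
    print(f"{cohort}: n={len(values)}, mean={sum(values) / len(values):.2f}")
```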

Next Issue: Schedule for Administration

Pre-Post Test
Test students when they enter, then test again at a later point in their Stevenson career. WHEN should the second testing take place?
Internal validity threats: history, maturation, mortality, selection, testing.

Cohort-Sequential Design
Compensates for (most of) the internal validity threats.
Provides both between-subjects and within-subjects data.

How the cohort-sequential design is being used at SU
AY 2008-2009: Cohort #1 (entered Fall 2008) tested Fall 2008.
AY 2009-2010: Cohort #1 retested Spring 2010.
AY 2010-2011: Cohort #2 (entered Fall 2010) tested Fall 2010.
AY 2011-2012: Cohort #2 retested Spring 2012.

Next Issue: Planning for Administration

Our Plan
Administer the test to incoming freshmen.
Test the same students again at the end of their sophomore year.

Freshmen
How do we get a large number of freshmen to take the test?
Commitment from the Director of First Year Experience to administer it in First Year Seminars (all incoming freshmen take an FYS).
It goes on the syllabus.
Peer leaders (not us) administer it.

Freshmen – Issues/Challenges
Test version (long, abbreviated, online)? In 2007 we used the long version (2 hrs); we then switched to the abbreviated version (40 mins).
Cost (tests, materials).
Student leader instructions for administering: very specific instructions/script; customized instruction book.
Materials to and from student leaders: tests, pencils, instructions, ID cards, calculators.

Retesting as Sophomores
384 freshmen took the test in fall 2008. Where and how can we test that many students now, as sophomores?
Do we test all 384 at the same time, on the same day, in the same location? Do we have the room on campus? Do we have enough supplies to test everyone at once?
What's the best time during the semester? Who would proctor the tests?
How do we get sophomores to volunteer to take the test? There is no single class that all sophomores take, so there is no way to capture them all.

Recruiting Sophomores
Previously used scholarship hours.
Pizza lunch.
Gift card drawings.
Offered a choice of two different days.
Marketed through emails, plasma screens in the student union, and faculty.

Recruiting Sophomores
A week before, the response was still not great, so we added more gift cards and opened the test up to ALL sophomores, not just those who had taken it as freshmen.
46 students out of 384 signed up; 27 showed up, split between the two days.

Other Recruiting Ideas:
Gift certificates or payment for all students who take the test.
Change the test format: use the online format.
Reward those with high scores so the test is taken seriously and students do their best. (ETS reports that the most effective approach is a combination of extrinsic and academic rewards: something to get students there, and something to get them to take the test seriously.)
Put high scores on an honor roll.
Make the test a requirement for registration for junior year.
Withhold grades until the test is taken.

Latest Plan
Try the online, non-proctored version.
Recruit 100 random students from the 384 tested as freshmen in 2008 who didn't retake the test in the spring (see the sampling sketch below).
Give each one a $10 gift card to take it online.
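A minimal sketch of the random draw described above, assuming the fall 2008 roster (with spring retakers already removed) sits in a plain-text file of student IDs; the file name and seed are hypothetical.

```python
import random

# Draw a simple random sample of 100 invitees from the students tested as
# freshmen in fall 2008 who did not retake the test in the spring.
with open("fa08_eligible_ids.txt") as f:   # hypothetical roster file, one ID per line
    roster = [line.strip() for line in f if line.strip()]

random.seed(2011)                          # fixed seed makes the draw reproducible
invitees = random.sample(roster, 100)      # sampling without replacement
print("\n".join(invitees))
```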

Data Received / Closing the Loop / Going Forward

Cohort 1: Summary of Stevenson University Proficiency Classifications
Freshmen (FA08) vs. sophomores (SP10); national comparison in parentheses; — = not reported.

Reading Level 1: Proficient 56% (57%) vs. 67% (59%); Marginal 24% (22%) vs. 26% (23%); Not Proficient 20% (20%) vs. 7% (18%).
Reading Level 2: Proficient 26% (28%) vs. 44% (27%); Marginal 19% (19%) vs. 26% (20%); Not Proficient 56% (53%) vs. 30% (53%).
Reading Level 3 (Critical Thinking): Proficient 3% (4%) vs. 7% (3%); Marginal 9% (12%) vs. 7% (10%); Not Proficient 88% (84%) vs. 85% (86%).
Writing Level 1: Proficient 62% (61%) vs. 70% (59%); Marginal 26% (25%) vs. —; Not Proficient 12% (14%) vs. 4% (13%).
Writing Level 2: Proficient 15% (16%) vs. 19% (14%); Marginal 38% (35%) vs. 37% (34%); Not Proficient 48% (49%) vs. 44% (52%).
Writing Level 3: Proficient 6% (7%) vs. 4% (5%); Marginal 24% (25%) vs. 30% (24%); Not Proficient 70% (68%) vs. 67% (71%).
Math Level 1: Proficient 52% (50%) vs. 59% (45%); Marginal 22% (29%) vs. 22% (22%); Not Proficient — vs. 19% (26%).
Math Level 2: Proficient 22% (19%) vs. 26% (26%); Marginal 37% (26%) vs. —; Not Proficient 41% (54%) vs. —.
Math Level 3: Proficient 7% (6%) vs. 7% (4%); Marginal 14% (15%) vs. 11% (12%); Not Proficient 78% (79%) vs. 81% (84%).

Cohort 1: Distribution of Individual Student Scores and Subscores
SU mean scores for freshmen (FA08, n=380) and sophomores (SP10, n=27), with national comparisons in parentheses; — = not reported.

Total Score (possible range 400-500): 439.81 (441.1) vs. 443.41 (439.6); change +3.60 pts (+0.82%).
Skills subscores (possible range 100-130):
Critical Thinking: 110.23 (110.3) vs. 110.26 (110.0); change +0.03 pts (+0.03%).
Reading: 116.34 (117.1) vs. 118.89 (—); change +2.55 pts (+2.19%).
Writing: 113.75 (113.8) vs. 114.67 (113.5); change +0.92 pts (+0.81%).
Mathematics: 112.80 (113.0) vs. 112.81 (112.0); change +0.01 pts (+0.01%).
Context-based subscores (possible range 100-130):
Humanities: 113.59 (113.9) vs. 114.30 (—); change +0.71 pts (+0.63%).
Social Sciences: 112.15 (112.6) vs. 112.59 (112.5); change +0.44 pts (+0.39%).
Natural Sciences: 114.17 (114.0) vs. 115.93 (—); change +1.76 pts (+1.54%).
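The change columns above are simple arithmetic: points gained are the sophomore mean minus the freshman mean, and the percentage is that difference relative to the freshman mean. A quick check against the Total Score row:

```python
# Verify the Total Score change columns from the table above.
freshman_mean, sophomore_mean = 439.81, 443.41
points = sophomore_mean - freshman_mean        # sophomore minus freshman
percent = points / freshman_mean * 100         # change relative to freshman mean
print(f"{points:.2f} pts, {percent:.2f}%")     # -> 3.60 pts, 0.82%
```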

Closing the Loop / Going Forward
Determine the mechanism for internal decision-making and the process used for identifying deficiencies and implementing change.
Share results.
Use other measures of the same core skills.
Content mapping.

Other Ideas for...
Assessing general education?
Recruiting students?
Using data and closing the loop?
Other?

References
ETS. (2008). ETS Proficiency Profile case studies. Educational Testing Service. Retrieved from http://www.ets.org/proficiencyprofile/case_studies/
ETS. (n.d.). ETS Proficiency Profile content. Educational Testing Service. Retrieved from http://www.ets.org/proficiencyprofile/about/content/
Palomba, C. A., & Banta, T. W. (1999). Assessing general education. In Assessment essentials: Planning, implementing, improving (pp. 239-268). San Francisco: Jossey-Bass.
Suskie, L. (2006, March 25). The role of published tests and assessments in higher education. Middle States Commission on Higher Education. Retrieved from http://www.msche.org/publications/published-instruments-in-higher-education.pdf
Walvoord, B. E. (2004). For general education. In Assessment clear and simple: A practical guide for institutions, departments and general education (pp. 67-79). San Francisco: Jossey-Bass.