CHOOSING AN ASSESSMENT APPROACH
An Inventory of Available Options
Ursula Waln, Director of Student Learning Assessment
Central New Mexico Community College

START WITH A BASIC QUESTION
What do you want to know?

SOME BASIC ASSESSMENT QUESTIONS
1. Are our students meeting our standards?
2. How do our students compare to their peers?
3. How much are we impacting student learning?
4. Have our changes made a difference?
5. Are we maximizing student learning potential?

1. ARE OUR STUDENTS MEETING OUR STANDARDS?
This question calls for a standards-based assessment approach (a.k.a. competency-based or criterion-based assessment).
Examples of standards:
- Course student learning outcomes (SLOs)
- Program SLOs
- CNM General Education SLOs
- NMHED General Education Common Core Competencies
- Accreditation standards
- Professional standards
  - Licensure
  - Certification
  - Discipline-based association standards

STANDARDS-BASED ASSESSMENT
- Often driven by a need for accountability
- Evaluated based on established targets for achievement
  - May be internally derived: What is good enough?
  - May be externally referenced, defined, and/or mandated
- Typically summative
  - Done at the end of the unit, course, series, or program
  - May be internal (such as a final exam) or external (such as an evaluation done by a supervisor in a field placement)

STANDARDS-BASED ASSESSMENT
- May be an add-on assessment (such as a licensure exam or a program portfolio) or may be embedded within course assignments, quizzes, or tests
- Typically a direct assessment of student proficiency (such as a test) but may be indirect (such as an employer rating)
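
To make the "established target" idea concrete, here is a minimal sketch of summarizing standards-based results as the share of students meeting a standard. The scores, cutoff, and target are hypothetical placeholders, not CNM figures.

```python
# Minimal sketch: summarizing standards-based results against an internal target.
# Scores, cutoff, and target are hypothetical placeholders.

scores = [78, 85, 62, 91, 74, 88, 69, 95]  # e.g., final-exam or rubric scores
cutoff = 75    # "good enough" threshold, internally derived or externally mandated
target = 0.80  # target: 80% of students at or above the cutoff

proportion_meeting = sum(score >= cutoff for score in scores) / len(scores)

status = "target met" if proportion_meeting >= target else "below target"
print(f"{proportion_meeting:.0%} of students met the standard ({status})")
```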

2. HOW DO OUR STUDENTS COMPARE TO THEIR PEERS?
This question calls for benchmark comparisons.
Benchmark: something that can be used to judge the quality or level of other, similar things; a point of reference from which measurements may be made.
Examples of benchmarks:
- Published norms from standardized assessments
- National means and percentiles on standardized exams
- Statistically derived target response rates established by survey producers (based on national or comparison-group means, percentiles, effect size, etc.)
- Statistics from or about comparable or ‘aspirational’ institutions or programs

BENCHMARK COMPARISONS
- Often driven by a need for accountability
  - A ‘best practices’ expectation
  - Also supports institutional self-assessment and marketing interests
- Often limited by the availability of benchmarking data
- Usually evaluated based on where scores fall relative to those of the peer comparison groups
- Though not technically a benchmark, looking at whether the majority of institutions or programs have a similar characteristic can also inform peer comparisons

BENCHMARK COMPARISONS
- May be formative (such as testing or surveying incoming students) or summative (such as testing or surveying students who are graduating or otherwise moving to a next level)
- Typically external (administered outside of the curriculum)
  - E.g., comparing mean standardized test scores to national means
- Typically an add-on (not part of the instructional process)
- May be direct (as in tests) or indirect (as in surveys)
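
As a rough illustration of judging where local scores fall relative to a published norm, the sketch below expresses a cohort mean as a standardized distance from the benchmark. The scores, national mean, and standard deviation are hypothetical placeholders, not real norms.

```python
# Minimal sketch: locating a local cohort relative to a published benchmark.
# The scores, national mean, and national SD are hypothetical placeholders.
from statistics import mean

cohort_scores = [148, 152, 160, 145, 155, 150, 158, 149]  # local standardized-test scores
national_mean = 150.0  # published norm (hypothetical)
national_sd = 15.0     # published standard deviation (hypothetical)

cohort_mean = mean(cohort_scores)
z = (cohort_mean - national_mean) / national_sd  # standardized distance from the norm

print(f"Cohort mean {cohort_mean:.1f} vs. national mean {national_mean:.1f} "
      f"({z:+.2f} national SDs from the benchmark)")
```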

3. HOW MUCH ARE WE IMPACTING STUDENT LEARNING?
This question calls for a value-added assessment approach, which uses either longitudinal or cross-sectional comparisons.
- Longitudinal: measuring something at the beginning and again at the end of the instructional period (unit, course, series, or program)
  - Statistical analysis looks at case-by-case gains
- Cross-sectional: comparing advanced students’ outcomes to those of beginning students
  - Both groups can be assessed at the same time
  - Assumes the two groups come in with comparable characteristics and have equivalent experiences (which is rarely entirely true)
  - Statistical analysis compares the means of the two groups
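
A minimal sketch of the two comparisons just described, using hypothetical scores: the longitudinal version works from case-by-case gains for the same students, while the cross-sectional version compares the means of beginning and advanced groups assessed at the same time.

```python
# Minimal sketch of the two value-added comparisons; all scores are hypothetical.
from statistics import mean

# Longitudinal: the same students measured before and after instruction;
# the analysis looks at case-by-case gains.
pre_scores = [55, 60, 48, 72, 65]
post_scores = [70, 74, 63, 85, 78]
gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
print(f"Mean longitudinal gain: {mean(gains):.1f} points per student")

# Cross-sectional: beginning and advanced students assessed at the same time;
# the analysis compares the two group means (assumes comparable groups).
beginning_group = [52, 58, 61, 49, 66]
advanced_group = [71, 77, 69, 83, 75]
difference = mean(advanced_group) - mean(beginning_group)
print(f"Cross-sectional difference in group means: {difference:.1f} points")
```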

VALUE-ADDED ASSESSMENT
- Often driven by a need for accountability
- Sometimes considered a fair way of assessing educational impact irrespective of students’ prior academic preparation
  - Looks at achievement gains versus closing the achievement gap
- Can be useful in quantifying students’ responses to instruction
- Data interpretation can be complicated by lack of student motivation on pre-assessments and by incoming transfers
- Evaluated based on the size of the gains observed

VALUE-ADDED ASSESSMENT
- Involves, by definition, both formative and summative assessment, e.g.:
  - Pre- and post-tests using the same measurement/instrument
  - Surveys comparing responses of entering and exiting students
- May be internal or external
- May be an add-on or embedded assessment
- May be direct (as in a test or assignment) or indirect (as in a survey or interview)

4. HAVE OUR CHANGES MADE A DIFFERENCE?
This question calls for an investigative assessment approach.
- Follows a scientific model, with varying degrees of formality
- May range from evaluating minor instructional alterations at the course level to conducting full-fledged, publishable research
  - IRB approval may be necessary if the investigation is not purely operational in nature, i.e., is experimental and/or may be used for a thesis or any other purpose outside of CNM
- Often driven by faculty interest in improving student learning outcomes (and/or desire for professional advancement)

INVESTIGATIVE ASSESSMENT
- Evaluated based on observed differences between groups
  - Usually compares outcomes following a change to baseline outcomes obtained prior to the change
  - The measurement/instrument needs to be consistent
- Inferring a causal relationship between a change made and any difference in learning outcome may be supported if:
  - A statistically significant correlation exists
  - The difference in the outcome occurs only after the change is implemented
  - There are no confounding factors, such as introduction of other interventions, notable differences in student demographics, or environmental changes
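
For illustration, the sketch below compares post-change outcomes to baseline outcomes on the same instrument and reports the difference in means alongside a rough effect size. The scores are hypothetical placeholders; in practice a significance test and a check for confounding factors would accompany this comparison.

```python
# Minimal sketch: comparing outcomes after an instructional change to baseline
# outcomes from before the change, measured with the same instrument.
# Scores are hypothetical; a significance test would normally accompany this.
from statistics import mean, stdev

baseline = [68, 72, 65, 70, 74, 66, 71]      # scores before the change
after_change = [75, 78, 71, 80, 77, 73, 82]  # scores after the change

difference = mean(after_change) - mean(baseline)

# Pooled standard deviation and Cohen's d as a rough effect-size estimate
n1, n2 = len(baseline), len(after_change)
pooled_var = ((n1 - 1) * stdev(baseline) ** 2
              + (n2 - 1) * stdev(after_change) ** 2) / (n1 + n2 - 2)
cohens_d = difference / pooled_var ** 0.5

print(f"Mean difference: {difference:.1f} points (Cohen's d = {cohens_d:.2f})")
```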

INVESTIGATIVE ASSESSMENT
- Typically (but not always) summative
- May be internal (as in a writing assignment) or external (as in looking at next-level success)
- May be an add-on or embedded assessment
- May be direct (as in evaluations of student performance) or indirect (as in surveys, interviews, or next-level success rates)

5. ARE WE MAXIMIZING STUDENT LEARNING POTENTIAL?
This question calls for a process-oriented assessment approach.
- Examines the dynamics of student learning in relation to a desired outcome, analyzing inputs and context to determine under what conditions students learn best
- Often driven by faculty interest in improving student learning outcomes

PROCESS-ORIENTED ASSESSMENT
- Evaluated based on increases in proportions of students demonstrating learning gains
- Involves both formative and summative assessment
- Typically internal
- Typically embedded
- May be direct or indirect
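
As one way to picture "under what conditions students learn best," the sketch below tallies the proportion of students demonstrating a gain under two hypothetical instructional conditions. All records are placeholders invented for illustration.

```python
# Minimal sketch: proportion of students demonstrating a learning gain under
# different instructional conditions. All records are hypothetical placeholders.

# Each record: (demonstrated_gain, instructional_condition)
records = [
    (True, "group work"), (False, "group work"), (True, "group work"),
    (True, "lecture only"), (False, "lecture only"), (False, "lecture only"),
]

for condition in sorted({cond for _, cond in records}):
    outcomes = [gained for gained, cond in records if cond == condition]
    share = sum(outcomes) / len(outcomes)
    print(f"{condition}: {share:.0%} of students demonstrated a gain")
```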

GIVE IT YOUR OWN TWIST To Fit the Circumstance

WHAT’S YOUR ANGLE?
Rather than adopt an approach, adapt what’s useful to fit your needs:
- Building on the basic assessment question, what specifically do you want to find out?
- What options are available to you?
- Does the general approach associated with the basic question fit your interests?
- Will a somewhat different or more eclectic approach work better?
- Are you assessing at the program level or the course level?

PROGRAM-LEVEL RECAP
Basic Questions
1. Are our students meeting our standards?
2. How do our students compare to their peers?
3. How much are we impacting student learning?
4. Have our changes made a difference?
5. Are we maximizing student learning potential?
General Approaches
1. Standards-based assessment
2. Benchmark comparisons
3. Value-added assessment
4. Investigative assessment
5. Process-oriented assessment

BEYOND COURSE OUTCOMES
Program-level assessment can employ extra-curricular measures, such as:
- Surveys, interviews, and focus groups involving students, faculty, field supervisors, employers, and/or community members
- Pre- and post-assessments (tests or surveys)
- Benchmark comparisons on commercially developed surveys and/or standardized tests (or other types of data comparisons with peer programs)
- Analyses of factors associated with differences in outcomes (demographics, service utilization, enrollment and attendance patterns, etc.)
- Analysis of next-level success indicators (transfer, employment)

COURSE-LEVEL ASSESSMENT
Basic Questions Re-Phrased
1. Are the students in my class meeting the necessary standards?
2. How do the students in my class compare to their peers?
3. How much of what I teach are the students learning?
4. Have my changes made a difference?
5. Am I maximizing student learning potential?
Same General Approaches
1. Standards-based assessment
2. Benchmark comparisons
3. Value-added assessment
4. Investigative assessment
5. Process-oriented assessment

OUTCOME ALIGNMENTS
Base decisions about what to assess at the course level on alignments between course-level student learning outcomes and program-level student learning outcomes.
[Diagram: a Program SLO linked to an aligned SLO in Course 1, Course 2, and Course 3]
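
One lightweight way to keep such alignments explicit is a simple map from each program SLO to its aligned course SLOs. The sketch below mirrors the diagram above; the outcome wording and course labels are hypothetical placeholders.

```python
# Minimal sketch of an SLO alignment map: a program-level outcome listing the
# course-level outcomes aligned to it. Outcome wording and course labels are
# hypothetical placeholders.

alignment_map = {
    "Program SLO: Communicate effectively in writing": [
        ("Course 1", "Compose a focused, well-organized essay"),
        ("Course 2", "Integrate and cite sources appropriately"),
        ("Course 3", "Adapt tone and style to audience and purpose"),
    ],
}

for program_slo, aligned in alignment_map.items():
    print(program_slo)
    for course, course_slo in aligned:
        print(f"  {course}: {course_slo}")
```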

RELEVANCE
Base decisions about how to assess at the course level on what the instructor wants to know.
[Diagram: flow from the Program SLO through the following steps:]
1. Identifying a Course SLO that’s aligned to the Program SLO
2. The instructor deciding what he/she wants to know
3. The instructor planning the course-level assessment
4. The instructor conducting the course-level assessment
5. The instructor interpreting and applying the findings at the course level
6. The program faculty pooling and interpreting all related findings

TAKE IT UP A STEP Capture a More Complete Portrait through Multiple Assessments

BUILD A COLLAGE
- Student learning is dynamic, and assessments are like snapshots of student learning
- More than one of the basic assessment questions may apply
- A variety of snapshots viewed together gives a more comprehensive impression than a single snapshot
- Having multiple points of evidence taken from various perspectives over time supports better decisions about the usefulness of any individual bit of assessment data