What “Counts” as Evidence of Student Learning in Program Assessment?
CCUA Assessment Workshop, April 5, 2007
Sarah Zappe, Research Assistant, Testing and Assessment Specialist
Schreyer Institute for Teaching Excellence
Workshop Goals
To provide information and guidance on the processes of:
- Identifying sources of evidence of student learning
- Mapping evidence to program outcomes
- Developing reports for stakeholders
Definition of Assessment
“Assessment is an ongoing process aimed at understanding and improving student learning. It involves making expectations explicit and public; setting appropriate criteria and high standards for learning quality; systematically gathering, analyzing, and interpreting evidence to determine how well performance matches those expectations and standards, and using the resulting information to document, explain, and improve performance.”
Angelo, T. A. (1995)
Assessment Loop
[Diagram of the assessment loop, starting from goals and outcomes]
Maki (2001)
Student Learning Outcomes
- Measurable, specific statements of what we want students to know, feel, or be able to do after completing the program
- Cover knowledge, skills, and attitudes
- Drive all other stages of assessment
University Guidelines for the Internal Review of Academic Programs
- Background, purpose, and goals
- Specify evaluation areas
- Data collection plan
- Data collection and analysis
- Recommendations
Do we already have data that provide evidence of student learning?
Probably. Let’s see…
- Direct evidence of student learning: measures of student performance that demonstrate actual learning. What did students learn, and what did they NOT learn?
- Indirect evidence of student learning: measures of perception or demographic indicators that imply learning has occurred
Direct Measures of Student Learning
- Capstone projects, senior theses, exhibits
- Portfolios
- Standardized tests
- Concept inventories
- Employer/internship ratings of students’ performance
Middle States Commission (2003)
Limitations of Direct Evidence
- Provides no evidence of why students have or have not learned
- Does not indicate “value added”: did students already have the knowledge or skills before entering the program?
Indirect Measures of Student Learning
- Focus groups/interviews
- Employer surveys
- Alumni surveys
- Registration/course enrollment information
- Department or program review data
- Job placement indicators
- Graduate school placement rates
- Comparisons with other institutions
Middle States Commission (2003)
Limitations of Indirect Evidence
- Does not evaluate student learning per se
- Should not be the only means of assessing outcomes
Does all evidence need to be quantitative?
No. In fact, good practice in assessment suggests collecting multiple types of information:
- Both direct and indirect
- Both qualitative and quantitative
Quantitative Evidence
Data represented numerically
Examples:
- Scores on tests
- Survey rating scales
Advantages:
- Ease of collection and analysis
- Ease of making calculations and comparisons (across time or between groups), as in the sketch below
- Generalizability
Limitations:
- Often does not answer the question of “why”
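A minimal sketch of the kind of between-group comparison that numeric evidence makes easy; the cohort labels and scores below are invented for illustration, not workshop data.

```python
from statistics import mean, stdev

# Hypothetical exam scores for two cohorts; the numbers are invented
# purely to show how directly quantitative evidence can be compared.
cohort_2006 = [72, 81, 68, 90, 77, 85, 74]
cohort_2007 = [79, 88, 73, 92, 81, 86, 78]

for label, scores in (("2006", cohort_2006), ("2007", cohort_2007)):
    print(f"Cohort {label}: mean = {mean(scores):.1f}, sd = {stdev(scores):.1f}")

# A simple between-group comparison: the change in mean score.
print(f"Change in mean score: {mean(cohort_2007) - mean(cohort_2006):+.1f}")
```

Note that the comparison is immediate, but, as the slide cautions, it says nothing about why the scores differ.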
Qualitative Evidence
Data represented in narrative or prose format
Examples:
- Interviews
- Focus groups
- Open-ended questions on surveys
Advantages:
- Provides very “rich” information
Limitations:
- More difficult to analyze and to make direct comparisons
- Not generalizable
- Methods of ensuring reliability are difficult and time-consuming
Brainstorm Activity
- Brainstorm existing types of evidence for your program: direct evidence and indirect evidence
- What is missing but should be collected?
- Discuss these with your table
Isn’t sampling somehow cheating?
No, but…
- The sample should be representative of the population (here, the students in your program)
- The sample should embody important characteristics of the population; a stratified random sample does this by drawing proportionally from each subgroup (see the sketch below)
- Avoid convenience or accidental sampling
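A minimal sketch of a stratified random draw, assuming a hypothetical roster stratified by class year; the strata, roster size, and 10% fraction are illustrative assumptions, not workshop recommendations.

```python
import random

random.seed(2007)  # fixed seed so the example draw is reproducible

# Hypothetical roster: each student is (id, class_year); the strata
# and the 10% fraction below are illustrative assumptions.
roster = [(f"S{i:03d}", random.choice(["sophomore", "junior", "senior"]))
          for i in range(300)]

def stratified_sample(students, fraction):
    """Draw the same fraction from every class-year stratum so the
    sample mirrors the population's composition."""
    strata = {}
    for sid, year in students:
        strata.setdefault(year, []).append(sid)
    picked = []
    for ids in strata.values():
        k = max(1, round(len(ids) * fraction))
        picked.extend(random.sample(ids, k))
    return picked

sample = stratified_sample(roster, fraction=0.10)
print(len(sample), "students drawn, in proportion to each class year")
```

Because each stratum is sampled at the same rate, no class year is over- or under-represented, which is exactly what convenience sampling fails to guarantee.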
Do Grades Count as Evidence?
Yes! But…
- Only if they are linked to learning goals: a score or grade alone does not express the content of what students have learned
- You need to define what each score means
- Match course assessments to outcomes through syllabi and test blueprints (one way to picture a blueprint follows)
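One way to picture a test blueprint is as an explicit map from each program outcome to the exam items that measure it; the outcome names and item numbers below are hypothetical.

```python
# Hypothetical test blueprint: each program outcome mapped to the exam
# items (question numbers) intended to measure it. All labels invented.
blueprint = {
    "Outcome 1: apply core concepts":     [1, 2, 5, 9],
    "Outcome 2: analyze data":            [3, 6, 10],
    "Outcome 3: communicate conclusions": [4, 7, 8],
}

TOTAL_ITEMS = 10  # items on this hypothetical exam
for outcome, items in blueprint.items():
    print(f"{outcome}: items {items} ({len(items) / TOTAL_ITEMS:.0%} of the test)")
```

With a map like this, a score stops being an opaque number: every point a student earns can be traced back to a specific outcome.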
Do Grades Count? (cont.)
“If the grades of individual students can be traced directly to their respective competencies in a course, the learning achievements of those students are being assessed in a meaningful fashion.”
Middle States Commission (2003)
Embedded Course Assessment
Questions or problems relevant to outcomes are embedded within regular course assessments
Examples:
- Specific course projects
- Capstone projects
- Tests with blueprints matched to outcomes
Advantages:
- Requires no extra time from students or faculty
- Student motivation is greater
- Provides both formative and summative data
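A sketch of how embedded item scores might be rolled up by outcome, continuing the hypothetical blueprint above; the student IDs, item-to-outcome links, and scores are all invented.

```python
# Hypothetical embedded-assessment results: each exam item is linked to
# one outcome, and each student has item scores (1 = correct, 0 = not).
item_to_outcome = {1: "O1", 2: "O1", 3: "O2", 4: "O3", 5: "O1",
                   6: "O2", 7: "O3", 8: "O3", 9: "O1", 10: "O2"}
scores = {
    "S001": {1: 1, 2: 1, 3: 0, 4: 1, 5: 1, 6: 0, 7: 1, 8: 1, 9: 0, 10: 1},
    "S002": {1: 0, 2: 1, 3: 1, 4: 0, 5: 1, 6: 1, 7: 0, 8: 1, 9: 1, 10: 1},
}

# Roll item scores up to outcome-level evidence across all students.
totals, counts = {}, {}
for item_scores in scores.values():
    for item, score in item_scores.items():
        outcome = item_to_outcome[item]
        totals[outcome] = totals.get(outcome, 0) + score
        counts[outcome] = counts.get(outcome, 0) + 1

for outcome in sorted(totals):
    print(f"{outcome}: {totals[outcome] / counts[outcome]:.0%} correct")
```

The same graded work thus serves double duty: individual grades for students, and aggregated outcome-level evidence for the program.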
Linking Outcomes
Course assessment → program assessment → institutional assessment
Bakersfield College (2006)
Activity: Aligning Courses to Program Outcomes
Using the matrix provided, identify sources of evidence and match them to your outcomes:
- Evidence embedded in courses
- Other evidence
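As a companion to the alignment matrix, here is a minimal sketch that flags outcomes with no mapped evidence; the course names, outcome labels, and mappings are hypothetical placeholders.

```python
# Hypothetical curriculum map for the alignment matrix: which courses
# supply evidence for which outcomes. All names are placeholders.
curriculum_map = {
    "COURSE 101":            ["O1"],
    "COURSE 220":            ["O1", "O2"],
    "COURSE 495 (capstone)": ["O1", "O2", "O3"],
}
all_outcomes = {"O1", "O2", "O3", "O4"}

covered = {o for mapped in curriculum_map.values() for o in mapped}
missing = sorted(all_outcomes - covered)
print("Outcomes lacking mapped evidence:", missing or "none")
```

Gaps surfaced this way (here, the unmapped "O4") point to evidence that is missing but should be collected.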
How should we decide what to present in our report?
- Consider the stakeholders: external stakeholders and the internal audience
- Consider preparing both a short form and a long form
- Get feedback
- Sample assessment report
Where can we get help if we need it?
- Schreyer Institute for Teaching Excellence: http://www.schreyerinstitute.psu.edu
- Office of Institutional Planning and Assessment: http://www.psu.edu/president/pia/index.htm
7 Common Misperceptions about Assessment
1. We’re doing just fine without it.
2. We’re already doing it.
3. We’re far too busy to do it.
4. The most important things can’t be measured.
5. We’d need more staff and money.
6. They’ll use the results against us.
7. No one will care about or use what we find.
Angel (2005)
Mini-Evaluation of Session
Please complete the mini-evaluation form provided so that we can work on improving OUR efforts!
Thank you for your time!