Program Assessment Kelly Aune Office of the Vice Chancellor for Academic Affairs, UH-Mānoa.


Initial Questions
- What have we done to our students as a result of their exposure to our program?
  - Problem – You should know what you did to them, or at least what you intended to do to them.
- Did we do to our students what we intended to do to them?
  - Advantage – Requires learning outcomes.
  - Problem – An all-or-nothing approach.

Better Questions
- To what extent have we changed our students in the ways we intended to change them?
  - Advantage – Allows for range and variability of expected changes, i.e., more sensitive to change.
- To what extent have we changed our students on X, Y, and Z?
  - Advantage – Sensitive and more specific.

Obvious First Step
- If you have not done so yet, sit down as a faculty and articulate specific learning outcomes or objectives: the specific effects your program is designed to produce in students.
- Make certain your learning objectives lend themselves to measurement. "Be a better person" is tough to measure.

Nature of Effects
- Cognitive effects – changes in:
  - Knowledge & beliefs
  - Understandings
  - Skills and capabilities (e.g., calculate; introspect; evaluate; analyze; synthesize; ability to apply)
- Attitudinal effects – changes in:
  - Predispositions; preferences; opinions

Nature of Effects
- Behavioral effects – changes in:
  - Various written communication skills
  - Various oral communication skills
  - Social skills
  - Organizational skills

Assessment
- "Cheap" assessment – e.g., paper/pencil surveys and testing.
  - Advantages – low dollar cost; minimal labor requirements; fast & efficient.
  - Can be used for low-level cognitive data.
  - Disadvantages – often limited to student self-report and/or attitudinal data.

Assessment
- "Expensive" assessment
  - Disadvantages – can require significant investment to cover expenses such as incentives and labor costs.
  - Advantages – can provide data on higher-order cognitive effects and behavioral performance indicators.

Cheap Assessment
- Attitudinal surveys
  - Instructor evaluations
  - Course content evaluations
- Advantages – fast & easy.
- Disadvantages – tells us what students think/believe about a course or instructor, but not what the class has done for them.

Cheap Assessment
- Self-report surveys
  - Perceptions of capabilities
  - Perceptions of understandings
  - Perceptions of experiences – e.g., communication apprehension
- Advantages – fast & easy; perceptions of experiences can be very useful.
- Disadvantages – self-reports of one's capabilities and understandings are always suspect.

Better Cheap Assessment
- Knowledge-based assessment (a.k.a. the multiple-choice test).
  - Create a comprehensive "test" consisting of sample questions from all core or required courses.
  - Administer it to selected groups at selected times (a scoring sketch follows below).
- Advantages – fast & easy; actually provides an indication of learning.
- Disadvantages – captures only the most superficial aspect of learning, a narrow bandwidth of cognitive change.
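As a rough illustration of how such a knowledge-based test might be scored and summarized once administered, here is a minimal Python sketch; the answer key, student responses, and group labels are all hypothetical.

# Minimal sketch: score hypothetical multiple-choice responses against an
# answer key and summarize the results by group.
from statistics import mean, stdev

answer_key = ["B", "D", "A", "C", "B"]  # hypothetical key for a 5-item test

# Hypothetical responses: one list of answers per student, grouped by cohort.
responses = {
    "entering":   [["B", "D", "C", "C", "A"], ["A", "D", "A", "C", "B"]],
    "graduating": [["B", "D", "A", "C", "B"], ["B", "D", "A", "B", "B"]],
}

def score(answers):
    """Proportion of items answered correctly."""
    return sum(a == k for a, k in zip(answers, answer_key)) / len(answer_key)

for group, students in responses.items():
    scores = [score(s) for s in students]
    print(group, "mean =", round(mean(scores), 2), "sd =", round(stdev(scores), 2))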

Expensive Assessment
- Recording/assessment of behavior
  - Recording of a speech/presentation for subsequent analysis by coders.
  - Written responses to stimulus materials for subsequent analysis by coders.
  - Portfolio collection for subsequent analysis by judges/experts.
  - Recording of performance for subsequent assessment by judges/experts.
- Track your alumni.

Expensive Assessment
- Advantages – provides actual performance indicators.
- Disadvantages – time-intensive and labor-intensive; may incur costs for student incentives and for paying coders/judges; coders need to be trained to ensure the reliability and validity of assessments (an agreement check is sketched below).
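One common way to check whether trained coders are rating consistently is an inter-rater agreement statistic such as Cohen's kappa. The sketch below assumes two hypothetical coders have independently rated the same ten recorded presentations on a three-level rubric; the ratings and rubric labels are invented for illustration.

# Minimal sketch: inter-coder agreement (Cohen's kappa) for two hypothetical
# coders who rated the same ten recorded presentations on a three-level rubric.
from sklearn.metrics import cohen_kappa_score

coder_a = ["novice", "competent", "competent", "proficient", "novice",
           "competent", "proficient", "proficient", "competent", "novice"]
coder_b = ["novice", "competent", "novice", "proficient", "novice",
           "competent", "proficient", "competent", "competent", "novice"]

kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Cohen's kappa = {kappa:.2f}")  # values near 1.0 indicate strong agreement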

Design
- Single data set
  - Advantages – fast & easy; yields one set of numbers.
  - Disadvantages – what can that single set of numbers tell you? How do you interpret those numbers?
- Comparison data sets – Time 1 vs. Time 2, or a comparison group.
  - Disadvantages – more labor-intensive; analysis more complicated.
  - Advantages – allows comparison of data relative to the learning stimuli (a paired Time 1 vs. Time 2 analysis is sketched below).
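When the same students are measured at Time 1 and again at Time 2, a paired comparison is one straightforward way to analyze the design above. This is a minimal sketch with hypothetical scores, using SciPy's paired t-test; it illustrates the idea rather than prescribing the analysis.

# Minimal sketch: Time 1 vs. Time 2 comparison for the same (hypothetical)
# students, using a paired t-test to ask whether scores changed on average.
from statistics import mean
from scipy.stats import ttest_rel

time1 = [62, 70, 55, 68, 74, 60, 66, 71]  # hypothetical scores at program entry
time2 = [71, 78, 63, 70, 85, 72, 69, 80]  # the same students, after the program

result = ttest_rel(time2, time1)
print(f"mean change = {mean(time2) - mean(time1):.1f} points")
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.4f}")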

What To Do With Your Data
- Report descriptive statistics
  - Advantages – fast & easy; requires little expertise to compile.
  - Disadvantages – how do you argue for effectiveness?
- Report statistical comparisons
  - Advantages – allows for strong arguments concerning learning outcomes (see the sketch below).
  - Disadvantages – requires some expertise.
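A minimal sketch of both options, using hypothetical scores for program completers and a comparison group: descriptive statistics summarize each group, and an independent-samples t-test from SciPy supports the stronger comparative claim. All names and numbers are illustrative.

# Minimal sketch: descriptive statistics for each group, then a statistical
# comparison between hypothetical program completers and a comparison group.
from statistics import mean, stdev
from scipy.stats import ttest_ind

completers = [78, 84, 72, 90, 81, 76, 88, 79]  # hypothetical outcome scores
comparison = [70, 75, 68, 72, 77, 66, 74, 71]

for label, scores in [("completers", completers), ("comparison", comparison)]:
    print(f"{label}: n = {len(scores)}, mean = {mean(scores):.1f}, sd = {stdev(scores):.1f}")

result = ttest_ind(completers, comparison)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.4f}")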

Closing the Loop
- Interpreting the results
  - What do the data tell you about the effectiveness of your program?
- Altering your program
  - Adjust curriculum content, sequencing of courses, prerequisites, etc., in response to results.
- Test again!

Final Thoughts
- Forget WASC.
  - Yes, really truly forget WASC!
- Do this for yourselves. The only way to establish the "culture of evidence" that WASC is looking for is to find value in the assessment process. Do it thoughtfully, with scholarly curiosity, and with an expectation of value, and you will find this a rewarding process.