Overview of Types of Measures
Margaret Kasimatis, PhD, VP for Academic Planning & Effectiveness

Location in Assessment Cycle
Articulate mission/goals → Identify specific outcomes → Determine practices used to achieve outcomes → Gather evidence → Review & interpret results → Recommend actions

Measures Within the Context of Program Assessment
For program assessment, measurement tools should capture student learning that occurs as a result of the program curriculum
Many measurement tools can be used for multiple levels of assessment
–But how you use them differs
Some assessment tools are not appropriate for program-level assessment

Categories of Measures
Direct Measures
–Look at student work products or performances that demonstrate level of learning
Indirect Measures
–Capture students’ perceptions of their learning and the educational environment that supports learning

Categories of Measures
Direct Measures
–Published, standardized tests (e.g., GRE Subject Test, ETS Major Field Test)
–Locally developed tests
–Systematic evaluation of student work (papers, presentations, creative work, performances)
  May or may not be embedded within courses
  Usually involves scoring rubrics
Indirect Measures
–Published surveys
–Locally developed surveys and interviews
–Alumni surveys

Properties of Good Assessment Techniques
Reliable – internally consistent; consistent across raters
Valid – measures what it is supposed to measure; appropriate for the outcome
Actionable – results point reviewers toward challenges to address (and how to address them)
Efficient and cost-effective – in both time and money
Interesting and meaningful – people care about the results and are willing to act on them
Convergent – multiple lines of evidence point to the same conclusion
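
The two reliability notions above can be checked numerically. Below is a minimal sketch, assuming hypothetical rubric data, that computes Cronbach’s alpha for internal consistency and simple percent agreement between two raters; all names and numbers are illustrative, not from the original presentation.

```python
# Minimal reliability checks on hypothetical rubric data (illustrative only).

def variance(xs):
    """Sample variance (n - 1 in the denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """Internal consistency: items is a list of per-item score lists,
    one inner list per rubric item, aligned by student."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # total score per student
    return (k / (k - 1)) * (1 - sum(variance(i) for i in items) / variance(totals))

def percent_agreement(rater_a, rater_b):
    """Inter-rater consistency: share of papers given identical scores."""
    return sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)

# Five students scored on three rubric items (made-up numbers)
items = [[3, 4, 2, 5, 4],   # item 1 scores for students 1..5
         [3, 5, 2, 4, 4],   # item 2
         [2, 4, 3, 5, 3]]   # item 3
print(f"Cronbach's alpha: {cronbach_alpha(items):.2f}")

# Two raters scoring the same five papers
print(f"Rater agreement: {percent_agreement([3, 4, 2, 5, 4], [3, 4, 3, 5, 4]):.0%}")
```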

Evaluating Measures: Direct – Published Tests
Pros:
–Provide direct evidence of student mastery of content
–Some are designed specifically to assess major programs
–Generally highly reliable
–Validity established (within a specific context)
–Usually easy to implement and obtain results
–Norms/comparisons available
Cons:
–May be difficult to motivate students to perform at their best level
–May not align with program outcomes
–Often focus more on content knowledge than higher-order skills
–Can be expensive

Evaluating Measures: Direct – Locally Developed Tests
Pros:
–Provide direct evidence of student mastery of content or skills
–More flexible in terms of content and format; easier to align with program outcomes
–If embedded in courses, student motivation is higher
–Faculty are more likely to be interested in and use results
Cons:
–Likely to be less reliable
–Validity unknown
–Norms/comparisons not available
–Can take several iterations (and several years) to work out the “bugs”
–Scoring tests and tabulating results can be cumbersome

Evaluating Measures: Direct – Evaluation of Student Work
Pros:
–Provides direct evidence of student mastery of content or skills
–If embedded in a course, student motivation is higher
–Faculty are more likely to be interested in and use results
–Data collection is usually unobtrusive to students
Cons:
–Requires time to develop rubrics, conduct training, and implement
–Creating a flexible rubric that is appropriate for program-level assessment can be tricky
–Validity unknown; takes time to establish reliability
–Requires faculty trust that the program, not individual instructors, will be assessed
–Norms/comparisons not available
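
To make the program-versus-instructor point concrete, here is a hypothetical sketch (not from the original slides) of how rubric scores might be rolled up: results are reported per learning outcome across all sections, with no instructor-level breakdown. Student IDs, outcome names, and scores are invented for illustration.

```python
# Hypothetical sketch: aggregating rubric scores to the program level.
# Scores are summarized per learning outcome, never per instructor.
from collections import defaultdict
from statistics import mean

# (student_id, outcome, rubric score) triples from scored student work; made-up data
ratings = [
    ("s1", "written communication", 3), ("s1", "critical thinking", 2),
    ("s2", "written communication", 4), ("s2", "critical thinking", 3),
    ("s3", "written communication", 2), ("s3", "critical thinking", 4),
]

by_outcome = defaultdict(list)
for _student, outcome, score in ratings:
    by_outcome[outcome].append(score)

for outcome, scores in sorted(by_outcome.items()):
    print(f"{outcome}: mean {mean(scores):.2f} (n={len(scores)})")
```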

Evaluating Measures: Indirect – Published Surveys
Pros:
–Minimal effort to implement and tabulate results
–Can be administered to a large group of respondents
–Demonstrated reliability and validity
–Can address a variety of outcomes
–Norms/comparisons available
Cons:
–Provide indirect evidence of student learning
–May not be aligned with program outcomes
–Potential for biased results if the sample is not representative
–Can be expensive

Evaluating Measures: Indirect – Locally Developed Student Surveys & Interviews
Pros:
–Flexible in terms of content and format; easy to align with program outcomes
–Usually have face validity
–Can add open-ended questions that allow you to flesh out quantitative results (more actionable)
–Can be administered to a large group of respondents
–Can address a variety of outcomes
–Relatively easy to implement
Cons:
–Provide indirect evidence of student learning
–Validity depends on the quality of questions and response options
–Potential for biased results if the sample is not representative
–Can be time-consuming to construct, implement, and tabulate results
–Norms/comparisons not available
–Open-ended responses can be difficult and time-consuming to analyze
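
As a small illustration of the tabulation step, the sketch below (hypothetical question names and data, not from the original presentation) counts Likert responses per item and sets open-ended comments aside for later qualitative review.

```python
# Hypothetical tabulation of locally developed survey results.
from collections import Counter

responses = [  # (question, answer) pairs; made-up data
    ("Q1: The program improved my writing", "Agree"),
    ("Q1: The program improved my writing", "Strongly agree"),
    ("Q1: The program improved my writing", "Agree"),
    ("Q2: Comments", "More feedback on drafts would help."),
]

# Count closed-ended (Likert) answers; keep open-ended items separate
likert = Counter((q, a) for q, a in responses if not q.startswith("Q2"))
open_ended = [a for q, a in responses if q.startswith("Q2")]

for (q, a), n in sorted(likert.items()):
    print(f"{q} -> {a}: {n}")
print("Open-ended comments for qualitative review:", open_ended)
```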

Evaluating Measures: Indirect – Alumni Surveys
Pros:
–Same advantages as student surveys
–Can gather more “direct” evidence than current-student surveys (e.g., employment, enrollment in graduate programs)
–Can ask questions of alumni that are not appropriate for current students (e.g., the extent to which the program prepared them for their career)
Cons:
–Many of the same disadvantages as student surveys; it is particularly difficult to get a good response rate
–The timing can be tricky: alumni should be far enough out to see the program’s impact on their life and career, but not so far out that the program they experienced differs greatly from the current one

A Word on Embedded Assessment
Various types of measurement tools can be embedded within courses
Only carefully constructed measures, used in certain types of courses, are appropriate for program-level assessment
–Must go beyond individual course content
–Some should occur in upper-level courses that are taken only after several other courses in the major

QUESTIONS?