Refining Your Assessment Plan: Session 4
Use of the Results
Ryan Smith and Derek Herrmann
University Assessment Services

INTRODUCTIONS

Introductions
– Ryan Smith, Director, UAS
– Derek Herrmann, Coordinator, UAS
– Assessment Advisory Council (AAC)
– Assessment Academy Team

Introductions
– Who you are
– In which Department/School you work
– Why you are here

OVERVIEW OF PRAAP

Overview of PRAAP
– Process for the Review of Academic Assessment Plans
– Two years before Program Review and one year before the Self-Study
– AAC members and UAS staff
– Use an established rubric

Overview of PRAAP
The rubric rates each of four elements at one of four levels: Undeveloped, Developing, Established, or Exemplary.
Elements:
– Goals and Learning Outcomes
– Direct Evidence of Student Learning
– Indirect Evidence of Student Learning
– Use of the Results

USE OF THE RESULTS

Use of the Results
Who is involved?
– Chairperson/Director (or Assistant/Associate)
– Program Coordinator
– Faculty (as individuals and on committees)

Use of the Results
Where is it done?
– Department/School meetings
– Retreats (general or assessment)
– Informal conversations

Use of the Results
Examining the data: rubrics
– A rubric is a scoring guide: a list or chart that describes the criteria used to evaluate (or grade) completed student assignments
– Using a rubric to evaluate (or grade) assignments makes your life easier and improves student learning

Use of the Results
Holistic rubric: short narrative descriptions of the characteristics of different levels of work
– Level 4: Description of the performance at this level
– Level 3: Description of the performance at this level
– Level 2: Description of the performance at this level
– Level 1: Description of the performance at this level

Use of the Results
Checklist rubric: a simple list indicating the presence of the things you’re looking for in an assignment

Criteria    | Present
Criterion 1 |
Criterion 2 |
Criterion 3 |
Criterion 4 |
Criterion 5 |

Use of the Results
Rating scale rubric: a checklist with a rating scale added to show the degree to which the things you’re looking for are present in an assignment

Criteria    | Not Present | Developing | Established | Advanced
Criterion 1 |             |            |             |
Criterion 2 |             |            |             |
Criterion 3 |             |            |             |
Criterion 4 |             |            |             |
Criterion 5 |             |            |             |

Use of the Results
Descriptive rubric: replaces the checkboxes of rating scale rubrics with brief descriptions of the performances that merit each possible rating
– Same grid as the rating scale rubric (Criterion 1 through Criterion 5 by Not Present / Developing / Established / Advanced), but each cell contains a description of the performance that merits that rating
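To make this concrete, here is a minimal sketch (our addition, not from the original slides) of a descriptive rubric represented as a Python data structure; the criterion name, rating labels, and cell descriptions are all hypothetical placeholders.

```python
# A minimal sketch of a descriptive rubric; all names and text are hypothetical.
RATINGS = ["Not Present", "Developing", "Established", "Advanced"]

rubric = {
    "Thesis": {
        "Not Present": "No identifiable thesis.",
        "Developing": "Thesis is present but vague or unfocused.",
        "Established": "Thesis is clear and focused.",
        "Advanced": "Thesis is clear, focused, and original.",
    },
    # ...one entry per criterion
}

def score(ratings_by_criterion: dict) -> float:
    """Convert each criterion's rating label to points (0-3) and average them."""
    points = [RATINGS.index(r) for r in ratings_by_criterion.values()]
    return sum(points) / len(points)

print(score({"Thesis": "Established"}))  # 2.0
```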

Use of the Results
Examining the data: content analysis
– Involves making sense of the material being reviewed
– Summarize common themes and the extent of consensus concerning those themes
– Should be approached with an open mind and a willingness to “hear” what respondents are saying

Use of the Results
Examining the data: content analysis (continued)
– Should begin with a clear understanding of the goals associated with the review
– Sometimes coding categories are predetermined; sometimes they emerge as the materials are reviewed
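As a rough illustration of predetermined coding categories (our sketch, not an example from the workshop), the code below tallies how often hypothetical themes appear in open-ended responses; genuine content analysis still relies on human judgment, but a tally like this can summarize the extent of each theme.

```python
from collections import Counter

# Hypothetical predetermined coding categories and example keywords.
CATEGORIES = {
    "advising": ["advisor", "advising", "mentoring"],
    "curriculum": ["course", "curriculum", "class"],
    "facilities": ["lab", "library", "building"],
}

def code_responses(responses):
    """Count how many responses touch on each coding category."""
    counts = Counter()
    for text in responses:
        lowered = text.lower()
        for category, keywords in CATEGORIES.items():
            if any(word in lowered for word in keywords):
                counts[category] += 1
    return counts

print(code_responses(["My advisor was great.", "More lab time, please."]))
# Counter({'advising': 1, 'facilities': 1})
```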

Use of the Results
Inter-rater reliability
– Indicates the extent of agreement among different reviewers
– Generally want to verify that scores based on subjective judgments are reliable (consistent)

Use of the Results
Inter-rater reliability: two approaches
– Have at least two people independently review the data, compare notes, and come to agreement
– Have reviewers work together on all decisions, collaborating as they go to resolve discrepancies
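One standard way to quantify inter-rater agreement (our addition, not part of the original workshop) is simple percent agreement, optionally corrected for chance with Cohen's kappa; the rating labels and data below are hypothetical.

```python
from collections import Counter

def percent_agreement(r1, r2):
    """Proportion of items on which two raters gave the same rating."""
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def cohens_kappa(r1, r2):
    """Chance-corrected agreement: kappa = (p_o - p_e) / (1 - p_e)."""
    n = len(r1)
    p_o = percent_agreement(r1, r2)
    c1, c2 = Counter(r1), Counter(r2)
    p_e = sum((c1[c] / n) * (c2[c] / n) for c in set(r1) | set(r2))
    return (p_o - p_e) / (1 - p_e)

rater1 = ["Established", "Developing", "Advanced", "Established"]
rater2 = ["Established", "Developing", "Established", "Established"]
print(percent_agreement(rater1, rater2))  # 0.75
print(cohens_kappa(rater1, rater2))       # agreement beyond chance
```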

Use of the Results
Establishing standards
– Programs generally set their own standards
– Standards should take the campus and program missions into account

Use of the Results
Establishing standards
– Sometimes faculty want to know how their students compare to students on other campuses
– They can set expectations based on the relative performance of their students

Use of the Results
Summarizing the results
– Frequency table: a visual depiction of data that shows how often each value occurred

Use of the Results
Item: What is your present attitude toward your degree program?

Response option   | Frequency | Percent
Strongly negative |     0     |   0.0
Negative          |     1     |   5.0
Somewhat negative |     1     |   5.0
Somewhat positive |     3     |  15.0
Positive          |     7     |  35.0
Strongly positive |     8     |  40.0
TOTAL             |    20     | 100.0
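A frequency table like this can be built in a few lines; the sketch below (ours, using Python's collections.Counter) tallies the counts and percentages for the hypothetical survey data shown above.

```python
from collections import Counter

# The 20 hypothetical responses tallied in the table above.
responses = (
    ["Negative"] * 1 + ["Somewhat negative"] * 1 + ["Somewhat positive"] * 3
    + ["Positive"] * 7 + ["Strongly positive"] * 8
)

freq = Counter(responses)
total = len(responses)
for option, count in freq.most_common():
    print(f"{option:18s} {count:3d} {100 * count / total:6.1f}")
print(f"{'TOTAL':18s} {total:3d} {100.0:6.1f}")
```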

Use of the Results
Summarizing the results
– Frequency table: a visual depiction of data that shows how often each value occurred
– Measures of central tendency:
  Mean: arithmetic average
  Median: 50th percentile
  Mode: most common score

Use of the Results
Item: What is your present attitude toward your degree program?

Response option       | Frequency | Percent
Strongly negative (1) |     0     |   0.0
Negative (2)          |     1     |   5.0
Somewhat negative (3) |     1     |   5.0
Somewhat positive (4) |     3     |  15.0
Positive (5)          |     7     |  35.0
Strongly positive (6) |     8     |  40.0
TOTAL                 |    20     | 100.0

Mean = 5.0   Median = 5.0   Mode = 6.0
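These statistics can be verified with Python's standard library; a short sketch (our illustration) applied to the coded responses above:

```python
import statistics

# The 20 responses coded 1-6 (Strongly negative = 1 ... Strongly positive = 6),
# matching the frequencies in the table above.
scores = [2] * 1 + [3] * 1 + [4] * 3 + [5] * 7 + [6] * 8

print("Mean:  ", statistics.mean(scores))    # 5.0 (arithmetic average)
print("Median:", statistics.median(scores))  # 5.0 (50th percentile)
print("Mode:  ", statistics.mode(scores))    # 6   (most common score)
```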

Use of the Results
Summarizing the results
– Qualitative summary: “The majority of respondents indicated that they hold positive attitudes toward their degree program.”

Use of the Results
Sharing the results
– Be open, honest, balanced, and fair
– Understand your audiences and their needs
– Help your audiences see the big picture

Use of the Results
Sharing the results
– Celebrate good results
– Address areas needing improvement: goals and/or outcomes? direct and/or indirect evidence? curriculum and/or instruction?
– Incorporate results into planning and decision-making processes

FINAL THOUGHTS

Final Thoughts
– Goals and learning outcomes define how students demonstrate their mastery of what faculty want them to learn
– A cohesive curriculum is aligned with goals and learning outcomes

Final Thoughts
– Assessment involves collecting direct and indirect evidence concerning student development
– Embedding assessment in the curriculum has many advantages
– Assessment should not focus on individual students or faculty

Final Thoughts
– A recurring theme in assessment is collaboration
– Three major criteria apply to assessment: it should be meaningful, manageable, and sustainable

QUESTIONS, COMMENTS, AND DISCUSSION