Transforming Assessment With Direct Evidences of Learning
Brigham Young University
Farris Child, Dan Chandler
Introductions. FF

Session Outcomes
Attendees will be able to:
1. Describe the differences between direct and indirect evidences of learning.
2. Identify instances where either direct or indirect evidences would be appropriate.
3. Explain why direct evidences of learning are important.
4. Show how changing the assessment of an outcome can lead to direct versus indirect evidence.
Opening questions: How many of us are involved with advisement assessment? How many of us are currently assessing outcomes? Who already knows the difference between direct and indirect evidences of learning? Who is using both kinds of evidence in their advisement assessment? Who isn't sure what we're talking about?
There is a twofold reason for these questions: (1) to help us get started, and (2) to serve as an example, since a show of hands is an indirect assessment tool gathering indirect evidence. For those who raised a hand, do I have any proof? I can assume the answer is most likely yes, but I don't know for sure. Could one of you who raised your hand tell me the difference between direct and indirect? That would be an example of direct evidence.
Seasoned assessors: consider direct evidences of learning. Beginning assessors: considerations and some ideas of how you can assess your outcomes. FD

Summary of Indirect and Direct Evidences of Learning
Indirect evidence: captures perceptions or opinions; captures signs that students/advisors are probably learning or have the skill sets being assessed; less clear and less convincing.
Direct evidence: requires those being assessed to demonstrate their knowledge or skills; tangible, visible, self-explanatory evidence of what has and hasn't been learned or what has or hasn't been done; clear and convincing.
Suskie (2009) defines indirect evidences of learning as "evidence consisting of proxy signs that students are probably learning." Aiken-Wisniewski et al. (2010) state that indirect measures (another term for indirect evidences of learning) often capture opinions and perceptions about student learning or process outcomes but are not used to directly measure the desired outcome. Suskie defines direct evidence of learning as "tangible, visible, self-explanatory, and compelling evidence of what exactly students have and haven't learned." Aiken-Wisniewski et al. explain that "direct measures capture student learning directly through concrete behavioral expressions of the desired outcome. They prompt students to demonstrate their knowledge or skills." DD

Examples of Direct and Indirect Evidences of Student Learning Outcomes
Indirect: surveys; questionnaires; interviews (asking questions whose responses can't be validated); questions asking if a student has the skills or knowledge regarding an outcome (yes/no-type questions).
Direct: direct observations; pre-test/post-test; evidence we can see that clearly demonstrates skill sets or knowledge, leaving no doubt; questions asking the student to explain, identify, define, list, etc.; possibly multiple-choice questions (depending on how the outcome is written).
Examples for Process and Delivery Outcomes (PDOs)*: Indirect: focus groups; usage statistics; surveys and questionnaires. Direct: internal review (observations); performance on case study problems; external review by qualified individuals. *Aiken-Wisniewski et al. (2010)
After discussing this slide, have the audience explain to their neighbor the difference between direct and indirect evidences of learning. Observe: as I'm observing, this is a direct evidence of learning. DD

Examples of Indirect and Direct Evidences of Outcomes
Indirect: Ask advisors if they participated in professional development, or ask how many hours of professional development they acquired.
Direct: Look at an advisor's transcript, signed CEU form, certificate of attendance, etc.
Please see your handout for more specific definitions. DD

Examples of Indirect and Direct Evidences of Outcomes
Indirect: Use a Likert scale asking students the extent to which they know what needs to be done in order to get off academic probation.
Direct: Have the student explain to their advisor how to get off academic probation, OR ask the student to correctly identify which statement describes how to get off academic probation (out of several options), OR ask the student to explain how to get off academic probation in an open-ended survey question.
Please see your handout for more specific definitions. DD

Examples of Indirect and Direct Evidences of Outcomes
Indirect: Ask students if they know what they need to do in order to move forward with their career goals.
Direct: Advisors use a rubric with defined levels identifying how prepared the student is for their career goals (graduate school, internship, employment preparedness, etc.), or ask the student to list the next steps for their career goals.
Please see your handout for more specific definitions. DF

How do you decide which tools to use?
Focus groups, questionnaires, interviews, direct observations, pre-tests/post-tests, surveys. FF

Importance of Direct Evidence
Fusch (2013) suggests we should move beyond surveys by having students create learning products, which allow us to form a more direct assessment of what students have learned.
Campbell (2005) states, "We must gather evidence from multiple sources. Evidence must reflect both direct and indirect measures, and be both quantitative and qualitative."
Suskie (2009) proposes that "no assessment of knowledge, conceptual understanding, or thinking or performance skills should consist of indirect evidence alone."
Both indirect and direct assessment methods are important. I propose that having evidence of both types makes for a stronger, more viable assessment portfolio. Much of advisement assessment tends to be more indirect; gathering direct evidence provides stronger evidence. FF

Transforming Indirect to Direct Questions
Indirect: ask yes/no, Likert-type, etc. questions.
Direct: ask students to identify, locate, interpret, etc., OR use open-ended questions requiring the student to explain, list, recall, describe, etc.
Direct evidence can be more difficult because you must go through all of the open-ended responses, but it can be great evidence. If you would like more direct evidence without reading through open-ended responses, asking students to identify, locate, interpret, etc. may work, depending on the outcome. FD

Question Type Examples
Outcome: Students can explain/identify what they must do in order to get off probation.
Indirect question: "Do you know how to get off probation?" (yes or no), OR "Please indicate the level to which you understand what you need to do in order to get off probation."
Direct question: "How does a student get off probation?" (open-ended, or have them find and select the correct response out of several possibilities).
Because the outcome keyword is "identify," you can use a multiple-choice question with several possibilities and one correct option. If the student can identify the correct option, this would be an example of direct evidence. DD

Question Type Examples
Outcome: Students can find their progress report.
Indirect question: "Do you know where to go to find your progress report?"
Direct question: "Which of the following steps (in order) would you take in order to access your progress report?" (This question would list several possibilities and require the student to identify which steps they would take.) OR we could ask the student, "Please show me how to find your progress report."
Indirect: they are telling us they can, but we can't prove it; there is no concrete evidence. Direct: they had to correctly identify or explain. DF

Challenge
Think about a Student Learning Outcome or Process and Delivery Outcome that you are trying to assess:
How could we gather direct evidence for this outcome?
How could we gather indirect evidence for this outcome? FF

References
Aiken-Wisniewski, S., Campbell, S., Higa, L., Kirk-Kuwaye, M., Nutt, C., Robbins, R., & Vesta, N. (2010). Guide to assessment in academic advising (2nd ed.). Monograph Series Number 23. National Academic Advising Association.
Campbell, S. (2005, December). Why do assessment of academic advising? Academic Advising Today, 28(4). Retrieved from http://www.nacada.ksu.edu/Resources/Academic-Advising-Today/View-Articles/Why-Do-Assessment-of-Academic-Advising.aspx
Fusch, D. (2013). Assessing student learning outcomes: Surveys aren't enough. Retrieved from the Academic Impressions website: http://www.academicimpressions.com/news/assessing-student-learning-outcomes-surveys-arent-enough?qq=18798a430987mW
Robbins, R., & Zarges, K. M. (2011). Assessment of academic advising: A summary of the process. Retrieved from the NACADA Clearinghouse of Academic Advising Resources website: http://www.nacada.ksu.edu/Resources/Clearinghouse/View-Articles/Assessment-of-academic-advising.aspx
Suskie, L. (2009). Assessing student learning (2nd ed.). San Francisco, CA: John Wiley & Sons.