Assessing Student Learning
Presentation transcript:

Farris Child, Cynthia Wong, Dan Chandler (FF)

1. Describe the differences between direct and indirect evidence of learning.
2. Identify instances where either direct or indirect evidence would be appropriate.
3. Explain why direct evidence of learning is important.
4. Show how changing the assessment of an outcome can lead to direct versus indirect evidence.
(FD)

Indirect:
- Captures perceptions or opinions
- Captures signs that students/advisors are probably learning or have the skill sets being assessed
- Less clear and less convincing

Direct:
- Requires those being assessed to demonstrate their knowledge or skills
- Tangible, visible, self-explanatory evidence of what has and hasn't been learned, or what has or hasn't been done
- Clear and convincing
(DD)

Indirect Direct  Surveys  Questionnaires  Interviews – asking questions which responses can’t be validated  Questions asking if a student has the skills or knowledge regarding an outcome (Yes / No type questions)  Direct Observations  Pre test / Post test  Evidence we can see which clearly demonstrates skill sets or knowledge = no doubt  Questions asking the student to explain, identify, define, list, etc.  Possibly multiple choice questions (depends on how the outcome is written) DD

Indirect Direct  Ask advisors if they participated in professional development  Or ask how many hours of professional development they acquired.  Look at an advisor’s transcript, signed CEU form, certificate of attendance, etc. DD

Indirect Direct  Use a Likert scale asking students the extent to which they know what needs to be done in order to get off academic probation  Have the student explain to their advisor how to get off academic probation ◦ OR ask a student to correctly identify which statement describes how to get off academic probation (out of several options) ◦ OR ask the student to explain how to get off academic probation on a survey with an open ended question DD

Indirect Direct  Ask student if they know what they need to do in order to move forward with their career goals  Advisors use a rubric with defined levels identifying how prepared the student is with their career goals (graduate school, internship, and employment preparedness, etc.)  Ask student to list the next steps for their career goals DC

- Surveys
- Questionnaires
- Pre-tests / post-tests
- Direct observations
- Focus groups
- Interviews

- Fusch (2013) suggests we should move beyond surveys by having students create learning products, which allow us to form a more direct assessment of what students have learned.
- Campbell (2005) states, "We must gather evidence from multiple sources. Evidence must reflect both direct and indirect measures, and be both quantitative and qualitative."
- Suskie (2009) proposes that "no assessment of knowledge, conceptual understanding, or thinking or performance skills should consist of indirect evidence alone."
(CF)

Indirect:
- Ask yes/no, Likert-type, etc. questions

Direct:
- Ask students to identify, locate, interpret, etc.
- Or use open-ended questions requiring the student to explain, list, recall, describe, etc.
(FD)

Indirect:
- Outcome: Students can identify what they must do in order to get off probation.
- Question: "Do you know how to get off probation?" (yes or no), or "Please indicate the level to which you understand what you need to do in order to get off probation."

Direct:
- Question: "How does a student get off probation?" (open-ended, or have them find and select the correct response out of several possibilities).
- Because the outcome's key word is "identify," use a multiple-choice question with several possibilities and one correct option. If the student can identify the correct option, this is an example of direct evidence.
(DC)

Indirect:
- Outcome: Students can find their progress report.
- Question: "Do you know where to go to find your progress report?"

Direct:
- Question: "Which of the following steps (in order) would you take to access your progress report?" (This question would list several possibilities and require the student to identify which steps they would take.)
- Or we could ask the student, "Please show me how to find your progress report."
(CF)

Think about a Student Learning Outcome or a Process and Delivery Outcome that you are trying to assess:
- How could we gather direct evidence for this outcome?
- How could we gather indirect evidence for this outcome?
(FF)

- Aiken-Wisniewski, S., Campbell, S., Higa, L., Kirk-Kuwaye, M., Nutt, C., Robbins, R., & Vesta, N. (2010). Guide to Assessment in Academic Advising (2nd ed.). Monograph Series Number 23. National Academic Advising Association.
- Suskie, L. (2009). Assessing Student Learning (2nd ed.). San Francisco, CA: John Wiley & Sons, Inc.
- Campbell, S. (2005, December). Why do assessment of academic advising? Academic Advising Today, 28(4). Retrieved from Academic Advising Today: Do-Assessment-of-Academic-Advising.aspx
- Fusch, D. (2013). Assessing Student Learning Outcomes: Surveys Aren't Enough. Retrieved from Academic Impressions: surveys-arent-enough?qq=18798a430987mW
- Robbins, R., & Zarges, K. M. (2011). Assessment of Academic Advising: A Summary of the Process. Retrieved from NACADA Clearinghouse of Academic Advising Resources: of-academic-advising.aspx