Outcomes Assessment Basics


Outcomes Assessment Basics
Dan Stroud
May 25, 2018


A Vision for Assessment
Commitment to the outcome: to provide sufficient support and guidance to help you realize the dividends for the time and effort invested:
- Enhanced learning
- Improved programs and degrees
- Greater communication about teaching and learning among faculty
The aim is to create a culture of learning, where striving to enrich our students' learning is always what is projected.

Some Guiding Assumptions…
- Teaching and learning can be improved through systematic inquiry.
- Assessment is always a work in progress, and it's OK if things don't go perfectly.
- Assessment is about lessons learned in the effort to enhance learning and teaching.
- Goal of the Assessment Annual Report: to demonstrate concerted effort on the part of faculty to examine student outcomes and make appropriate adjustments to improve the program.

Five "Big Picture" Questions to Ask When Conducting Assessment
1. How do you define a successful student in your program?
2. What have you learned about your students' learning?
3. What evidence do you have that demonstrates their skill level?
4. How are you using that evidence to improve student understanding in your program?
5. After implementing a plan of action, how did it change the resulting skill level?
Slide taken from Debbie Smith's 2010 presentation at UMKC, "I Survived Assessment"

Assessing Our University's (and Your Department's) Assessment Efforts

Compliance              Commitment
External questions      Internal questions
Number & amount         Quality & utility
Reporting               Interpreting
Collecting it           Using it
Accreditation           Learning

Slide taken from Susan Hatfield's presentation on "Coaching Assessment: Student Learning Outcomes"

Assessment Components for Academic Degree Programs
- Mission statement
- Learning outcomes (usually 4-6). Remember SMART:
  - Specific: clear and definite terms describing the abilities, knowledge, values, attitudes, and performance expected
  - Measurable: it is feasible to get data; the data are accurate and reliable; the outcome can be assessed in more than one way
  - Attainable (and aggressive): the outcome has the potential to move the program or unit forward
  - Relevant/Results-oriented: describes what standards are expected from students or the functional area being assessed
  - Time-bound: describes a specified time period for accomplishing the outcome

Measurements: The Complete Measurement Process
- What instrument, and why? Formative or summative assessment? Direct or indirect measure? When possible, multiple measures can enhance assessment and make it more meaningful to the program.
- How will you conduct the measurement? Which students? When and where are they measured? How is it administered, and by whom? It is often good to use smaller samples of students and capstone courses.
- How are the data collected and stored? Who analyzes the data, how, and when?

Achievement Targets
- What kind of performance do you expect from your students' learning? What should the outcomes be?
- What is the desirable level of performance for your students? Rubrics can clarify this (see the next slides).
- What percentage of students do you expect to achieve this?
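To make a target concrete, it helps to phrase it so it can be checked mechanically. The sketch below is a hypothetical illustration (the scores, the 80% target, and the function name are invented, not part of any campus system): it computes what percentage of students scored at or above a rubric threshold and compares that to the target.

```python
# Hypothetical sketch: checking an achievement target such as
# "80% of students will score 'Meets Expectations' (2) or higher on the rubric."
# Scores, threshold, and target below are invented for illustration.

def percent_meeting_target(scores, threshold=2):
    """Return the percentage of rubric scores at or above the threshold."""
    if not scores:
        return 0.0
    met = sum(1 for s in scores if s >= threshold)
    return 100.0 * met / len(scores)

# Example: twelve students scored on a 1-3 rubric scale
scores = [3, 2, 2, 1, 3, 2, 2, 2, 1, 3, 2, 2]
achieved = percent_meeting_target(scores)
target = 80.0
print(f"{achieved:.1f}% met the target of {target:.0f}%: {achieved >= target}")
```

Writing the target this way also forces the two decisions the slide asks about: the desirable level of performance (the threshold) and the percentage of students expected to reach it (the target).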

Using Rubrics
A rubric is "a set of criteria and a scoring scale that is used to assess and evaluate students' work" (Campbell, Melenyzer, Nettles, & Wyman, 2000).
- Addresses performance standards in a clear and concise manner (which students appreciate!)
- Clearly articulates to students the areas of improvement needed to meet these standards
- To find examples, Google rubrics for your discipline
Campbell, D. M., Melenyzer, B. J., Nettles, D. H., & Wyman, R. M. (2000). Portfolio and performance assessment in teacher education. Boston: Allyn and Bacon.

Example of a Rubric
Foreign Languages and Literatures Assessment Tool for Oral Proficiency Interview, adapted from "Interpersonal Mode Rubric: Pre-Advanced Learner," 2003 ACTFL

Category: Comprehensibility (Who can understand this person's meaning? How sympathetic must the listener be? Does it need to be the teacher, or could a native speaker understand the speaker? How independent of the teaching situation is the conversation?)
- Exceeds Expectations: Easily understood by native speakers, even those unaccustomed to interacting with language learners. Clear evidence of culturally appropriate language.
- Meets Expectations: Although there may be some confusion about the message, generally understood by those unaccustomed to interacting with language learners.
- Does Not Meet Expectations: Generally understood by those accustomed to interacting with language learners.

Category: Language Control (Accuracy, form, appropriate vocabulary, degree of fluency)
- Exceeds Expectations: High degree of accuracy in present, past, and future time. Accuracy may decrease when attempting to handle abstract topics.
- Meets Expectations: Most accurate with connected discourse in present time. Accuracy decreases when narrating and describing in time frames other than present.
- Does Not Meet Expectations: Most accurate with connected sentence-level discourse in present time. Accuracy decreases as language becomes complex.
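A rubric of this shape (criteria rated on named levels) translates directly into a small data structure, which can make scoring and record-keeping consistent across raters. This is a hypothetical sketch, not part of any assessment system; the student ratings are invented, and the level names follow the oral-proficiency rubric above.

```python
# Hypothetical sketch: a criteria-by-levels rubric as a data structure.
# Level names are ordered from lowest (1) to highest (3).

LEVELS = ["Does Not Meet Expectations", "Meets Expectations", "Exceeds Expectations"]
CRITERIA = ["Comprehensibility", "Language Control"]

def score(ratings):
    """Convert per-criterion level names into numeric scores (1-3)."""
    return {criterion: LEVELS.index(level) + 1
            for criterion, level in ratings.items()}

# Invented example: one student's ratings on the two criteria
student = {"Comprehensibility": "Exceeds Expectations",
           "Language Control": "Meets Expectations"}
print(score(student))  # {'Comprehensibility': 3, 'Language Control': 2}
```

Keeping the level names in one ordered list means every rater maps the same label to the same number, which makes the percentages reported in Findings comparable from year to year.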

How to Build a Rubric
Answer the following questions:
- Given your broad course goals, what determines the extent of student understanding?
- What criteria count as evidence of student learning?
- What specific characteristics in student responses, products, or performances should be examined as evidence of student learning?
Slide from Molly Mead and Erica Schurter's presentation, "Using Rubrics to Make Assessment More Efficient"

Developing a rubric helps you to clarify the characteristics/components of your learning outcomes. For example: can our students deliver an effective public speech?
Possible components: eye contact, gestures, volume, sources, transitions, style, rate, poise, examples, verbal variety, appearance, evidence, conclusion, organization, attention getter
Slide taken from Susan Hatfield's presentation on "Coaching Assessment: Student Learning Outcomes"

More Rubric Help
- AAC&U VALUE Rubrics: http://www.aacu.org/value/rubrics
- Rubrics from Susan Hatfield (accreditation mentor): www.winona.edu/air/rubrics.htm
- RubiStar: http://rubistar.4teachers.org/

Findings
Part I: Specific Findings
- What do the data tell you? Compare new data to achievement targets.
- Did students meet or deviate from expectations?
- Important: include specific numbers/percentages when possible. Do not use course grades or pass rates.

Findings (cont.)
Part II: General Findings
- What lessons did your faculty learn from this evidence about your students?
- What broader implications do you draw about your program? (e.g., curriculum, admissions, administration, policies, requirements, pedagogy, assessment procedures, and so on)
Conversations: the more people involved, the better!

Action Plans: Concrete Steps for Change
- A list of specific innovations that you would like to introduce in AY 2013-14 to address lessons learned in AY 2012-13 (again, in curriculum, admissions, administration, policies, requirements, pedagogy, assessment procedures, and so on)
- Resources? Time period? Point person?
- It is best to have documentation of the changes made through these action plans (e.g., in syllabi, the course catalogue, meeting minutes)

Don't Forget the Assessment Annual Report
Part I: Detailed Assessment Report
- All items (mission through action plans) submitted in the Assessment Management System folder
- Assessment deadline: December 1, annually

Part II: Timeline/Account of Activities: the "Assessment Plan Narrative"
- In 1-2 pages, tell the story of all the work and careful consideration you and your colleagues accomplished in your assessment work this year (e.g., meetings, mentoring, experiments, setbacks, lessons learned)
- Plug it into your Assessment Folder for that year
- Please follow the four outlined questions (see next slide)

Four Questions for the Assessment Narrative
1. Process: Please describe the specific activities and efforts used to design, implement, and analyze your assessment plan during this academic year. This narrative might be organized chronologically, listing meetings, mentoring sessions, and experiments at each stage of the developmental process, including the names of people involved in various capacities, with each event given one paragraph.
2. Positives: Please describe what was most useful about the assessment process, or what went well. What did you learn about your faculty, students, or program through this experience?
3. Challenges: Please describe the challenges you encountered in terms of the development or implementation of your assessment procedures, as well as the lessons you learned from this experience and your efforts or plans for overcoming them. This section might be organized topically.
4. Support: Please describe your program's experience during the past year with the support and administrative structures in place at MSU for assessment: the Provost's Office, the University Assessment Committee, the Institutional Research and Assessment office, and so on.

Submission: December 1st
- No edits are allowed after December 1st.
- After this date, the peer review process will commence, with feedback returned to the programs by approximately March 1st.
- Also after this date, information gathered and reported can be loaded into the next year's assessment folder. The previous year's narrative should be held in the new folder until end-of-year reporting is complete for that cycle.

After December 1st
- Assessment entries for AY 2013-14 begin.
- The assessment cycle runs from approximately August 15 (end of Summer II) for a 12-month cycle.
- This time is necessary to implement the actions discussed in the narrative and recorded in the previous year's assessment plan.
- Update mission statements, goals, learning outcomes, and measurements based on feedback from the UAC.
- Items in the new assessment folder will carry over from last year unless changed, with the exception of the previous year's documentation.
- Begin entering new findings and formulating new action plans.

Questions?

Contact Information
For assistance with assessment, please contact Dr. Dan Stroud, Assessment Specialist, at 397-4742 or daniel.stroud@mwsu.edu.