Assessment 101 Elizabeth Bledsoe

Agenda
- Introduction to Assessment
- Components of an Assessment Plan
  - Mission
  - Outcomes
  - Measures
  - Achievement Targets
- Question and Answer Session

SACS Expectations: SACS Comprehensive Standard 3.3.1
3.3 Institutional Effectiveness
3.3.1 The institution identifies expected outcomes, assesses the extent to which it achieves these outcomes, and provides evidence of improvement based on analysis of the results in each of the following areas (Institutional Effectiveness):
3.3.1.1 educational programs, to include student learning outcomes
3.3.1.2 administrative support services
3.3.1.3 educational support services
3.3.1.4 research within its educational mission, if appropriate
3.3.1.5 community/public service within its educational mission, if appropriate

SACS Expectations (continued): the same standard, with one clause emphasized: "…and provides evidence of improvement based on analysis of the results…" Identifying and assessing outcomes is not enough; the results must be analyzed and used to improve.

The Assessment Circle
1. Develop Program Mission & Outcomes
2. Design an Assessment Plan
3. Implement the Plan & Gather Information
4. Interpret/Evaluate Information
5. Modify & Improve
Adapted from Trudy Banta, IUPUI.

Develop Program Mission & Outcomes

Mission Statement
The mission statement links the functions of your unit to the overall mission of the institution. A few questions to consider in formulating the mission of your unit:
- What is the primary function of your unit?
- What core activities are involved?
- What should those you serve experience after interacting with your unit?

Characteristics of a Well-Defined Mission Statement
- Brief, concise, distinctive
- Clearly identifies the program's purpose
- Clearly aligns with the mission of the division and the University
- Explicitly articulates the essential functions/activities of the program
- Clearly identifies the primary stakeholders of the program (e.g., students, faculty, parents)

Example of a Mission Statement
"The primary purpose of the Office of Academic Advising is to assist students in the development and implementation of their educational plans. To this end the Office of Academic Advising subscribes to the philosophy of developmental advising; advising is a cooperative effort between advisor and student that consists not only of course planning and selection, but the development of the person as a whole. This includes the selection of career and life-long goals." (University of La Verne)

Outcomes
There are two categories of outcomes: Learning Outcomes and Program Outcomes.

Learning Outcomes
When writing Learning Outcomes, the focus must be on the students and what they will think, know, do, or value as a result of participation in the educational environment.

Cognitive Learning
- Knowledge: to recall or remember facts without necessarily understanding them. Verbs: articulate, define, indicate, name, order, recognize, relate, recall, reproduce, list, tell, describe, identify, show, label, tabulate, quote
- Comprehension: to understand and interpret learned information. Verbs: classify, describe, discuss, explain, express, interpret, contrast, associate, differentiate, extend, translate, review, suggest, restate
- Application: to put ideas and concepts to work in solving problems. Verbs: apply, compute, give examples, investigate, experiment, solve, choose, predict, translate, employ, operate, practice, schedule
- Analysis: to break information into its components to see interrelationships. Verbs: analyze, appraise, calculate, categorize, compare, contrast, criticize, differentiate, distinguish, examine, investigate, interpret
- Synthesis: to use creativity to compose and design something original. Verbs: arrange, assemble, collect, compose, construct, create, design, formulate, manage, organize, plan, prepare, propose, set up
- Evaluation: to judge the value of information based on established criteria. Verbs: appraise, assess, defend, judge, predict, rate, support, evaluate, recommend, convince, conclude, compare, summarize

Affective Learning
Verbs: appreciate, accept, attempt, challenge, defend, dispute, join, judge, praise, question, share, support

Examples of Learning Outcomes
- Students who participate in career counseling will be able to define the next step(s) in their career development process.
- Students will identify various aspects of architectural diversity in their design projects.

Program Outcomes
Process statements relate to what the unit intends to accomplish:
- Level or volume of activity
- Efficiency with which you conduct the processes
- Compliance with external standards of "good practice in the field" or regulations
Satisfaction statements describe how those you serve rate their satisfaction with your unit's processes or services.

Examples of Program Outcomes
Process statements:
- The Registrar's office will promptly process transcript requests.
- Students will utilize the University Writing Center.
Satisfaction statements:
- Students will report satisfaction with the usefulness of the registration system.
- Transfer students will report satisfaction with admissions application processing.

When Writing Outcomes…
Consider questions such as:
- What are the most important results or impacts that should occur as a result of your unit's activities?
- What are your critical work processes, and how should they function?
- What does the end user experience through interaction with your unit?
- What should students in your degree program know, do, and/or value?

When Writing Outcomes…
Outcomes should be:
- linked to the unit's mission
- realistic and attainable
- limited in number (manageable)
- something that is under the control of the unit
- measurable and/or observable
- meaningful

When Writing Outcomes…
Outcomes should also:
- target key services or change points
- use action verbs

Student Learning Outcome Example
Outcome A: Students receiving a degree from this program will be effective communicators.
Outcome B: Students receiving a degree from this program will be able to effectively communicate their research findings both verbally and in writing.
Outcome B is the stronger statement: it turns the vague "effective communicators" into something observable and measurable.

Program Outcome Example
Outcome A: The Office of Student Financial Aid will respond to all meeting requests within two business days.
Outcome B: The Admissions Office will process applications in a timely manner.
Outcome A is the stronger statement: "within two business days" is a measurable standard, while "in a timely manner" is not.

Design an Assessment Plan

Assessment Measures
After establishing your outcomes:
- Define and identify the sources of evidence you will use to determine whether you are achieving your outcomes.
- Detail what will be measured and how.
- Identify or create measures that can inform decisions about your program's processes and services.

Characteristics of an Effective Assessment Measure
- Measurable and/or observable: you can observe it, count it, quantify it, etc.
- Meaningful: it captures enough of the essential components of the objective to represent it adequately, and it will yield vital information about your program.
- Manageable: it can be measured without excessive cost or effort.

Types of Assessment Measures (Palomba and Banta, 1999)
There are two basic types of assessment measures: Direct Measures and Indirect Measures.

Direct Measures
Direct measures are those designed to directly measure:
- what a stakeholder knows or is able to do (i.e., they require the stakeholder to actually demonstrate the skill or knowledge)
- the effectiveness and/or value of the program or process

Common Direct Measures
- Participation data
- Observation of behavior
- Collection of work samples (student work)
- Volume of activity
- Level of efficiency (e.g., average response time)
- Measure of quality (e.g., average errors)

Indirect Measures
Indirect measures focus on:
- stakeholders' perception of their level of learning
- stakeholders' perception of the benefit of programming or an intervention
- stakeholders' satisfaction with some aspect of the program or service

Common Indirect Measures
- Surveys
- Exit interviews
- Retention/graduation data
- Demographics
- Focus groups

Choosing Assessment Measures
Some things to think about:
- How would you describe the end result of the outcome?
- How will you know if this outcome is being accomplished? What will provide you with this information?
- Where are you currently delivering the outcome? Are there any naturally occurring assessment opportunities?
- What measures are currently available?

Choosing Assessment Measures
Some more things to think about:
- Will the resulting data provide information that could lead to improvement of your services?
- Who will analyze the information, and how easily will it fit into their regular responsibilities?
- How will it fit into your budget and timeline?

Achievement Targets
An achievement target is the result, target, benchmark, or value that will represent success at achieving a given outcome. Achievement targets should be specific numbers or trends.

Examples of Achievement Targets
- 95% of our users will be "very satisfied or satisfied" with our services.
- 90% of the transcripts will be sent within three days.
- Each employee will participate in a minimum of two training/development programs per year.
- Students will score a 2.5 out of 4 on the writing rubric.
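Targets like these become easy to audit when expressed as explicit checks against the collected data. Here is a minimal Python sketch for the first target above; the 95% threshold comes from that example, while the response data and names are purely illustrative:

```python
# Hypothetical sketch: checking the target "95% of our users will be
# 'very satisfied or satisfied' with our services" against survey data.
# The responses below are invented, not from the presentation.

responses = ["very satisfied", "satisfied", "neutral", "satisfied",
             "very satisfied", "dissatisfied", "satisfied", "very satisfied"]

TARGET = 0.95  # the achievement target: 95% satisfied or better

satisfied = sum(1 for r in responses if r in ("very satisfied", "satisfied"))
rate = satisfied / len(responses)

print(f"Satisfaction rate: {rate:.1%} (target: {TARGET:.0%})")
print("Target met" if rate >= TARGET else "Target not met")
```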

Implement the Plan & Gather Information

Findings
In a WEAVEonline Assessment Report, "Findings" are a concise summary of the results gathered from a given assessment measure. The language of a findings statement should parallel the corresponding achievement target, and the results should be described in enough detail to demonstrate that the target was met, partially met, or not met.

Examples of Findings Statements
Achievement Target: The overall mean score of students from the program will meet or exceed the state average score of 79.
Findings: The achievement target was met. The overall mean score of students from the Teaching, Learning, and Culture program exceeded the state average score on the state certification exam. Results: program overall mean scaled score, 91.50; state overall mean scaled score, 79.13.
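Underneath, a findings statement like this is a comparison of a computed program statistic against the benchmark named in the target. A minimal sketch, assuming individual exam scores are available; the scores below are invented, and only the 79.13 state benchmark comes from the slide:

```python
# Illustrative sketch: deriving a Met / Not Met finding by comparing a
# program mean to a state benchmark, as in the example above.
# Individual scores are invented; only the benchmark mirrors the slide.

program_scores = [88, 95, 92, 90, 93]   # hypothetical exam scores
state_average = 79.13                   # benchmark from the target

program_mean = sum(program_scores) / len(program_scores)
status = "Met" if program_mean >= state_average else "Not Met"

print(f"Program mean: {program_mean:.2f}; state average: {state_average:.2f}")
print(f"Achievement target: {status}")
```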

Examples of Findings Statements
Achievement Target: 90% of the survey results will indicate the highest level of satisfaction with each of the five services provided by the Office of Student Success.
Findings: 94% of the survey respondents indicated the highest level of satisfaction with the services provided by the Office of Student Success.

Interpret/Evaluate Information

Analyzing Findings
Reflect on what has been learned during an assessment cycle:
- Based on the analysis of the findings, what changes could or should be made to improve the program?
- What specific findings led to this decision?

Analyzing Findings
Three key questions are at the heart of the analysis:
- What? What did you find and learn?
- So what? What does that mean for your academic program or support unit?
- Now what? What will you do as a result of the first two answers?

Analysis Questions
In WEAVEonline, the Analysis Question responses provide an opportunity to explain the Findings analysis process.

Modify & Improve

Action Plans
After reflecting on the findings, you and your colleagues should determine appropriate action to improve the program.
- Actions outlined in the Action Plan should be specific and relate to the outcome and the results of assessment.
- Action Plans should not be related to the assessment process itself.

Action Plans
- Establish a plan for using evidence of student learning achievement or service quality at the program level.
- Establish a decision-making process for approving/implementing recommendations.
- Clearly identify the parties responsible for implementing the approved recommendations.

Examples of Assessment Plans

Putting It All Together
Outcome: Demonstrate timeliness in processing admission applications.
Measure 1 (Direct): Random-sample audit of 100 Applications for Admission received both at the Reception Desk and electronically.
Achievement Target: 90% of all Applications for Admission will be processed within two weeks of receipt.
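As an illustration of how such an audit might be tallied, the following hedged Python sketch computes the share of sampled applications processed within two weeks. The dates are invented; the 14-day window and 90% threshold mirror the target above:

```python
# Hypothetical audit sketch for the target above: what percentage of
# sampled applications were processed within two weeks of receipt?
from datetime import date

# (received, processed) pairs for a few sampled applications -- invented data
sample = [
    (date(2013, 9, 2), date(2013, 9, 10)),
    (date(2013, 9, 3), date(2013, 9, 20)),   # over two weeks
    (date(2013, 9, 5), date(2013, 9, 12)),
]

within = sum(1 for received, processed in sample
             if (processed - received).days <= 14)
rate = within / len(sample)

print(f"{rate:.0%} processed within two weeks (target: 90%)")
print("Target met" if rate >= 0.90 else "Target not met")
```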

Putting It All Together
Outcome: Undergraduate students who attend the "Registration Overview" during Orientation will be able to successfully utilize the University Registration system.
Measure 1 (Indirect): Online post-registration survey, gathered, compiled, and released to the Registrar's Office in late September.
Achievement Target: 80% of the undergraduate students who participate in the "Registration Overview" presentation will answer "Agree" or "Strongly Agree" to the statement "The Registration Overview helped me understand how to register for classes" on an online post-registration survey.

Putting It All Together
Outcome: Faculty and staff members who participate in FERPA Rules and Regulations training will be able to demonstrate fundamental knowledge of the FERPA rules and regulations that pertain to their roles.
Measure 1 (Direct): FERPA training posttest given to faculty and staff members who participate in the online FERPA training webcourse.
Achievement Target: 80% of the faculty and staff members will achieve a score of 90% or better on their first attempt at the FERPA training posttest.
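A sketch of how this target might be checked, assuming a simple record of (person, attempt number, score) for each posttest attempt; all names and scores below are invented:

```python
# Hypothetical sketch for the FERPA target above: what share of staff
# scored 90% or better on their FIRST posttest attempt? Data invented.

# (person, attempt_number, score) records
attempts = [
    ("a. lee", 1, 95), ("a. lee", 2, 98),
    ("b. cho", 1, 85),
    ("c. diaz", 1, 92),
]

first_tries = [score for _, attempt, score in attempts if attempt == 1]
passed = sum(1 for score in first_tries if score >= 90)
rate = passed / len(first_tries)

print(f"{rate:.0%} scored 90% or better on the first attempt (target: 80%)")
print("Target met" if rate >= 0.80 else "Target not met")
```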

Putting It All Together
Outcome: The Office of Admissions provides useful university information to high school counselors.
Measure 1 (Indirect): An evaluation survey is distributed to all participants who attend the Counselor Update seminar.
Achievement Target: 90% of high school counselors attending Counselor Update seminars will rate the information provided a "5" (excellent) on a 5-point scale.

Putting It All Together
Outcome: The University has the optimal number of newly enrolled students to achieve its enrollment goals for the total number of students and credit-hour production.
Measure 1 (Direct): The number of new freshmen and transfers enrolled at the conclusion of Late Registration will be used to calculate success in achieving this outcome for fall 2013.
Achievement Target: The Enrollment Management Committee sets targets for the optimal number of incoming students each term by projecting the number of continuing students who will enroll, the capacity of the University to provide instructional resources, and the University's need for tuition revenue. For fall 2013, the Committee set targets of 2,200 incoming freshmen and 1,600 incoming transfer students, with a total enrollment goal of 28,000.

Putting It All Together
Outcome: Improve the quality of the commencement ceremony for future participants.
Measure 1 (Indirect): An online post-Commencement survey administered in September 2013 (Summer 2013 graduates), January 2014 (Fall 2013 graduates), and June 2014 (Spring 2014 graduates).
Achievement Target: 95% of the recent graduates for Summer 2013, Fall 2013, and Spring 2014 will answer "Yes" to the question "Was the information accurate on these aspects of your diploma: (1) your name, (2) your degree type, (3) your major?" on an online post-Commencement survey.

Continuous Improvement
Office of Institutional Assessment, 2012-2013 Assessment Report

To fulfill the 2011-12 action plan to address the unmet target of 80% of conference respondents indicating satisfaction with the variety of poster sessions offered, the Office of Institutional Assessment (OIA), along with the Assessment Conference Committee (ACC), sought more variety in the posters for the 2012 Assessment Conference. As a result, the percentage of respondents satisfied with the variety of posters increased from 74% to 78%. Although the 85% target was still not met during the 2012-13 cycle, this result shows improvement toward the target.

To complete the other 2011-12 action plan, OIA enhanced the Assessment Review Guidelines to include more practical and applicable "good practices" for assessment liaisons to pass along to their programs as formative assessment. Additionally, the Assessment Review Rubric was modified to be more exhaustive in its evaluation of assessment reports. As a result, less variance was observed in the quality of assessment reports. Lastly, the Vice Provost of Academic Affairs supplied each dean with a college-specific, personalized memo addressing the strengths and weaknesses of assessment reports in that college. This process was well received and will continue as a service to colleges from the Office of the Vice Provost.

Outcome/Objective O 5: Provide excellent concurrent and poster sessions for participants at the Annual Assessment Conference.
Measure M 8: Overall Assessment Conference Survey.
Target: 85% or more of the Annual Assessment Conference attendees will report satisfaction with the Concurrent and Poster Sessions.
Finding (Status: Partially Met): Following the end of the 13th Annual Texas A&M Assessment Conference, an online conference evaluation survey was sent to all attendees. Information gained from this survey was organized into the 13th Annual Conference Survey Report and distributed to the Assessment Conference Committee for review. Results from the survey questions relating to the Concurrent and Poster Sessions:
- Question 16: "How satisfied were you with the quantity of Concurrent Sessions?" 90.58% were "Very Satisfied" or "Satisfied."
- Question 17: "How satisfied were you with the variety of Concurrent Sessions?" 83.71% were "Very Satisfied" or "Satisfied."
- Question 19: "How satisfied were you with the quantity of Poster Sessions?" 77.78% were "Very Satisfied" or "Satisfied."
- Question 20: "How satisfied were you with the variety of Poster Sessions?" 77.06% were "Very Satisfied" or "Satisfied."
Although this improved on the 2011-2012 finding of 73%, only 77% of respondents in the 2012-2013 assessment cycle indicated that they were satisfied with the variety of poster sessions offered. In response, the Office of Institutional Assessment will seek posters from each track to provide a greater variety of posters during the 14th Annual Texas A&M Assessment Conference.

Use of Results: Although the satisfaction results from the conference survey related to the variety of poster sessions increased from 74% to 78%, the 85% target was still not met. In response, OIA and the ACC will ensure that each of the conference "tracks" has coverage in the poster session. OIA and the ACC have traditionally ensured track coverage in the concurrent session offerings but have never paid close attention to track coverage in the poster session offerings. This strategy includes contacting the authors of concurrent session proposals in underrepresented tracks and inviting them to consider a poster presentation, perhaps in addition to the concurrent session.

Next, as referenced in the "Enhance Workshop Presentations" action plan for this cycle, our "One Minute Evaluations" results show that some workshop attendees have requested to see more examples of quality assessment during the workshop. In response, OIA is enhancing our workshop presentations to include screenshots of actual assessment plans and reports from WEAVEonline, allowing attendees to work through and critique assessment reports with our staff and gain a better understanding of quality assessment and reporting. One Minute Evaluations will be analyzed again next year to ensure that the examples added to the workshops improve our attendees' reported satisfaction.
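The per-question percentages quoted in this report could be produced with a simple tally over Likert-scale responses. A minimal sketch, assuming the raw survey export is a flat list of (question, response) pairs; all labels and data below are invented:

```python
# Illustrative tally of "Very Satisfied"/"Satisfied" rates per survey
# question, of the kind reported above. Data and labels are invented.
from collections import defaultdict

records = [
    ("Q19 poster quantity", "Very Satisfied"),
    ("Q19 poster quantity", "Neutral"),
    ("Q20 poster variety", "Satisfied"),
    ("Q20 poster variety", "Dissatisfied"),
    ("Q20 poster variety", "Very Satisfied"),
]

totals = defaultdict(int)
positive = defaultdict(int)
for question, answer in records:
    totals[question] += 1
    if answer in ("Very Satisfied", "Satisfied"):
        positive[question] += 1

for question in sorted(totals):
    print(f"{question}: {positive[question] / totals[question]:.2%} satisfied")
```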

Take-Home Messages
- You do not have to assess everything every year.
- Modify something already being done that is meaningful to the program.
- Be flexible: this is an iterative process.

One-Minute Evaluation
- What was the most valuable thing you learned?
- What is one question that you still have?
- What do you think is the next step that your program needs to take in order to implement course-embedded assessment?

Texas A&M Assessment Conference: February 16-18, 2014, College Station, TX. For more information on the conference and registration, visit http://assessment.tamu.edu/conference

References
- The Principles of Accreditation: Foundations for Quality Enhancement. SACS COC, 2008 edition.
- Palomba, C. A., & Banta, T. W. (1999). Assessment Essentials. San Francisco: Jossey-Bass.
- Banta, T. W. (2004). Hallmarks of Effective Outcomes Assessment. San Francisco: John Wiley and Sons.
- Walvoord, B. E. (2004). Assessment Clear and Simple: A Practical Guide for Institutions, Departments, and General Education. San Francisco: Jossey-Bass.
- Assessment manuals from Western Carolina University, Texas Christian University, and the University of Central Florida were very helpful in developing this presentation.
- The "Putting It All Together" examples were adapted from assessment plans at Georgia State University, the University of North Texas, and the University of Central Florida.