Beginning with the End in Mind: Choosing Outcomes and Methods

Beginning with the End in Mind: Choosing Outcomes and Methods
Dr. Kim Yousey-Elsener, Associate Director of Assessment Programs, StudentVoice
kyouseyelsener@studentvoice.com

Goals of this session:
- Understand the importance of assessment goals/objectives
- Define key terms related to methods
- Explain different types of assessment
- Determine key factors in choosing methods

Begin with the end in mind… Key questions to ask:
- Why are you doing this assessment?
- What do you hope to learn from doing the assessment?
- Who is your audience for your assessment results? Does that audience like numbers, stories, or both?
Time to think… answer these questions as best you can for your project.

Start with the “Why” and “What”: Learning Outcomes, Program Outcomes, Goals or Sub-goals, Objectives, Questions.
An outcome is the desired effect of a service or intervention, but is much more specific than a goal. It is participant- or output-centered.

Good Outcome Statements:
- Translate intentions into actions
- Describe what participants should demonstrate or produce
- Use action verbs
- Align with other intentions (institutional, departmental)
- Map to practices
- Are collaboratively authored
- Reflect/complement existing national criteria
- Are measurable
Maki, P. L. (2004). Assessing for learning: Building a sustainable commitment across the institution.

SMART Outcomes
- Specific: Clear and definite terms describing expected abilities, knowledge, values, attitudes, and performance
- Measurable: It is feasible to get the data, the data are accurate and reliable, and it can be assessed more than one way
- Aggressive but Attainable: Consider stretch targets to improve the program
- Results-oriented: Describe what standards are expected of students
- Time-bound: Describe where you would like to be within a specified period of time
Adapted from Paula Krist, Director of Operational Effectiveness and Assessment Support, University of Central Florida, May 2006.

If you’re not sure, this may help… Some things to think about: What causes it? Who is especially involved in it? When does it occur? What effects does it have? What types are there? How do various groups perceive it? In what stages does it occur? What will make it better? What makes it effective? What relationship does it have to other phenomena?

Clues to Help You Find a Direction:
- Usage numbers – track participation in programs or services
- Student needs – keep you aware of the student body or specific populations
- Student satisfaction/perceptions – level of satisfaction with, or perception of, the campus
- Learning outcomes – show whether a specific program is meeting its objectives (Bloom's Taxonomy)
- Cost effectiveness – how the value of offering a program/service compares with its cost
- Comparison (benchmarking) – comparing a program/service against a comparison group
- Using national standards (e.g., CAS) – comparing a program/service with a set of pre-established standards
- Campus climate or environment – assessing behaviors/attitudes on campus

A “Quickie” Outcome:
- By the end of this program, students will…
- Through interacting with this office, people will…
- This office plans to… by…
- By participating in…, students will…
But what about the who? Keep the “who” in mind when phrasing outcomes.

Choosing Your Method:
- Matches: The measure directly matches the outcome it is intended to measure
- Appropriate methods: Uses appropriate direct and indirect methods
- Targets: Indicates the desired level of performance
- Useful: Measures help identify what to improve
- Reliable: Based on tested, known methods
- Effective and efficient: Characterizes the outcome concisely

Consider These Key Factors:
- What tools are in your toolbox? What are the strengths/challenges of each tool?
- What is your timeline? What resources (time, $$, people) do you have?
- Is there potential for collaboration? Does the data already exist?
- What politics are involved? (internal vs. external method)
- Who is your audience, and what type of data would they find useful? (quantitative vs. qualitative)
- Do you need indirect or direct measures? Do you need formative or summative data? Or both?

What type of data do you need?
Quantitative:
- Focus on numbers/numeric values
- Easier to report and analyze
- Can generalize to a greater population with larger samples
- Less influenced by social desirability
- Sometimes less time and money
- Example methods: surveys, usage numbers, rubrics (if assigning numbers), tracking numbers
Qualitative:
- Focus on text/narrative from respondents
- More depth/robustness
- Ability to capture “elusive” evidence of student learning and development
- Specific sample
- Example methods: interviews, focus groups, portfolios, rubrics (if descriptive), photo journaling

Direct vs. Indirect Methods Direct Methods - Any process employed to gather data which requires students to display their knowledge, behavior, or thought processes. Indirect Methods - Any process employed to gather data which asks students to reflect upon their knowledge, behaviors, or thought processes.

Example: Direct vs. Indirect
INDIRECT: Please rate your level of agreement with the following: “I know of resources on campus to consult if I have questions about which courses to register for in the fall.” (Strongly agree / Moderately agree / Moderately disagree / Strongly disagree)
DIRECT: Where on campus would you go, or who would you consult with, if you had questions about which courses to register for in the fall? (Open text field)

Formative vs. Summative
Formative assessment: Conducted during the program; its purpose is to provide feedback used to shape, modify, or improve the program.
Summative assessment: Conducted after the program; makes a judgment on quality or worth, or compares to a standard, and can be incorporated into future plans.

Validity and Reliability
[Diagram: a target in which the bull’s-eye is the “thing” we are trying to measure and the red dots are the questions on the test or instrument.]
It’s often easy to think of reliability and validity this way: imagine the questions on a test are the red dots and the bull’s-eye is the thing we’re trying to measure. For example, the bull’s-eye might be some aspect of leadership, like group cohesion, and the items might ask things like “How well do you get along with others in a group?” and “When in a group setting, to what degree do others see you as a positive member of the group?” Special thanks to: Peter Swerdzewski, pswerdz@me.com

[Diagram: three targets illustrating the possible combinations.]
1. Results are reliable and valid. The red dots (our questions) are grouped together, suggesting that students are answering them consistently, so the results are reliable. The dots also sit right on the bull’s-eye, the thing we’re trying to measure (like leadership), so the results are valid.
2. Results are reliable but NOT valid. You can have reliability without validity! The dots are grouped together, so the results are consistent and reliability is high. However, they are not hitting the bull’s-eye: we think we’re measuring something like leadership, but we’re not, so the results are not valid. We’re consistently measuring something, but what is it?
3. Results are NOT reliable and NOT valid. The dots are scattered and the answers are not consistent. For example, a student might read “How well do you get along with others in a group?” and answer “Not at all,” then read “When in a group setting, to what degree do others see you as a positive member of the group?” and answer “Always.” These two questions measure very similar things, but the responses don’t agree. The dots also miss the bull’s-eye, so this is essentially a test with a bunch of random questions that don’t mean anything when taken together. You wouldn’t want to add up the results from this test and make decisions based on the total score; it wouldn’t make sense.
Slide courtesy of Dr. Dena Pastor, James Madison University. Special thanks to: Peter Swerdzewski, pswerdz@me.com
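If you want to put a number on the “consistency” idea above, one common (though by no means the only) statistic is Cronbach’s alpha, which compares the variance of individual items to the variance of respondents’ total scores. The presentation does not prescribe any particular statistic, so the sketch below is purely illustrative; the two “group cohesion” items and the response data are hypothetical.

```python
# Minimal sketch (not from the presentation): Cronbach's alpha as one common
# estimate of internal-consistency reliability. Item wording and responses
# below are hypothetical.

def cronbach_alpha(items):
    """items: one list of scores per survey item, all lists of equal length."""
    k = len(items)        # number of items on the scale
    n = len(items[0])     # number of respondents

    def sample_variance(xs):
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs) / (len(xs) - 1)

    # Sum of the variances of each item, taken separately
    sum_item_var = sum(sample_variance(item) for item in items)
    # Variance of each respondent's total score across all items
    totals = [sum(item[j] for item in items) for j in range(n)]
    total_var = sample_variance(totals)

    return (k / (k - 1)) * (1 - sum_item_var / total_var)

# Two hypothetical 4-point "group cohesion" items answered by six students
item1 = [4, 3, 4, 2, 4, 3]  # "I get along with others in a group"
item2 = [4, 3, 3, 2, 4, 3]  # "Others see me as a positive member of the group"

print(round(cronbach_alpha([item1, item2]), 2))  # ~0.93: responses are consistent (reliable)
```

An alpha near 1 corresponds to the tightly grouped dots in the first two targets; note that a high alpha alone says nothing about validity (the second target).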

Tips for Choosing Methods:
- Build up your assessment toolbox and know your options
- Assessment is meant to inform practice (KISS); start off small, especially if there is resistance
- Too much data can slow you down
- Assessment is an ongoing process: reflect on the process/results, and don’t be afraid to change
- Read literature and attend conferences through a new lens
- Talk and get feedback; ask questions

- Always ask if the data already exists
- Include stakeholders from the beginning; use external sources as needed
- Start with the ideal design, then work backwards to what is possible
- Decide what you will accept as sufficient evidence, but keep your audience in mind
- Always interpret your results in light of your design

Questions?

Resources:
Maki, P. L. (2004). Assessing for learning: Building a sustainable commitment across the institution. Sterling, VA: Stylus.
Palomba, C. A., & Banta, T. W. (1999). Assessment essentials: Planning, implementing and improving assessment in higher education. San Francisco: Jossey-Bass.
Schuh, J. H. (2009). Assessment methods for student affairs. San Francisco: Jossey-Bass.
Stage, F. K., & Manning, K. (2003). Research in the college context: Approaches and methods. New York: Brunner-Routledge.
Upcraft, M. L., & Schuh, J. H. (1996). Assessment in student affairs: A guide for practitioners. San Francisco: Jossey-Bass.

Contact Information
Kim Yousey-Elsener, PhD, Associate Director, Assessment Programs
kyouseyelsener@studentvoice.com
210 Ellicott Street, Suite 200, Buffalo, NY 14203
T 716.652.9400 (press 1 when you hear the recording)
F 716.652.2689