Assessment Methods
Highline Community College
Learning Outcomes
- Identify the 5 considerations of method selection
- Demonstrate understanding of the difference between direct and indirect methods
- Name a strength and a weakness of survey methodology
- Identify whether focus groups and interviews are qualitative or quantitative in nature
7 Elements of Assessment Planning
1. Mission Statement
2. Goals
3. Outcomes (Learning & Service/Program)
4. Methodology
5. Implementation
6. Results
7. Decisions/Recommendations
Diverse Methodologies
Highline Community College
Choices, choices…
- Surveys
- Existing Data
- Document Analysis
- Rubrics
- Focus Groups & Interviews
- Visual Methods
- Tracking System
- Observation
- Case Study
Surveys
Asking open- and closed-ended questions in a questionnaire format. A survey is a self-report of anything, including opinions, actions, and observations.
Strengths:
- Can be administered via paper, telephone, the Web, etc.
- Relatively fast and easy way to collect data
Challenges:
- Survey fatigue and low response rates
- Limited in the type of questions that can be asked
Existing Data
Data that have already been collected, usually from previous assessment projects, student information systems, office systems, or tracking systems.
Strengths:
- No time needed to collect data
- Cuts down on survey fatigue and response-rate issues
Challenges:
- Reliant on the reliability/validity or trustworthiness of the source
- Gaining access to data that may be housed elsewhere
Document Analysis
A form of qualitative research in which documents are used to give voice, interpretation, and meaning.
Strengths:
- Documents are already collected, or easily collected and available
- Documents are a stable data source (they don't change)
Challenges:
- Documents are context- and language-specific (the assessor must be familiar with both)
- Data analysis takes time
Rubric
A scorecard typically used to rate learning outcomes, either through observation or artifacts. Includes a scale, key dimensions, and descriptions of each dimension on the scale.
Strengths:
- Clearly states standards and expectations
- Provides both individual and program-level feedback
Challenges:
- Limited to use with learning outcomes
- Developing a rubric takes time
Focus Groups & Interviews
Asking face-to-face, open-ended questions in a group or one-on-one setting. Questions are meant to prompt discussion.
Strengths:
- Helps in understanding perceptions, beliefs, and thought processes
- Focus groups encourage group interaction and building upon ideas
Challenges:
- Recruiting participants (think of times and places)
- Data collection and analysis take time (the data are only as good as the facilitator)
Visual Methods
Captures images (pictures, presentations, etc.) as the main form of data collection; usually also includes captions or a journal to accompany the images.
Strengths:
- More detail and depth in the data
- The visual aspect allows for depth in sharing results
Challenges:
- Beware of threats from alteration of images (especially with technology)
- Implementation and follow-through take time (analysis takes time)
Tips to Consider: Build Your Assessment Toolbox
- Use culturally sensitive language
- Keep It Super Simple (KISS)
- Get feedback from colleagues, Student Voice, and peers
- Ask if the data already exist; seek collaboration
- Assessment is an ongoing process; don't be afraid to change as you learn
Method Selection
Highline Community College
Handout: Types of Methodologies sheet
Defining Methodology
The criteria, process, and tools used to collect evidence and to determine the degree to which the outcomes were reached.
Ingredients of an assessment method:
- Tools needed for data collection
- Criteria or targets that tell you when the outcome has been met
- Strategy for data analysis
- Identified target audience
Method Selection Considerations
There are 5 main considerations for method selection:
1. Population
2. Assessment type
3. Qualitative vs. Quantitative
- Qualitative: "multimethod in focus, involving an interpretive, naturalistic approach to its subject matter" (Schuh & Upcraft, 2001, p. 27).
- Quantitative: "a means for testing objective theories by examining the relationship among variables" (Creswell, 2009, p. 4).
4. Direct vs. Indirect
- Direct: requires subjects to display their knowledge, behavior, or thought processes.
- Indirect: requires subjects to reflect on their knowledge, behavior, or thought processes.
5. Formative vs. Summative
- Formative: "designed to capture students' progress toward institution or program-level outcomes" (Maki, 2004, p. 4).
- Summative: "designed to capture students' achievement at the end of their program of study" (Maki, 2004, p. 6).

Hand out Method Selection sheet(s). Considering the programs in your department that are mission-aligned: which programs and populations do you want more evidence about? What are the associated learning/service outcomes? Think about how you can best measure them in light of the 5 criteria for method selection, and write down the criteria best suited to your assessment (i.e., formative or summative, direct or indirect, your population, etc.).

Assessment types include: Comparative/Benchmarking; National Standards (CAS); Campus Climate; Learning Outcomes; Program Effectiveness; Student Needs; Usage Numbers; Cost-Effectiveness.
Who do you want to know more about?
- What information do we already have about them?
- How accessible are they? When are they typically available, and for how long?
- Are they proficient with technology? Do they have access to technology?
- What is their level of investment (i.e., how much time/effort would they give to the assessment)?
- How often is this population assessed? Are they fatigued?
- Hand out "Types of Assessment" sheets (including the qual/quant/direct/indirect sheet).
- Considering your student population, what is it that you want to know about? This will help narrow the list to the most relevant methodologies.
Quantitative Methods vs. Qualitative Methods

Quantitative methods:
- Focus on numbers/numeric values
- Answer who, what, where, when
- Match with outcomes about knowledge and comprehension (define, classify, recall, recognize)
- Allow for measurement of variables
- Use statistical data analysis
- May be generalized to a greater population with larger samples; easily replicated
- Methods: survey, existing data, rubric (if assigning numbers), tracking system, observation, document analysis, KPI

Qualitative methods:
- Focus on text/narrative from respondents
- Answer why and how
- Match with outcomes about application, analysis, synthesis, evaluation
- Seek to explain and understand
- Can capture "elusive" evidence of student learning and development
- Methods: focus group/interview, portfolio, rubric (if descriptive), visual methods, one-minute assessment, open-ended survey question, observation, document analysis, case study

Next, consider whether quantitative or qualitative methods are most consistent with what you want to know, your target population's needs, and the type of assessment you want to do.
Direct vs. Indirect Methods

Direct methods: any process employed to gather data that requires subjects to display their knowledge, behavior, or thought processes (when assessing learning).
- e.g., "Where on campus would you go, or whom would you consult, if you had questions about which courses to register for in the fall?"
- Methods: "quiz"-type survey, rubric, document analysis, observation, portfolio, visual methods, one-minute assessment, and/or case study

Indirect methods: any process employed to gather data that asks subjects to reflect upon their knowledge, behaviors, or thought processes (when assessing learning).
- e.g., "I know where to go on campus if I have questions about which courses to register for in the fall." (Strongly agree / Neither agree nor disagree / Strongly disagree)
- Methods: survey, focus group, document analysis, and/or one-minute assessment

Which direct or indirect methods are most consistent with what you want to know, your target population's needs, and the type of assessment you want to do?
Do you want to know about student progress toward institution- or program-level outcomes, OR to capture students' achievement at the end of their program of study?
Which types of assessment methods have you used? Which seem most appropriate for the programs and services within your office?
- Share out: What methods have you used in your departments? Were they successful? Why?
- Break into teams, select one person's assessment project, and evaluate it. What are some strengths? What are some areas to reconsider?
- Depending on time, keep rotating through projects.
Implementation
Highline Community College
Assessment Cycle
1. Needs Assessment: every 1-2 years (community colleges)
2. Cost-Effectiveness Assessment: real costs annually; perceived costs less frequently
3. Satisfaction Assessment: 1 year after each needs assessment
4. Outcomes Assessment: annually or biannually
(Schuh & Upcraft, 2001)

- Schuh and Upcraft (2001) suggest conducting assessments in the order above — 1) Needs Assessment, 2) Cost-Effectiveness, 3) Satisfaction Assessment, 4) Outcomes Assessment — and designing a cyclical plan that repeats to keep data fresh and current.
- Close consideration should be given to the mission and the target populations as changes arise (including the addition of new services, a new reporting structure, etc.).
- Cost-effectiveness assessments include both real and perceived components. Real: are costs equivalent to those of comparable programs elsewhere? Perceived: do students believe they are getting their money's worth?
Assessment Time: Post-test!
References
Creswell, J. W. (2009). Research design: Qualitative, quantitative, and mixed methods approaches. Los Angeles, CA: Sage.
Maki, P. (2010). Assessing for learning: Building a sustainable commitment across the institution. Sterling, VA: Stylus.
Schuh, J. H., & Upcraft, M. L. (2001). Assessment practice in student affairs: An applications manual. San Francisco, CA: Jossey-Bass.