Presentation on theme: "Designing your Guided Pathways Evaluation:"— Presentation transcript:

1 Designing your Guided Pathways Evaluation:
A whirlwind intro to evaluation rubrics
Dawn Coleman, EdS
GPI Retreat, June 27, 2018

2 What Evaluation Is and Is Not…
Evaluation is more than a listing of activities.
Evaluation is more than setting goals.
Evaluation is more than tracking indicators.
Evaluation is about answering important, big-picture questions with well-reasoned answers based on a convincing mix of evidence. (Jane Davidson, Actionable Evaluation Basics)
Speaker note: Emphasize that they should be setting goals and tracking activities and indicators, but that is part of the evaluation process, not the sum total of evaluation; it provides the evidence. Sometimes you can't see the forest for the trees if you focus solely on indicators and measures.

3 “Evaluation refers to the process of determining the merit, worth, and significance of something”
Michael Scriven
Speaker note: The "official" classic definition of evaluation.

4 This may be easier to understand as quality, value, and importance
Quality: Is this a "good" program? (reaching the target population, achieving its intended outcomes)
Value: Is this a good program for this specific situation? (worth the resources, better than the alternatives, meeting our needs)
Importance: Is the program good enough? (solving a problem worth solving, with meaningful, not necessarily statistically significant, results)
Speaker note: Davidson (and even Scriven) now often use the language of quality, value, and importance.

5 What skills do you need to evaluate your program?
A genuine interest in improving your program, not just "proving" that it works.
The ability to think honestly and critically about whether your evidence is good enough to say that your program works (and what "works" even means).
Humility and a growth mindset (because you're basically evaluating yourself and there's always room for improvement).

6 “The rigor is not in the methodology. The rigor is in the thinking.”
Michael Quinn Patton
Speaker note: It's not about RCTs or a strict adherence to methods (the quant-versus-qual wars).

7 Evaluative Reasoning and Evaluative Questions
Remember Bloom's Taxonomy? (Revised: Creating, Evaluating, Analyzing, Applying, Understanding, Remembering)
Evaluating: critically assess information to make judgements and justify decisions.
Speaker note: This is the revised Bloom's; the original had evaluation at the top with synthesis below. Most in higher ed are somewhat familiar with Bloom's. Point out that evaluating involves higher-order thinking than just analyzing data; it's about synthesizing evidence to make judgements and decisions.

8 Evaluative Reasoning and Evaluative Questions
Evaluative reasoning is used to synthesize all of our evidence to arrive at answers to evaluative questions.
Not evaluative: How many students participated in program X?
Evaluative: How well did we reach the students we intended to reach?
Not evaluative: Was the program implemented with fidelity (as intended)?
Evaluative: How well was the program implemented?
Not evaluative: Did the program reach its target objectives?
Evaluative: To what extent did the program achieve its objectives?
Not evaluative: How large is the achievement gap?
Evaluative: To what extent did the program address the achievement gap?
Speaker note: I'm using evaluative reasoning and evaluative thinking interchangeably, understanding that a few scholars have a much more specific definition for evaluative thinking (but in general we use them interchangeably in program evaluation).

9 Evaluative Reasoning and Evaluative Questions
If you can answer your question with yes/no or a single number, it's probably not evaluative.
Focus on 4-7 high-level evaluative questions.
High-level evaluative questions can then be unpacked into more specific questions, which will drive your collection of data.

10 Evaluative Reasoning and Evaluative Questions
We have a tendency to let our data drive our questions instead of asking the questions we care about and then seeing what data we have to answer those questions (collecting additional data when needed).
The data are not the answers; they are a tool for answering our important questions.
Speaker note: Emphasize that they should use data they already have or collect regularly if it fits their questions, but they may need to collect additional data, and they shouldn't use data just because they have it. Often the data we collect, particularly for federal reporting requirements, do not really answer our questions, and in the case of IPEDS were never intended for institution-level use.

11 Evaluative Questions: Option 1
Focus on the key components of Guided Pathways: 4 questions based on the 4 essential practices on the SOAA, plus 1-3 big-picture questions of particular relevance to your college (equity, resource management, institutional barriers, etc.).
Examples:
To what extent has the practice of mapping pathways to student end goals improved the experience for students at our institution?
To what extent has implementing Guided Pathways improved resource management at our institution?
Extra question(s): Equity? Removing institutional barriers? Cultural responsiveness? Resource use/efficiency?

12 Evaluative Questions: Option 2
Focus on broad questions related to need, implementation, outcomes, resources, and sustainability.
Examples:
How well does GP address the needs of our students?
How well designed and implemented is GP at our institution?
How valuable are the outcomes for our students (looking at specific populations of interest)?
Overall, to what extent is GP worth the resources used (not just in terms of money but also time and energy)?
How sustainable is GP and its impact?

13 Evaluation Rubrics
Evaluation rubrics can help us synthesize our data to arrive at answers to our questions.
If you are an instructor, you may have used rubrics for grading. Here's a quick example (for a writing course).
Speaker note: Basic example of a grading rubric.

14 Evaluation Rubrics
Writing standard: Clear and coherent writing
D-F paper: Writing is difficult to understand; lacks organization, facts, examples, etc. Facts and examples show little relevance to the assignment.
C paper: Overall understandable. Not well organized. Few supporting examples or facts. Critical thinking is demonstrated at a beginning level in only one or two sections.
B paper: Understandable. Thorough explanation using relevant supporting facts, examples, etc. Critical thinking is demonstrated at an intermediate level in each segment.
A paper: Thoroughly developed topic, using relevant facts, examples, quotes, etc. Evidence of critical thinking demonstrated throughout the assignment.
Speaker note: This is just the first row of the example rubric, shown as an example of a grading rubric.

15 Evaluation Rubrics
Instead of writing standards with descriptive criteria for each rating, we have big-picture evaluative questions.
Criteria can be specific to the question or more generic and consistent across all evaluative questions.
Recommendation for Guided Pathways:
Criteria that are more generic and consistent (simpler)
Five rating categories (corresponding with grading) plus an option for insufficient evidence

16 Evaluation Rubrics: Example Criteria
Rating and descriptive criteria:
A: Most or all practices are implemented at scale, are sustainable, target all students, are of very high quality, use appropriate resources, and produce very meaningful results.
B: Most or all practices are in the process of being scaled, target most students, are of high quality, generally use appropriate resources, and produce positive results.
C: Plans to scale most or all practices; practices are of average quality; reach could be improved; may be expensive; results are limited.
D: Practices are implemented inconsistently and/or are of low quality, reach, and/or impact.
F: No plans to implement practices, or they are implemented with very low quality, reach, and/or impact.
IE: Insufficient evidence; evidence is unavailable or of poor quality, so performance cannot be assessed and the question cannot be answered.
Speaker note: Emphasize that they need to collaboratively arrive at the criteria. This is an example only!

17 Evaluation Rubrics
An "A" rating doesn't indicate perfection, but it should be exemplary.
Things to consider: What constitutes high quality and value? What is your evidence? Is that enough evidence to convince a skeptic (not a die-hard opponent, but a reasonable skeptic)?
Rubrics should be created and completed as a team to help you come to a shared understanding of what is important, what constitutes quality and value, and what evidence is appropriate to demonstrate this to others.
Complete rubrics at regular intervals (at least twice a year) to track progress.

18 It can be hard to see the forest for the trees when you focus on data alone.
Rubrics provide a framework for collecting and organizing your data to better help you answer your overarching evaluative questions.

19 Any questions? And then I have some questions for you…

20 Question 1: What, if anything, are you currently doing to evaluate Guided Pathways at your institution?
Question 2: What are the barriers to conducting an evaluation at your institution?
Question 3: What evaluation-related assistance would be helpful to you?

21 Questions?

