What is Evaluation? David Dwayne Williams Brigham Young University

Evaluation, Assessment, Measurement, and Research
- Evaluation usually includes describing what is and what should be, then judging or comparing the two, as in a balance.
- Measurement is an essential tool for gathering information about what is. John Brown example.
- Assessments involve using measurement processes regularly for established purposes.
- Research involves measuring what is, then seeking to understand and explain, not to judge.

Vocational Rehabilitation Examples
- Utah Statewide ASSESSMENT of the Rehabilitation Needs of Individuals with Disabilities: Final Report.
- Michael Leahy's presentation yesterday on a synergistic program evaluation MODEL PARTNERSHIP.
- From the announcement of this conference: "consumer satisfaction studies, surveys, case file reviews, comprehensive needs assessments, economic impact studies, and use of other quality assurance measures."
- Program Evaluation and Justification Review of the Rehabilitation Program Administered by the Department of Labor and Employment Security, Report No , July 1998.

Vocational Rehabilitation Evaluation
- Has a long history in the literature (I found resources from the 1970s forward).
- However, as in many fields, evaluations may turn out to be assessments, measures, or research rather than full evaluations.
- Let's look at what the field of evaluation says about evaluation, and then we can decide whether Vocational Rehabilitation might gain from what it offers.

An Evaluation Framework based on Ideas From Several Theorists
- Alkin, 2004
- Fetterman, 2001
- Guba & Lincoln, 1989
- Patton, 2002, 2008
- Stake, 2004
- Stufflebeam, 2001, 2007
- Weiss, 1998
- Fitzpatrick, Sanders, & Worthen, 2003

Evaluation Framework Overview
- Background information
- Audience and stakeholders
- Evaluand information
- Stakeholder concerns
- Judging criteria
- Questions to answer
- Data collection processes
- Data analysis
- Reporting strategies: results and recommendations
- Resource valuation: budget and schedule
- Self-critique using meta-evaluation: meeting requirements for utility, feasibility, propriety, and accuracy
- Evaluation checklists, including the Program Evaluations Meta-Evaluation Checklist

Context for understanding an Evaluation
- What does the literature associated with the evaluand say are the key issues?
- How did this evaluand come to be of interest to you?
- What is your background that is relevant to this evaluation?
- What evaluation has been done on this evaluand already?
- Is the evaluand evaluable at present?
- Why is an evaluation appropriate now?
- What approaches to evaluation were considered, which will be used, and why?

Possible VR Context Questions
- What does the literature about Vocational Rehabilitation say ought to be included in a study?
- How did this program, this counseling technique, or this client come to be of interest to you?
- What perspectives are you taking on this evaluation because of your particular background? What might you be missing because of that?
- What alternative views do you need to insist on including, besides your own?
- What evaluation have you or others already done on this evaluand? What has been learned from previous evaluations?

Who are the stakeholders who care? Why?
- Who asked for the evaluation and why?
- Who stands to benefit from the evaluation and how?
- Who is served by the evaluand, or should be?
- Who is likely to use the evaluation results to do something helpful?
- Who does not usually have a voice in matters associated with the evaluand but has a stake in it?

VR evaluators might ask about stakeholders:
- Who else besides me cares about this treatment, these resources, or this program?
- Have any of them asked for an evaluation? If so, why? If not, why not?
- Why do I and these other people care about this program?
- What do we stand to lose or gain by what happens with this program?
- Who else is served by this program, or should be, and therefore should have an interest in its evaluation?
- Are the administrators, other counselors, family members, employers, or others likely to use evaluation results to do something different?

What is the evaluand or “thing” the stakeholders care about?
- What do you already know about the evaluand?
  - What or who it is
  - What its or their objectives are
  - How it works or what they are doing
- What more do you need to learn to refine the description and definition of the evaluand so you can focus your evaluation on it or them?

What are the “things” or people VR evaluators might evaluate?
- One key evaluand may be the counselors themselves,
- Or the curriculum or program they are using,
- Or a particular technique they are piloting,
- Or their clients' current performance, employment, concerns, and associated needs for improvement,
- Or the relationships among several components of a program,
- Or a test used to ascertain growth in client performance.

What criteria do stakeholders have for judging the evaluand?
- What values do the stakeholders manifest regarding the evaluand?
- What do they think the evaluand should be accomplishing (criteria for success)?
- What standards do they have, or how completely do they hope the evaluand will meet the criteria?
- How will they know when the evaluand is successful to their satisfaction?

VR evaluators might ask these criteria questions:
- What do we and other stakeholders value that should guide our evaluation efforts?
- What should clients who participate in this program activity be able to do when they finish?
- How well should clients perform on the selected criteria if the program is going to be considered successful?
- What should counselors be doing, and at what level of performance, to help clients be successful?

What questions do stakeholders want to answer?
- Based on the previous points, what evaluation questions should be asked?
- Based on a rating or ranking of all possible questions raised, which are the highest priorities?
- Which questions will this study address, and why?

VR evaluators might ask these questions to match the criteria:
- How are clients performing compared to the ideal?
- Is there a need for an intervention change?
- How well was the program implemented?
- How many of the clients performed at or above 80% on the job placement test? (See the sketch below.)
- How well did this counselor do in preparing their clients to apply for a job?
- How well are we evaluating our interventions in terms of implementation and outcome?
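As a minimal illustration of answering a criterion question like the 80% job placement item above, the following Python sketch compares "what is" (observed scores) with "what should be" (a cut score and a program-level standard). The score data and the 70% program-level standard are assumptions invented for illustration; only the 80% cut score comes from the question on this slide.

```python
# Hypothetical sketch: compare observed client performance ("what is")
# to a stakeholder criterion and standard ("what should be").
# The scores and the 70% program-level standard are invented for
# illustration; only the 80% cut score comes from the slide's question.

placement_test_scores = [92, 78, 85, 66, 88, 81, 74, 95, 83, 70]  # assumed data

CUT_SCORE = 80           # criterion from the slide's question
PROGRAM_STANDARD = 0.70  # assumed standard: 70% of clients should reach the cut score

clients_meeting_criterion = sum(score >= CUT_SCORE for score in placement_test_scores)
proportion = clients_meeting_criterion / len(placement_test_scores)

print(f"{clients_meeting_criterion} of {len(placement_test_scores)} clients "
      f"({proportion:.0%}) scored at or above {CUT_SCORE}.")

if proportion >= PROGRAM_STANDARD:
    print("Observed performance meets the stakeholders' standard.")
else:
    print("Observed performance falls short of the stakeholders' standard.")
```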

What processes will be used to collect and analyze data to answer the questions and compare the evaluand to its criteria?
- For each question listed earlier, what information will be collected and analyzed?
- Using what data collection procedures?
- By whom and when?
- How will each procedure be refined to ensure validity, reliability, credibility, trustworthiness, etc.?

VR evaluators may collect and analyze quantitative or qualitative data by:
- Drawing upon formal measures developed by others or creating their own tests and performance activities,
- Conducting informal interviews and observations,
- Engaging clients in dialogues and digitally recording them for analysis by the clients or others,
- Analyzing these and other data both qualitatively and quantitatively,
- Comparing these descriptions of "what is" to the criteria and standards identified earlier.

What reporting & recommendation strategies will be used?
- What interim reports will be given, to whom, and when?
- What final reports will be given, to whom, and when?
- How will the reports be organized, and around what points?
- Will there be oral reports? Written reports? Other formats?
- How will results be organized and displayed?
- What are the results, or what results are anticipated?
- Where will recommendations come from?
- Will you be qualified to make recommendations, and why?
- What recommendations are there, who should implement them, and how?

VR evaluators report results and recommendations through the use of:
- Informal oral reports for their own program evaluations,
- Interim reports to share with others,
- Formal written reports with charts and tables,
- Reports on study progress and stakeholder involvement,
- Implications for future evaluation activities,
- Evaluative judgments about the quality of evaluands,
- Realistic recommendations, developed through processes that involve the stakeholders who will implement the recommendations.

Metaevaluation of Evaluation Plans, Implementation, & Outcomes
- Encourages high-quality evaluations
- Can be done internally or externally
- Could involve Standards established by the Joint Committee of thoughtful professionals
- Seeks to enhance evaluation quality in terms of:
  - Utility
  - Feasibility
  - Propriety
  - Accuracy
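To make the four standard categories above concrete, here is a minimal, hypothetical sketch of how an internal metaevaluation rating might be recorded and summarized in Python. The example items and the 1-5 rating scale are illustrative assumptions, not the official Joint Committee standards or the Program Evaluations Meta-Evaluation Checklist items.

```python
# Hypothetical sketch: record simple internal metaevaluation ratings keyed to
# the four Joint Committee standard categories, then summarize each category.
# Item names and the 1-5 scale are illustrative assumptions only.

ratings = {
    "Utility":     {"Stakeholders identified": 4, "Report clarity": 3},
    "Feasibility": {"Practical procedures": 5, "Cost effectiveness": 4},
    "Propriety":   {"Rights of participants respected": 5, "Conflicts of interest disclosed": 4},
    "Accuracy":    {"Valid and reliable information": 3, "Justified conclusions": 4},
}

for category, items in ratings.items():
    average = sum(items.values()) / len(items)
    print(f"{category}: average rating {average:.1f} out of 5")
    for item, score in items.items():
        print(f"  - {item}: {score}")
```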

VR evaluators should meta-evaluate to enhance quality:
- When anticipating conducting an evaluation,
- While conducting one, and/or
- While reviewing evaluations performed by themselves or others;
- Using Joint Committee standards to help clarify what they want to evaluate,
- Using the Standards to judge how well they are evaluating,
- By clarifying whom they are serving with their evaluation, and how they value the results of their evaluation efforts.

Implications for VR Participants
Use Measurement and Assessment in a broader Evaluation context to enhance VR programs by:
- Attending to context, background, and literature,
- Serving the values and interests of all stakeholders,
- Involving stakeholders in clarifying the evaluand, criteria, and standards they care most about,
- Targeting stakeholders' questions with a variety of data collection and analysis methods that use high-quality measures to assess how well "what is" matches "what should be" for the stakeholders,
- Sharing realistic, useful results and recommendations with stakeholders in forms they can use.

Come Learn More This Afternoon at a Workshop. We will:
- Review the evaluation framework presented here,
- Discuss the premise that measurement and assessment are means for doing evaluation and research,
- Discuss and write down current practices and questions about evaluating your work activities,
- Develop plans for applying these ideas to your practice,
- Share emerging plans with other participants for feedback,
- Receive guidance and feedback from the presenter,
- Accept the challenge to apply this plan at home and to contact the presenter with questions or for further guidance if wanted.

For more information or questions, contact:
David Williams
150 G MCKB, Brigham Young University
Provo, UT, USA