Presentation transcript:

The Deming Wheel (the P-D-C-A cycle). From the graphic point of view, P.D.C.A. is represented by a moving circle called the Deming wheel. The movement stands for the dynamism and continuity of the application process. 11 http://www.iwolm.com/en/the-pdca-method-or-deming-wheel-for-your-improvement/

Theoretical Frameworks: Utilization-Focused Evaluation (Patton). Effectiveness is evaluated in terms of potential users: who might use the results, what will be used, and the intention to use them. 12

Evaluation Techniques

Logic Models: “A graphic representation of a program that describes the program’s essential components and expected accomplishments and conveys the logical relationship between these components and their outcomes.” (Conrad et al. 1999) Logic models guide evaluation. http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html 14

How the Logic Model Guides Evaluation
- Provides the program description
- Aids in matching the evaluation to the program
- Identifies what and when to measure: are you interested in process and/or outcomes?
- Keeps the focus on key, important information: what do we really need to know? Where should limited evaluation resources be used?
15

Basic Logic Model: INPUTS → OUTPUTS (Activities, Participants) → OUTCOMES. 16
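A minimal sketch of that pipeline as a plain data structure in Python. The stage names mirror the slide, the sample entries are drawn from the inputs/outputs/outcomes slides that follow, and "course attendees" is an illustrative assumption:

```python
# A minimal sketch of the basic logic model as a plain data structure.
# The stage names mirror the slide; "course attendees" is an assumption.
logic_model = {
    "inputs": ["time", "people", "money", "materials"],
    "outputs": {
        "activities": ["training/teaching", "develop resources"],
        "participants": ["course attendees"],
    },
    "outcomes": {
        "primary": ["knowledge", "skills", "abilities"],
        "secondary": ["improved practices"],
        "tertiary": ["standardization"],
    },
}

# Walking the stages left to right mirrors INPUTS -> OUTPUTS -> OUTCOMES.
for stage in ("inputs", "outputs", "outcomes"):
    print(stage, "->", logic_model[stage])
```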

Inputs are the investments/resources put into the program: time, people, money, materials, and products (teaching components). 17

Outputs
- Activities (what is done): training/teaching, products (teaching components), developing resources, forming collaborations/partnerships, assessment tools and assessment
- Participation (who receives): counts, descriptions, satisfaction
18

Outcomes
- Primary (immediate): knowledge, skills, abilities, changes in attitudes
- Secondary (short-term behavior results): teaching, practice, improved practices, assessments (tools and assessment)
- Tertiary (long-term behavior results): adaptation, standardization
19

When to evaluate?
- Before the experience/event/class
- Mini-assessments within it (quiz, exercise, etc.)
- Post-test only
- Pre-test and post-test
- Retrospective pretest and post-test
- Pre-test, post-test, and follow-up
- Intermediate testing, which can be combined with pre- and post-test designs
20
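For the pre-test/post-test designs above, a minimal sketch of the analysis in Python. The six scores are invented for illustration, and scipy is assumed to be installed:

```python
# A minimal sketch of analyzing a pre-test/post-test design with a
# paired t-test (invented scores; scipy assumed available).
from scipy import stats

pre = [52, 61, 48, 70, 55, 63]    # same learners before the class...
post = [66, 70, 59, 78, 62, 71]   # ...and after it

t, p = stats.ttest_rel(post, pre)             # paired comparison
gain = sum(b - a for a, b in zip(pre, post)) / len(pre)
print(f"mean gain = {gain:.1f} points, t = {t:.2f}, p = {p:.3f}")
```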

Evaluation Design
- Experimental: uses random assignment and a control group
- Quasi-experimental: compares groups formed by “natural” characteristics (e.g., males to females, Class A to Class B)
- Non-experimental: compares before and after
21
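A minimal sketch of the random assignment that defines the experimental design: shuffle, then split, so every participant has an equal chance of landing in either group. The participant IDs are placeholders:

```python
# A minimal sketch of random assignment for an experimental design.
import random

participants = ["P01", "P02", "P03", "P04", "P05", "P06", "P07", "P08"]
random.shuffle(participants)                    # equal chance for everyone
half = len(participants) // 2
groups = {"treatment": participants[:half], "control": participants[half:]}
print(groups)
```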

Types of Assessment
- Qualitative: open-ended
- Quantitative: constrained choice
- Mixed method: using both quantitative and qualitative techniques to collect data
Trade-offs among the timing designs:
- Post-test only: quick
- Retrospective pretest: both “pre” and “post” ratings are collected following the experience; simple, and gives an idea of “improvement” or change
- Pretest/posttest: the same instrument at two points in time; shows actual change, controls for prior knowledge, and gives better evidence of the effectiveness of the experience; however, it takes more time and doesn’t control for intervening events
- Pre, post, and follow-up: allows longer-term impacts to be seen
22
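A minimal sketch of scoring a retrospective pretest, where both ratings come from the same post-event form. The 1-to-5 self-ratings are invented for illustration:

```python
# A minimal sketch of a retrospective pretest: "before" and "after"
# self-ratings are both collected on the same post-event form.
retro_pre = [2, 3, 2, 1, 3]   # "looking back, where did you start?"
post = [4, 4, 3, 3, 5]        # "where are you now?"

changes = [b - a for a, b in zip(retro_pre, post)]
print(f"mean self-reported change: {sum(changes) / len(changes):.1f} points")
```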

Mixed Methods Evaluation: enhances both formative and summative evaluation, with triangulation coming from combining both types of evaluation. Three designs follow: merging, explanatory, and exploratory. 23

Mixed Methods Evaluation: Merging. Quantitative and qualitative data are collected in parallel and combined at interpretation (Quantitative + Qualitative → Interpretation). 24

Mixed Methods Evaluation: Explanatory. Quantitative first, then qualitative, followed by interpretation: the quantitative results identify issues that the qualitative phase explains (Quantitative → identify issues → Qualitative → Interpretation). 25

Mixed Methods Evaluation: Exploratory. Qualitative first, to design the measure, then quantitative, followed by interpretation (Qualitative → design measure → Quantitative → Interpretation). 26

Mixed Methods Evaluation: the three designs side by side. Merging: Quantitative + Qualitative → Interpretation. Explanatory: Quantitative → identify issues → Qualitative → Interpretation. Exploratory: Qualitative → design measure → Quantitative → Interpretation. 27
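To make the explanatory sequence concrete, a minimal sketch in Python: quantitative results flag low-scoring topics, which then seed the qualitative follow-up. The topic names, scores, and 0.6 cutoff are all illustrative assumptions:

```python
# A minimal sketch of the explanatory mixed-methods sequence:
# quantitative scores identify issues; qualitative questions probe them.
quant_scores = {"logic models": 0.85, "sampling": 0.55, "IRB process": 0.48}

flagged = [topic for topic, score in quant_scores.items() if score < 0.6]
interview_guide = [f"Tell me what was confusing about {t}." for t in flagged]

for question in interview_guide:
    print(question)   # the qualitative phase explains the quantitative findings
```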

Selecting the Evaluation Method
- Participation records
- Self-report
- Achievement (knowledge) tests
- Interviews
- Focus groups
- Direct observations
- Medical record reviews
- Product counts
28

Example 29

Example 30

Examples
1.1 The percent of adults aged 65 and older in the United States is currently approximately ________ and will increase to ________ by the middle of the century.
a. 10%, 25%
b. 12%, 20%
c. 12%, 15%
*d. 13%, 20%
e. 13%, 25%
1.2 Persons reaching age 65 have an average life expectancy of an additional _____ years.
a. 5
b. 9
c. 10
d. 15
*e. 19
(* marks the keyed answer.) 31
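A minimal sketch of scoring constrained-choice items like these against the keyed answers. The key follows the starred answers on the slide; the learner's responses are invented:

```python
# A minimal sketch of scoring a constrained-choice knowledge test.
answer_key = {"1.1": "d", "1.2": "e"}   # starred answers from the slide
responses = {"1.1": "d", "1.2": "c"}    # one learner's (invented) answers

correct = sum(responses[item] == key for item, key in answer_key.items())
print(f"score: {correct}/{len(answer_key)}")   # -> score: 1/2
```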

Examples 32

Example 33

Evaluation: Quality Improvement vs. Research. Institution-specific reporting to stakeholders (funders, collaborators) is not research. However, the UAB Institutional Review Board defines research as including any presentation in a local or national forum; for that, IRB approval is needed. 34

American Evaluation Association Founded 1986 http://www.eval.org/p/cm/ld/fid=1 35

Evaluation Strategies for Educational Interventions. GEC Faculty Scholars Program, November 1, 2013. Patricia Sawyer, PhD.