From the graphic point of view, P.D.C.A.


1 The P.D.C.A. Cycle: The Deming Wheel
From the graphic point of view, P.D.C.A. is represented by a moving circle called the Deming wheel. The movement stands for the dynamism and continuity of the application process.

2 Theoretical Frameworks: Utilization-Focused Evaluation (Patton)
Evaluation of effectiveness by potential users:
Who might use the evaluation
What will be used
Intention to use

3 Evaluation Techniques

4 Logic Models
"A graphic representation of a program that describes the program's essential components and expected accomplishments and conveys the logical relationship between these components and their outcomes." (Conrad et al., 1999)
Logic models guide evaluation.

5 How the Logic Model Guides Evaluation
Provides the program description
Aids in matching the evaluation to the program
Identifies what and when to measure: are you interested in process and/or outcomes?
Keeps focus on key information: what do we really need to know? Where should limited evaluation resources be used?

6 Basic Logic Model
INPUTS → OUTPUTS (Activities, Participants) → OUTCOMES

7 Inputs
Investment/Resources: Time, People, Money, Materials, Products (teaching components)

8 Outputs
Activities (what is done): Training/teaching, Products (teaching components), Develop resources, Form collaborations/partnerships, Assessment tools and assessment
Participation (who receives): Counts, Description, Satisfaction

9 Outcomes
Primary (immediate): Knowledge, Skills, Abilities, Changes in attitudes
Secondary (short-term behavior results): Teaching, Practice, Improved practices, Assessments (tools and assessment)
Tertiary (long-term behavior results): Adaptation, Standardization

10 When to Evaluate?
Before the experience/event/class
Mini-assessments within (quiz, exercise, etc.)
Post-test only
Pre-test and post-test
Retrospective pre-test and post-test
Pre-test, post-test, and follow-up
Intermediate testing (can be combined with pre- and post-test designs)

11 Evaluation Design
Experimental: uses random assignment and a control group
Quasi-experimental: groups formed from "natural" characteristics (e.g., males vs. females, Class A vs. Class B)
Non-experimental: compares before and after

12 Types of Assessment
Qualitative: open-ended
Quantitative: constrained choice
Mixed method: uses both quantitative and qualitative techniques to collect data
Design trade-offs:
Post-test only: quick
Retrospective pre-test: both pre and post collected following the experience; simple, gives an idea of improvement or change
Pre-test/post-test: same instrument at two points; shows actual change, controls for prior knowledge, and gives better evidence of the effectiveness of the experience; however, it takes more time and doesn't control for intervening events
Pre-test, post-test, and follow-up: allows seeing longer-term impacts
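The "actual change" that a pre-test/post-test design measures can be sketched numerically. This is a minimal Python illustration, not from the slides; the score values and variable names are made up for the example:

```python
# Hypothetical pre-test/post-test scoring: the same instrument is
# administered to the same learners before and after the session,
# so per-learner change controls for prior knowledge.
pre_scores = [55, 60, 48, 72, 65]   # hypothetical pretest percentages
post_scores = [70, 68, 62, 80, 75]  # same learners, after the session

# Pair each learner's scores and compute the change.
changes = [post - pre for pre, post in zip(pre_scores, post_scores)]
mean_change = sum(changes) / len(changes)

print(f"Per-learner change: {changes}")
print(f"Mean change: {mean_change:.1f} points")
```

A retrospective pre-test would collect both columns in one sitting after the experience (simpler, but self-reported), while a post-test-only design would have only the second column and no change score at all.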

13 Mixed Methods Evaluation
Enhances both formative and summative evaluation
Triangulation from combining both types of evaluation
Three designs: Merging, Explanatory, Exploratory

14 Mixed Methods Evaluation: Merging
Quantitative and qualitative data are merged and interpreted together:
Quantitative + Qualitative → Interpretation

15 Mixed Methods Evaluation: Explanatory
Quantitative first, then qualitative, followed by interpretation:
Quantitative → identify issues → Qualitative → Interpretation

16 Mixed Methods Evaluation: Exploratory
Qualitative first to design the measure, then quantitative, followed by interpretation:
Qualitative → design measure → Quantitative → Interpretation

17 Mixed Methods Evaluation: Summary
Triangulation from combining both types of evaluation:
Merging: Quantitative + Qualitative → Interpretation
Explanatory: Quantitative → identify issues → Qualitative → Interpretation
Exploratory: Qualitative → design measure → Quantitative → Interpretation

18 Selecting the Evaluation Method
Participation records
Self-report
Achievement (knowledge) tests
Interviews
Focus groups
Direct observations
Medical record reviews
Product count

19 Example

20 Example

21 Examples
1.1 The percent of adults aged 65 and older in the United States is currently approximately ________ and will increase to ________ by the middle of the century.
a. 10%, ____%
b. 12%, ____%
c. 12%, ____%
*d. 13%, ____%
e. 13%, ____%
1.2 Persons reaching age 65 have an average life expectancy of an additional _____ years.
a. 5
b. 9
c. 10
d. 15
*e. 19

22 Examples

23 Example

24 Evaluation: Quality Improvement vs. Research
Quality improvement is institution-specific; reporting to stakeholders (funders, collaborators) is not research
The UAB Institutional Review Board defines research as including any presentation in a local or national forum; IRB approval is needed

25 American Evaluation Association
Founded 1986

26 Evaluation Strategies for Educational Interventions
GEC Faculty Scholars Program, November 1, 2013
Patricia Sawyer, PhD

