Management-Oriented Evaluation …evaluation for decision-makers. Jing Wang and Faye Jones.

Presentation transcript:

Management-Oriented Evaluation …evaluation for decision-makers. Jing Wang and Faye Jones.

Theoretical Basis—Systems Theory
Systems theorists: Burns and Stalker (1972), Azumi and Hage (1972), Lincoln (1985), Gharajedaghi (1985), Morgan (1986).
Systems theorists in education: Henry Barnard, Horace Mann, William Harris, Carleton Washburne.
Mechanical/linear constructions of the world versus organic/systems constructions.
Closed versus open systems.
The role of the environment.
References: Patton, 2002; Fitzpatrick, Sanders, & Worthen, 2004; Scott, 2003.

Theoretical Basis—Systems Theory. Reference: Scott, 2003.

Defining and conceptualizing a system
A system is a whole that is both greater than and different from its parts.
The effective management of a system requires managing the interactions of its parts, not the actions of its parts taken separately (Gharajedaghi and Ackoff, 1985).
Describe that system—volunteers!
Reference: Patton, 2002.

Management-Oriented Evaluation
The primary focus of management-oriented evaluation is to serve the decision-maker(s). The needs of the decision-makers guide the direction of the evaluation.

Stufflebeam’s CIPP
CIPP serves decision-makers facing four types of decisions:
–Context Evaluation—planning decisions
–Input Evaluation—structuring decisions
–Process Evaluation—implementation decisions
–Product Evaluation—recycling decisions
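To make the structure of the model easier to see, here is a minimal Python sketch (illustrative only, not Stufflebeam's own notation) that records each CIPP component, the type of decision it serves, and a guiding question paraphrased from the slides that follow; the names CIPP_MODEL and decision_served are hypothetical.

```python
# Illustrative sketch only: the four CIPP components as a plain mapping.
# Names and question wording are paraphrased from these slides, not quoted from Stufflebeam.
CIPP_MODEL = {
    "context": {
        "decision_type": "planning",
        "guiding_question": "What needs should the program address?",
    },
    "input": {
        "decision_type": "structuring",
        "guiding_question": "What plan and resources have the best potential?",
    },
    "process": {
        "decision_type": "implementation",
        "guiding_question": "How well is the plan being carried out?",
    },
    "product": {
        "decision_type": "recycling",
        "guiding_question": "What results were obtained, and what should happen next?",
    },
}

def decision_served(evaluation_type: str) -> str:
    """Return the kind of decision a given CIPP evaluation informs."""
    return CIPP_MODEL[evaluation_type]["decision_type"]

print(decision_served("process"))  # -> "implementation"
```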

Context Evaluation
Objective: To define the context, identify the target population, assess needs, diagnose problems.
Questions to ask: What are the needs to be addressed? Who is the target population? Are there any existing problems?
Method: System analysis, survey, document review, interviews, hearings, tests, Delphi technique.
Relation to decision-making: Decide on setting, goals, and planning; a basis for judging outcomes.

Input Evaluation
Objective: To assess system capability, alternative strategies, procedural designs, budgets, and schedules.
Questions to ask: What resources are available? What plan has the best potential? What alternatives should be considered?
Method: Inventory, literature review, visits to other programs, advocate teams, and pilot trials.
Relation to decision-making: Select sources of support, solutions, and procedural designs; a basis for judging implementation.

Process Evaluation
Objective: To identify/predict defects in design or implementation; to record and judge procedural events and activities.
Questions to ask: How well is the plan being implemented? What barriers threaten its success? What revisions are needed?
Method: Monitoring, describing the process, interacting, observing.
Relation to decision-making: Effecting process control; saving information for future use in interpreting outcomes.

Product Evaluation
Objective: To judge outcomes; relate them to objectives, context, inputs, and process; interpret worth.
Questions to ask: What results are obtained? Were the goals met? What should be done with the program after it has run its course?
Method: Defining criteria, stakeholders’ judgments, qualitative and quantitative analyses.
Relation to decision-making: Deciding to continue, terminate, modify, or refocus the program; presenting a record of its effects.

Stufflebeam’s Evaluation Steps
–Focusing the evaluation
–Collection of information
–Organization of information
–Analysis of information
–Reporting of information
–Administration of the evaluation
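As a loose illustration of how a team might track these six steps in order (our assumption, not part of Stufflebeam's model), the sketch below treats them as an ordered checklist; the EvaluationPlan class and its method names are hypothetical.

```python
# Hypothetical sketch: treating Stufflebeam's evaluation steps as an ordered checklist.
from dataclasses import dataclass, field
from typing import List, Optional

STEPS = [
    "Focusing the evaluation",
    "Collection of information",
    "Organization of information",
    "Analysis of information",
    "Reporting of information",
    "Administration of the evaluation",
]

@dataclass
class EvaluationPlan:
    program: str
    completed: List[str] = field(default_factory=list)

    def complete(self, step: str) -> None:
        """Mark a named step as done."""
        if step not in STEPS:
            raise ValueError(f"Unknown step: {step}")
        self.completed.append(step)

    def next_step(self) -> Optional[str]:
        """Return the first step not yet completed, in the prescribed order."""
        for step in STEPS:
            if step not in self.completed:
                return step
        return None

plan = EvaluationPlan(program="an example program")
plan.complete("Focusing the evaluation")
print(plan.next_step())  # -> "Collection of information"
```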

Alkin’s UCLA Model
Types of evaluation:
–Systems assessment (context)
–Program planning (input)
–Program implementation
–Program improvement (process)
–Program certification (product)
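The parentheticals above already give the correspondence between Alkin's evaluation types and CIPP; the small sketch below simply restates that mapping as a lookup table. The dictionary name is hypothetical, and "program implementation" is left unmapped because the slide gives no CIPP counterpart for it.

```python
# Restating the slide's correspondences between Alkin's UCLA evaluation types
# and CIPP components; "program implementation" has no parenthetical in the
# slide, so it is left unmapped rather than guessed.
UCLA_TO_CIPP = {
    "systems assessment": "context",
    "program planning": "input",
    "program implementation": None,
    "program improvement": "process",
    "program certification": "product",
}

for ucla_type, cipp_component in UCLA_TO_CIPP.items():
    print(f"{ucla_type:25} -> {cipp_component}")
```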

Strengths
–Focuses on the informational needs and pending decisions of decision-makers
–Systematic and comprehensive
–Provides a wide variety of information
–Stresses the importance of the utility of information
–Evaluation happens throughout the program’s life
–Provides timely feedback and improvement
–CIPP is a heuristic tool that helps generate important questions to be answered
–Easy to explain

Weaknesses
–Narrow focus: inability to respond to issues that clash with the concerns of decision-makers; indecisive leaders are unlikely to benefit
–Possibly unfair or undemocratic evaluation
–May be expensive and complex
–Unwarranted assumptions: that important decisions can be correctly identified up front, and that the decision-making process is orderly and predictable

Application of CIPP
Evaluation framework for nursing education programs: application of the CIPP model (Singh, 2004).
Critical success factors:
–Create an evaluation matrix (see the sketch below)
–Form a program evaluation committee including representatives from all partners
–Determine who will conduct the evaluation: internal or external
–Ensure the evaluators understand and adhere to the program evaluation standards
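Singh's (2004) evaluation matrix itself is not reproduced in the slides, so the sketch below shows only a hypothetical example of what such a matrix might contain: rows pairing a CIPP stage with an evaluation question, data sources, and a responsible party, plus a quick completeness check. None of the entries come from Singh's article.

```python
# Hypothetical example rows for a CIPP-style evaluation matrix (not Singh's actual matrix).
evaluation_matrix = [
    {
        "cipp_stage": "context",
        "evaluation_question": "What learning needs do incoming students have?",
        "data_sources": ["needs survey", "admission records"],
        "responsible": "program evaluation committee",
    },
    {
        "cipp_stage": "process",
        "evaluation_question": "Is the curriculum being delivered as planned?",
        "data_sources": ["classroom observation", "faculty interviews"],
        "responsible": "external evaluator",
    },
]

# Quick completeness check: which CIPP stages still lack a row in the matrix?
covered = {row["cipp_stage"] for row in evaluation_matrix}
missing = sorted({"context", "input", "process", "product"} - covered)
print("Stages still needing matrix rows:", missing)
```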

It’s Time for… Management-Oriented Evaluation Trivia.
–Split up into two groups.
–Each group selects a team captain.
–The group with the most money wins.

References
Fitzpatrick, J. L., Sanders, J. R., & Worthen, B. R. (2004). Program evaluation: Alternative approaches and practical guidelines. White Plains, NY: Longman.
Patton, M. Q. (2002). Qualitative evaluation and research methods (3rd ed.). Newbury Park, CA: Sage.
Scott, W. R. (2003). Organizations: Rational, natural, and open systems (5th ed.). Upper Saddle River, NJ: Prentice Hall.
Singh, M. D. (2004). Evaluation framework for nursing education programs: Application of the CIPP model. International Journal of Nursing Education Scholarship, 1(1).