MONITORING FOR DESCRIPTION
Performance Monitoring and Evaluation
College of Public and Community Service
University of Massachusetts at Boston
©2006 William Holmes

DESCRIPTIVE ISSUES: 1
What do you need to describe?
How do you want to describe?
What are the units of analysis for your description?

DESCRIPTIVE ISSUES: 2
What are the strengths and limitations of these procedures?
How may the descriptive procedures be improved or changed?

WHAT DO YOU NEED TO DESCRIBE? PART 1
Intake/Goal Status/Participant and Environmental Characteristics
Services/Interventions/Resources
Activities/Events/Processes
Output/Goal Status/Participant Characteristics

WHAT DO YOU NEED TO DESCRIBE? PART 2
Actions/Contacts/Tracking/Processing
Compliance/Accountability/Standards Adherence/Requirement Documenting
Quantitative Data Collection
Qualitative Data Collection
Knowledge Development
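The two "What do you need to describe?" lists above enumerate the elements a monitoring description should cover. A minimal sketch in Python of how those elements could be organized into a single per-participant monitoring record; all field names and values here are illustrative assumptions, not part of the original deck.

# Sketch of a per-participant monitoring record covering the descriptive
# elements listed above. Field names and values are illustrative only.
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

@dataclass
class MonitoringRecord:
    client_id: str
    intake_date: date
    intake_goal_status: str              # goal status at intake
    participant_characteristics: dict    # e.g. age, household size
    environmental_characteristics: dict  # e.g. neighborhood, housing type
    services_received: List[str] = field(default_factory=list)
    activities: List[str] = field(default_factory=list)        # events/processes
    contacts: int = 0                                           # actions/tracking
    compliance_notes: List[str] = field(default_factory=list)   # standards adherence
    output_goal_status: Optional[str] = None                    # goal status at output

record = MonitoringRecord(
    client_id="C-001",
    intake_date=date(2006, 9, 1),
    intake_goal_status="unemployed",
    participant_characteristics={"age": 34},
    environmental_characteristics={"housing": "rental"},
    services_received=["job training"],
    contacts=3,
    output_goal_status="employed part-time",
)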

HOW DO YOU WANT TO DESCRIBE?
Numbers and percentages
Graphic charts
Written narratives
Media presentations
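As a brief illustration of the first two options, numbers/percentages and a simple chart, here is a sketch assuming the descriptive data are a list of participant status codes; the status categories are hypothetical.

# Sketch: describing monitoring data with counts, percentages, and a bar chart.
from collections import Counter

statuses = ["employed", "employed", "in training", "unemployed", "employed"]
counts = Counter(statuses)
total = len(statuses)

for status, n in counts.most_common():
    print(f"{status}: {n} ({100 * n / total:.0f}%)")

# A simple text bar chart; a plotting library could be substituted here.
for status, n in counts.most_common():
    print(f"{status:<12} {'#' * n}")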

WHAT ARE THE UNITS OF ANALYSIS FOR DESCRIPTION?
Clients
Transactions
Events
Services/activities
Service workers
Programs
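The choice of unit matters because the same raw data yield different counts at different levels. A small sketch with a hypothetical service log, showing transaction-level, client-level, and program-level summaries; the field names and values are assumptions for illustration.

# Sketch: the same service log described at different units of analysis.
from collections import defaultdict

service_log = [
    {"client": "C-001", "program": "housing", "service": "intake interview"},
    {"client": "C-001", "program": "housing", "service": "case management"},
    {"client": "C-002", "program": "housing", "service": "intake interview"},
    {"client": "C-003", "program": "jobs",    "service": "job training"},
]

print("Transactions:", len(service_log))                        # unit = transaction
print("Clients:", len({row["client"] for row in service_log}))  # unit = client

by_program = defaultdict(int)                                    # unit = program
for row in service_log:
    by_program[row["program"]] += 1
print("Services per program:", dict(by_program))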

WHAT ARE THE STRENGTHS AND LIMITATIONS OF THIS DESCRIPTION? PART 1
Data related to logic model
Elements missing from logic model
Reliability, validity, completeness, and timeliness of data
Purposes served by existing data
Problems in data collection
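Completeness and timeliness can be checked mechanically once required fields and a reporting window are specified. A minimal sketch assuming dictionary-style records; the required fields and the 30-day window are assumptions for illustration, not program requirements.

# Sketch: simple completeness and timeliness checks on monitoring records.
from datetime import date, timedelta

REQUIRED_FIELDS = ["client_id", "intake_date", "goal_status"]
REPORTING_WINDOW = timedelta(days=30)

def completeness_problems(record: dict) -> list:
    """Return the required fields that are missing or empty."""
    return [f for f in REQUIRED_FIELDS if not record.get(f)]

def is_timely(record: dict, as_of: date) -> bool:
    """True if the record was entered within the reporting window."""
    entered = record.get("entry_date")
    return entered is not None and as_of - entered <= REPORTING_WINDOW

record = {"client_id": "C-001", "intake_date": date(2006, 9, 1),
          "goal_status": "", "entry_date": date(2006, 9, 15)}
print(completeness_problems(record))         # ['goal_status']
print(is_timely(record, date(2006, 10, 1)))  # True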

WHAT ARE THE STRENGTHS AND LIMITATIONS OF THIS DESCRIPTION? PART 2
Privacy protection issues
Duplicate records
Missing records
Multiple records
Matching records
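Duplicate, missing, and unmatched records can often be surfaced with simple set operations on client identifiers. A sketch under the assumption that intake and service files share a client ID; the IDs are hypothetical.

# Sketch: flagging duplicate records and matching intake records to service
# records by client ID.
from collections import Counter

intake_ids = ["C-001", "C-002", "C-002", "C-003"]   # C-002 entered twice
service_ids = {"C-001", "C-003", "C-004"}           # C-004 has no intake record

duplicates = [cid for cid, n in Counter(intake_ids).items() if n > 1]
unmatched_services = service_ids - set(intake_ids)   # services without an intake
missing_services = set(intake_ids) - service_ids     # intakes with no services

print("Duplicate intakes:", duplicates)                      # ['C-002']
print("Services lacking intake:", unmatched_services)        # {'C-004'}
print("Intakes with no service record:", missing_services)   # {'C-002'}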

WHAT ARE THE STRENGTHS AND LIMITATIONS OF THIS DESCRIPTION? PART 3
Service data linkage to outcomes
Moderating influences on outcomes
Resource limitations
Standards for comparison
Problems of goal displacement

DATA AND INFORMED CONSENT ISSUES
Information to be provided
Uses of information
Access to information
Client rights
Risks and benefits
Sunset provisions

QUANTITATIVE AND QUALITATIVE DESCRIPTIVE ISSUES: 1
Characteristics as measures
Standards for evaluating
Tables, charts, and narrative
Focus and consistency

QUANTITATIVE AND QUALITATIVE DESCRIPTIVE ISSUES: 2
Good examples
Deviant cases
Telling a story
Comparing and contrasting
Summarizing
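Deviant cases are often worth pulling out for closer qualitative review before summarizing. One simple, illustrative rule, flagging cases that exceed twice the group median, sketched in Python; the data and the cutoff are assumptions, not a standard from the deck.

# Sketch: flagging deviant cases (here, unusually long time to outcome)
# for qualitative review.
from statistics import median

days_to_outcome = {"C-001": 30, "C-002": 35, "C-003": 28, "C-004": 120, "C-005": 32}
cutoff = 2 * median(days_to_outcome.values())

deviant = {cid: d for cid, d in days_to_outcome.items() if d > cutoff}
print(deviant)   # {'C-004': 120}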

IMPROVEMENTS/CHANGES: PART 1
Logic model measures refined
Goal measures improved
Unneeded data dropped
New uses of existing data

IMPROVEMENTS/CHANGES: PART 2
Procedures for more complete data
Procedures for more timely data
Procedures for better presentation of data
Privacy and informed consent improvements

IMPROVEMENTS/CHANGES: PART 3
Tying services to outcomes
Linking risk factors and auxiliary services
Better time sequencing of data
Use of comparison standards
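Tying services to outcomes and judging them against a comparison standard can be as simple as joining two files on a client ID and computing a rate. A sketch with hypothetical data; the 60% benchmark is an assumed standard, not one taken from the deck.

# Sketch: linking service records to outcomes and comparing the resulting
# rate with an assumed comparison standard.
services = {"C-001": "job training", "C-002": "job training", "C-003": "counseling"}
outcomes = {"C-001": "employed", "C-002": "unemployed", "C-003": "employed"}

trained = [cid for cid, svc in services.items() if svc == "job training"]
success = sum(1 for cid in trained if outcomes.get(cid) == "employed")
rate = success / len(trained)

BENCHMARK = 0.60   # assumed comparison standard
print(f"Employment rate after job training: {rate:.0%} "
      f"({'meets' if rate >= BENCHMARK else 'below'} the {BENCHMARK:.0%} standard)")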

DESCRIPTIVE EXERCISE: HALLOWEEN PARTY
Goals
Value assumptions
Measures
When and how data are collected, and by whom
How the information is used
Limitations