Know how a data management project can help:
 Improve program design
 Demonstrate effectiveness
 Highlight the best work being done
 Compete for funding, and
 Mobilize public support.

Learn about:
 Identifying expected outcomes, and
 Defining objectives and incremental indicators of success consistent with your mission.

 Understand the elements and stages of a data management planning process
 Be familiar with common barriers and costs associated with data management

 Learn how quality data can influence and inform the strategic planning process.
 Explore options for tracking and using data efficiently at reasonable cost.

 City of Pittsfield neighborhood evaluation
◦ Combining & analyzing data from multiple sources
 DIAL/SELF (Greenfield, MA) Transitional Living Program housing outcomes
◦ Sorting and interpreting data from a single collection source
(Let's visit the source tables in Excel, then come back to PowerPoint to review the graphs)
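The Pittsfield example combines data from multiple sources. A minimal sketch of that kind of merge, keyed on a shared participant ID; all IDs, field names, and values here are invented for illustration:

```python
# Hypothetical records from two separate collection sources,
# joined on a shared participant ID. All data is invented.

intake = {
    "P001": {"neighborhood": "Westside", "enrolled": "2014-01-15"},
    "P002": {"neighborhood": "Morningside", "enrolled": "2014-02-03"},
}

housing = {
    "P001": {"housed_at_exit": True},
    "P002": {"housed_at_exit": False},
}

# Combine: one merged record per participant present in both sources.
combined = {
    pid: {**intake[pid], **housing[pid]}
    for pid in intake.keys() & housing.keys()
}
```

The same join can be done with Excel lookups or database queries; the point is that a shared identifier across sources is what makes combined analysis possible.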

 Bring key people together at each stage of the planning process
 Participation by administration, program directors, and supervisors is critical in the early stages, but direct care staff can be helpful too (ask questions!)

 Initial planning stages require a deep understanding of the resources (funding, technology/equipment, and staff time) required to plan, implement, and maintain a data management project and a data-driven culture.

 It may be worthwhile to invest in a consultant, or to devote substantial administrative time, to produce useful estimates of the time and cost involved in implementing and maintaining a data-driven culture.

 Understand the purpose of your project: what will this data do for your organization?
 Identify data priorities
 Plan to start small and efficiently; you can grow as you learn and achieve. Look for the intersection of what data you can easily obtain and what you would want to know in an ideal world! (go to flip chart)

 As you move into more detailed planning, direct care staff input becomes extremely important.
 Involve staff in a formal way and carefully assess what support they will need to succeed!
 Design formal systems for Training, Support, and Accountability

 Reports/data you already need for funders
 Identify information for internal evaluation and improvement (even if it isn't currently required by funders)
 Develop a functional draft of outcomes, objectives, and indicators (your dataset) prior to shopping for a database or building a data collection system
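A "functional draft" of your dataset can be sketched as plain structured data before any database is chosen. This is a minimal illustration; the outcome, objective, and indicator text is invented, not taken from any real program:

```python
# A draft dataset: each outcome breaks into objectives, and each
# objective carries measurable indicators with a tracking method.
# All text below is invented for illustration.

dataset_draft = [
    {
        "outcome": "Participants obtain stable housing",
        "objectives": [
            {
                "objective": "Participant completes a housing application",
                "indicators": [
                    {"indicator": "Application submitted", "tracking": "daily log"},
                    {"indicator": "Follow-up appointment kept", "tracking": "daily log"},
                ],
            },
        ],
    },
]

# A quick shape check before shopping for a database:
indicator_count = sum(
    len(obj["indicators"])
    for outcome in dataset_draft
    for obj in outcome["objectives"]
)
```

Drafting the structure first makes it much easier to judge whether a candidate database or spreadsheet can actually hold and report on what you plan to track.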

A Brief Summary

OObjectives = desired participant changes or achievements IIndicators = measurable events OOutcomes = level of achievement

Basic Logic Model:
Inputs (resources) → Outputs (actions) → Outcomes (achievements)

ImProve Outcomes℠ Model:
Inputs (resources) → Outputs (actions) → Indicators (events) → Objectives (expectations) → Outcomes (achievements)
*Identify tracking method

 Extension of logic models
 Based on incremental change
 Means of prioritizing information
 Method of categorizing information

Levels of Learning Mastery
◦ Knowledge/Comprehension (learn about it)
◦ Application (use it, try it out)
◦ Synthesis (integrate with other knowledge)

 Specific
 Measurable
 Achievable
 Relevant
 Timely

 Use active verbs to describe indicators
 Look for achievement opportunities at levels that are relevant to the services, time frame, or intervention level of your program
 Indicators reflect participant capacity for positive change and choices that indicate forward movement

 Web/Cloud-Based
◦ Requires reliable, high-speed internet connection(s)
◦ Each user has own license; can access from anywhere
◦ Easy to monitor data entry
◦ Evaluate capability and cost of compilation, sorting, and reporting
◦ Carefully evaluate ownership of data and "worst-case scenarios" (e.g., what happens if you or the provider goes out of business?)
 PC-Based
◦ You own the software and the data on your computer
◦ Speed depends on the speed of the machine
◦ May require additional software to run the database
◦ Can be difficult to synchronize data from multiple sources
◦ Ease of data retrieval depends largely on initial design and software used

1. Surveys
 Useful for capturing information from participants
 You have to ask the right question(s); that takes planning and some experimentation to gather aggregatable data
 Results can be compiled in Excel, but consider using Survey Monkey, where you can get reports and export to Excel
2. Microsoft Access
 Good for demographic data and for tracking objective and indicator completion, i.e., data that changes or needs to be cross-referenced
 Inexpensive, but requires expertise to develop functional applications
 Easy to retrieve data through queries
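Access retrieves data through SQL queries built in its query designer. As a rough illustration of the same idea, here is a sketch using Python's built-in sqlite3 module; the table, columns, and participant rows are made up:

```python
# Illustrative query-based retrieval, in the spirit of an Access
# database. Table and data are invented; sqlite3 stands in for Access.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE participants (id TEXT, age INTEGER, program TEXT)")
con.executemany(
    "INSERT INTO participants VALUES (?, ?, ?)",
    [("P001", 17, "TLP"), ("P002", 19, "TLP"), ("P003", 18, "Drop-In")],
)

# A cross-referencing question: how many participants per program?
rows = con.execute(
    "SELECT program, COUNT(*) FROM participants GROUP BY program ORDER BY program"
).fetchall()
```

The value of a query-capable tool is exactly this: once demographic and tracking data is entered, counts and cross-tabulations come from a query rather than manual recounting.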

3. Daily Logs (paper or software)
◦ Most useful if data is aggregated and entered into a database or spreadsheet regularly (daily, weekly, or monthly)
◦ Like surveys, the right questions have to be asked to get useful, accessible information
◦ With proper planning, can be used to track a variety of participant achievements
4. Exit Interviews
◦ Build some of the questions to have aggregatable answers (e.g., multiple choice, "name at least one xxx", etc.)
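Rolling daily log entries up for regular entry into a spreadsheet can be as simple as counting by week. A minimal sketch, with invented log entries and achievement labels:

```python
# Aggregate daily log entries into weekly counts per achievement,
# as you might before entering them into a spreadsheet.
# The entries below are invented for illustration.
from collections import Counter
from datetime import date

log_entries = [
    (date(2014, 3, 3), "attended life-skills class"),
    (date(2014, 3, 4), "attended life-skills class"),
    (date(2014, 3, 5), "kept medical appointment"),
]

# Key each entry by (ISO week number, achievement) and count.
weekly_counts = Counter(
    (d.isocalendar()[1], achievement) for d, achievement in log_entries
)
```

Whether the rollup happens in Python, Excel, or a database, the design point is the same: log entries only become usable data once they are aggregated on a regular schedule.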

Please Contact: Doug Tanner

Please Contact: Cindy Carraway-Wilson