National Evaluation of the Addiction Technology Transfer Center (ATTC) Network: ATTC Event & Activity Reporting Database Review

Presentation transcript:

ATTC Network Evaluation 1 National Evaluation of the Addiction Technology Transfer Center (ATTC) Network: ATTC Event & Activity Reporting Database Review Roy M. Gabriel, Ph.D. Jeffrey R.W. KnudsenMargaret Gwaltney, M.B.A.Richard Finkbiner, Ph.D. RMC Research CorporationAbt Associates, Inc.MANILA Consulting Group Portland, ORBethesda, MDMcLean, VA Presentation to ATTC Directors November 5, 2009

Slide 2: Agenda

The ATTC Event & Activity Reporting Database:
 Review of past action steps
 The NO, the NET, and the ATTC Network
 Implementation to date
 Examination of current data: all events vs. GPRA only
 Role of these data in the overall evaluation
 Next steps?

Slide 3: Review: Where We've Been

Slide 4: ATTC Event & Activity Reporting Database and the NET

The ATTC Event & Activity Reporting Database was augmented to meet multiple reporting needs: the National Office/CSAT, NIDA, and the national evaluation. In coordination with the ATTC National Office, the NET developed a web-based query system to extract and summarize information from the database. Examples (see the sketch after this list):
 Query the entire Network, specific region(s), time periods, event types, etc.
 Tally the number of events, number of participants, etc., as ATTC "outputs" (both GPRA and non-GPRA).
 Cross-tab tallies with many other event characteristics (e.g., funding source, materials source, GPRA/non-GPRA status, collaborator type, delivery mode).
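To make these operations concrete, here is a minimal sketch in Python with pandas of the kind of extract-query-and-summarize workflow the slide describes. This is not the NET's actual query system; the file name, column names, and region label are hypothetical illustrations, not the real database schema.

```python
import pandas as pd

# Hypothetical flat extract of the Event & Activity Reporting Database with
# columns: region, event_date, event_type, gpra (bool), participants,
# funding_source, delivery_mode.
events = pd.read_csv("attc_event_extract.csv", parse_dates=["event_date"])

# Query a specific region and time period (first bullet above).
subset = events[
    (events["region"] == "Region A")
    & (events["event_date"].between("2009-01-01", "2009-06-30"))
]

# Tally outputs: event counts and participant totals, GPRA vs. non-GPRA
# (second bullet).
print(subset.groupby("gpra")["participants"].agg(["count", "sum"]))

# Cross-tab an event characteristic against GPRA status (third bullet).
print(pd.crosstab(subset["delivery_mode"], subset["gpra"], margins=True))
```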

Slide 5: The Underlying Premise

During the design phase of the ATTC Evaluation contract, ATTC Directors clearly indicated that, from a service provision perspective, GPRA data did not tell the entire story. Other very important services were being provided by ATTC Centers but were not being counted by the GPRA system.

Slide 6: The Output Is Only as Good as the Input: Data Quality

The NET presented a series of sample summary tables at the December 2008 ATTC Directors meeting. It was determined that assuring useful (and accurate) data queries and reports is contingent on the NET and the Network working together:
 The ATTC NO and the NET hosted a two-part webinar series in March 2009 reviewing (a) the purpose of the Event & Activity Reporting Database (from the perspectives of CSAT, NIDA, and the NET) and (b) data entry procedures.
 The ATTC Performance Monitoring Subcommittee worked to (a) re-disseminate activity/event definitions and (b) establish rules of thumb regarding which non-GPRA events should be entered into the system.
 The NO provided draft guidelines for review (April 2009).

Slide 7: NET Reporting Plans: On Hold

The NET prepared templates for quarterly summaries of events/activities for each regional ATTC.
 Reports were to feature regional activity data along with a Network-wide summary of the same information.
 The NET had hoped such summary reports would increase data utilization within the Network.
 Production and dissemination were contingent upon more consistent implementation of the data system; the NET does not want to issue reports if the numbers are misleading, skewed, or simply inaccurate.

Slide 8: Implementation of the Event & Activity Reporting Database, January 1–June 30, 2009

Slide 9: GPRA vs. Non-GPRA Events

  GPRA?              # of Events   % of Events   Valid % of Events
  (Not specified) a       29            4%             --
  Yes                    443           59%            62%
  No                     273           37%            38%
  Total b                745

a This variable is required only once an event has occurred; "not specified" responses likely represent event records that were not updated after the event took place.
b Unduplicated event count.
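As a quick check of the table's arithmetic: the "valid %" column simply recomputes the percentages over only the 716 events whose GPRA flag was specified. A small Python sketch, using the counts from the table:

```python
# Recompute the Slide 9 percentages; "valid %" excludes the 29 records
# whose GPRA flag was not specified (716 of 745 remain).
counts = {"Yes": 443, "No": 273, "Not specified": 29}
total = sum(counts.values())                # 745 unduplicated events
valid_total = counts["Yes"] + counts["No"]  # 716 events with a GPRA flag

for label, n in counts.items():
    pct = f"{100 * n / total:.0f}%"
    valid = f"{100 * n / valid_total:.0f}%" if label != "Not specified" else "--"
    print(f"{label:<14} {n:>4}  {pct:>4}  {valid:>4}")
# Yes             443   59%   62%
# No              273   37%   38%
# Not specified    29    4%    --
```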

Slide 10: Illustrative NET Interpretation

In a 6-month period, the Network conducted nearly 750 events and activities.
 An average of about 50 events/activities per region, with much variability in these numbers from region to region.
 Nearly 40% of these are not captured by the GPRA reporting system.
"Natural" follow-up questions:
 What kinds of events?
 Provided to whom? With whom?
 On what topics?

Slide 11: Variability in Entering Non-GPRA Events

Three Regional Centers account for 70% of all non-GPRA events entered into the database (see the sketch below).
 These three Regional Centers show a balance of roughly 25% GPRA to 75% non-GPRA events.
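A concentration figure like this could be computed from the same hypothetical flat extract used in the earlier sketch; again, the file and column names are illustrative, not the real schema.

```python
import pandas as pd

# Hypothetical extract with region and gpra (bool) columns, as before.
events = pd.read_csv("attc_event_extract.csv")
non_gpra = events[events["gpra"] == False]

# Each region's share of all non-GPRA entries, largest first.
share = non_gpra["region"].value_counts(normalize=True)
print(share.head(3).sum())  # share from the top 3 centers (~0.70 per the slide)

# GPRA vs. non-GPRA balance within those top 3 centers.
top3 = events["region"].isin(share.head(3).index)
print(events.loc[top3, "gpra"].value_counts(normalize=True))  # roughly 25%/75%
```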

Slide 12: NET Observations/Questions

Is it accurate that:
 3–4 regions conducted no non-GPRA events in these quarters?
 The majority of regions conducted fewer than 5 non-GPRA events in a quarter?
These figures contrast with the earlier slide indicating that, Network-wide, nearly 40% of the events conducted fall in the non-GPRA category. What is the true balance between GPRA and non-GPRA events in the Network: 60%/40%? 25%/75%? Something in between?

Slide 13: A Closer Look at the Data: Event Type (Year 2, Quarters 2–3)

Slide 14: Event Type (n = 745) [chart]

Slide 15: Event Type, GPRA Only (n = 443) [chart]

Slide 16: Event Type, Non-GPRA Only (n = 273) [chart]

Slide 17: Event Types for the 3 Centers Consistently Logging Both GPRA & Non-GPRA Events (n = 256) [chart]

Slide 18: A Closer Look at the Data: Collaborator Type (Year 2, Quarters 2–3)

Slide 19: Collaborator Type (n = 745 events) [chart]

Slide 20: Collaborator Type, GPRA Only (n = 443) [chart]

Slide 21: Collaborator Type, Non-GPRA Only (n = 273) [chart]

Slide 22: Collaborator Types for the 3 Centers Consistently Logging Both GPRA & Non-GPRA Events (n = 256 events) [chart]

Slide 23: Many More Examples Available

The previous slides were provided only as illustrative examples. Many other queries could also have been used to demonstrate the impact of non-GPRA data, for example:
 Primary funding source
 Tech transfer objective
 Delivery mode
 Target audience
 State-level service saturation

Slide 24: What This Means

The inclusion of non-GPRA data changes the overall service delivery "picture" for the Network.
 This is neither a revelation nor unexpected, but from the NET's perspective it is nonetheless very important to the Network.
The change in picture is even more apparent when looking at the few Centers that appear to be higher implementers (or reporters) of non-GPRA activities. However, given the variability in reporting, the NET can only confidently put forth the service delivery picture drawn from GPRA data.
 When 70% of non-GPRA entries come from 20% of the Network, the conditions for skew exist.

Slide 25: Importance of the Event & Activity Reporting Database in the National Evaluation

The NET has designed a comprehensive series of data collection activities to:
 Address and assess the multiple objectives of tech transfer in the Network.
 Obtain a variety of perspectives on the effectiveness and value of the ATTC Network.
Across all of these data collection activities, the database is the only one that can capture the sheer volume of work produced by the Network, and volume is valued by many key stakeholders.

Slide 26: Next Steps

The NET will run all developed queries on GPRA events entered into the ATTC Event & Activity Reporting Database to learn more about the planning processes, partners, funding, materials, objectives, etc. of ATTC-provided services. If the NET can determine that entry of non-GPRA events/activities has improved, the pool of events targeted for inquiry will be expanded.