Governor’s Prevention Initiative for Youth: Evaluation
Jane A. Ungemack, Dr.P.H., Evaluator
University of Connecticut Health Center

Evaluation
Systematic efforts to collect and use information:
– Document program implementation
– Describe target populations/participants
– Inform and improve program performance
– Assess program effectiveness
– Increase accountability
– Increase understanding

Governor’s Prevention Initiative for Youth
Long-term Goal: Reduce substance use among adolescents
Intermediate Goal: Reduce risk factors and increase protective factors for substance use in the individual, peer, family, school, and community domains
Target Group: year old youth

Evaluation Framework Overview

Evaluation Approach
Process evaluation: documents program implementation and activities
Outcome evaluation: assesses program effects or impacts

Capacity-Building for Evaluation
Science-based approach
Evaluation and assessment as integral parts of the program design
– Community-level
– Program-level

Evaluation Team Relationship with Grantees
Training and technical assistance
Instruments and administration protocols
Consultation and collaboration
Statewide coordination

Grantee Responsibilities
Develop a program plan based on the logic model
Specify measurable program objectives
Cooperate and collaborate with UConn Evaluation Team
Coordinate community survey
Collect and submit process data
Collect and submit outcome data
Commit time and effort to evaluation activities

Assessing Community-Level Outcomes: CSAP Requirements
School survey
Use of core substance use, risk, and protective factor measures

Community-Level Assessment: School Survey
Mandated for 7th-10th grade students
Representative sample (minimum n = 500; 125 per grade level)
Year 1 and Year 3
First administration: February-April 2000
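
As a rough illustration of how a grantee might draw the required grade-stratified sample, the sketch below selects up to 125 students per grade from a student roster. The roster file name, the column names ("student_id", "grade"), and the simple random draw within each grade are assumptions for demonstration only; the actual sampling protocol is supplied by the UConn Evaluation Team.

# Minimal sketch of a grade-stratified student sample meeting the survey's
# minimums (at least 125 students per grade, roughly 500 overall).
# File name and column names are illustrative assumptions.
import csv
import random
from collections import defaultdict

PER_GRADE_MIN = 125
GRADES = (7, 8, 9, 10)

def draw_sample(roster_path="roster.csv", seed=2000):
    random.seed(seed)  # fixed seed so the draw can be reproduced and documented
    by_grade = defaultdict(list)
    with open(roster_path, newline="") as f:
        for row in csv.DictReader(f):
            by_grade[int(row["grade"])].append(row["student_id"])

    sample = {}
    for grade in GRADES:
        students = by_grade[grade]
        if len(students) <= PER_GRADE_MIN:
            sample[grade] = students  # small grade: survey everyone
        else:
            sample[grade] = random.sample(students, PER_GRADE_MIN)
    return sample

if __name__ == "__main__":
    for grade, ids in draw_sample().items():
        print(f"grade {grade}: {len(ids)} students sampled")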

School Survey
Self-administered during a classroom period
Anonymous and confidential
Parental consents
Sampling, instrument, and administration protocols provided by UConn Evaluation Team

School Survey Measures
– Demographic characteristics
– Lifetime and current use of ATOD (alcohol, tobacco, and other drugs)
– Risk and protective factors
– Limited community-specific items

School Survey
Grantee responsibilities:
– Planning/coordination with UConn Evaluation Team
– Planning/coordination with school personnel
– Instrument duplication
– Data cleaning
– Data entry
– Analysis
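
For the data cleaning step, a grantee might run basic range checks on the entered survey file before submission. The sketch below is purely illustrative: the column names, value codings, and valid ranges are assumptions, not the actual codebook used by the UConn Evaluation Team.

# Illustrative range/completeness checks for an entered survey data file.
# Column names and valid ranges below are assumed for demonstration.
import csv

VALID_RANGES = {
    "grade": (7, 10),            # survey covers 7th-10th graders
    "age": (11, 19),
    "lifetime_alcohol": (0, 1),  # 0 = no, 1 = yes (assumed coding)
}

def find_problems(path):
    problems = []
    with open(path, newline="") as f:
        # start=2 so reported line numbers match the file (line 1 is the header)
        for line_no, row in enumerate(csv.DictReader(f), start=2):
            for field, (lo, hi) in VALID_RANGES.items():
                raw = row.get(field, "").strip()
                if raw == "":
                    problems.append((line_no, field, "missing value"))
                elif not raw.lstrip("-").isdigit() or not lo <= int(raw) <= hi:
                    problems.append((line_no, field, f"out of range: {raw}"))
    return problems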

Assessing Program-Level Outcomes: CSAP Requirements
Select a minimum of three programs (for each of three domains)
Measure program outcomes using core measures
Include a sufficient sample size for analysis
Collect pre- and post-test data

Program-Level Evaluation
Process evaluation: Document program implementation and activities
Outcome evaluation: Assess program outcomes

Program-Level Evaluation: Process Evaluation
Each program will be responsible for reporting:
– Prevention strategies
– Types of activities
– Dosage
– Number served
– Participant characteristics (age, gender, race/ethnicity, etc.)
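
One way to picture the reporting elements listed above is as a single activity record per program. The field names and types in this sketch are illustrative assumptions; the actual Minimum Data Set forms and definitions come from the UConn Evaluation Team.

# Sketch of a record mirroring the process-evaluation reporting elements:
# strategy, activity type, dosage, number served, participant characteristics.
from dataclasses import dataclass, field

@dataclass
class ActivityReport:
    program_name: str
    prevention_strategy: str           # e.g., "education", "alternatives"
    activity_type: str                 # e.g., "classroom curriculum session"
    sessions_delivered: int            # dosage: number of sessions
    session_length_minutes: int        # dosage: length of each session
    number_served: int
    participant_age_groups: dict = field(default_factory=dict)     # {"12-14": 28, ...}
    participant_gender: dict = field(default_factory=dict)         # {"female": 22, ...}
    participant_race_ethnicity: dict = field(default_factory=dict)

# Example record; all values are invented for illustration.
example = ActivityReport(
    program_name="After-School Mentoring",
    prevention_strategy="alternatives",
    activity_type="mentoring session",
    sessions_delivered=12,
    session_length_minutes=60,
    number_served=40,
    participant_age_groups={"12-14": 28, "15-17": 12},
    participant_gender={"female": 22, "male": 18},
    participant_race_ethnicity={"Hispanic/Latino": 15, "White": 14, "Black": 11},
)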

Process Evaluation
Minimum Data Set (MDS)
Instruments, protocols, and training provided by the UConn Evaluation Team

Program-Level Evaluation: Outcome Evaluation
Based on the logic model, identify measurable objectives that you will address in your program
Program objectives should be selected from one or more of the risk/protective factors included in the RFP list of Connecticut Intermediate Outcomes

Outcome Evaluation
UConn Evaluation Team staff will work with each grantee to finalize program-specific objectives and measures
All grantees will be asked to participate in a pre/post-test assessment of age-eligible participants as appropriate

Outcome Evaluation: Pre- and Post-Test Assessments
Youth participation will be voluntary
Confidential
Informed consents
Standardized instrument plus optional program-specific items
Minimum sample size = 50
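
To show what the analysis of matched pre- and post-test scores could look like, the sketch below runs a paired comparison on a risk-factor scale score, assuming each participant's two records are linked by a confidential study ID. The scores and the scale are invented for illustration; scipy.stats.ttest_rel performs the paired t-test.

# Minimal paired pre/post comparison on an (invented) risk-factor scale score.
from scipy import stats

# pre[i] and post[i] belong to the same participant (matched by study ID)
pre  = [2.8, 3.1, 2.5, 3.4, 2.9, 3.0, 2.7, 3.2, 2.6, 3.3]
post = [2.5, 3.0, 2.2, 3.1, 2.8, 2.6, 2.4, 3.0, 2.3, 3.1]

t_stat, p_value = stats.ttest_rel(pre, post)
mean_change = sum(b - a for a, b in zip(pre, post)) / len(pre)

print(f"mean change: {mean_change:+.2f}")
print(f"paired t = {t_stat:.2f}, p = {p_value:.3f}")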

Considerations for Estimating Evaluation Costs
Personnel (0.25 FTE minimum recommended)
Computer equipment
Photocopying
Office supplies
Data collection and cleaning
Data entry
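
As a back-of-the-envelope illustration of how these line items might be combined, the sketch below prices the 0.25 FTE personnel minimum against a hypothetical salary and fringe rate. Only the 0.25 FTE figure comes from the guidance above; every dollar amount is an assumed placeholder.

# Rough evaluation-budget arithmetic; all dollar figures are assumptions.
ANNUAL_SALARY = 40_000   # assumed annual salary for evaluation staff
FRINGE_RATE = 0.30       # assumed fringe benefit rate
FTE = 0.25               # minimum recommended evaluation effort

personnel = ANNUAL_SALARY * (1 + FRINGE_RATE) * FTE
other = {
    "computer equipment": 1_500,
    "photocopying": 600,
    "office supplies": 300,
    "data collection and cleaning": 2_000,
    "data entry": 1_200,
}  # all assumed amounts

total = personnel + sum(other.values())
print(f"personnel (0.25 FTE): ${personnel:,.0f}")
print(f"total evaluation estimate: ${total:,.0f}")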

Evaluation Themes for the Governor’s Prevention Initiative
Evaluation at all levels
Science-based
Capacity-building
Collaboration
Coordination
Sustainability