Public Health Learning Network


Public Health Learning Network Workforce Training Chartbook
February 2019
Prepared by: National Coordinating Center for Public Health Training
Common Metrics Reporting Year 2
Training Data: July 2017 – June 2018
Field Placement Data: All 2018 field placements ending by September 14, 2018

Table of Contents
Intro to the Common Metrics
Part 1: Characteristics of Training
  Graphic 1. Quick Training Statistics
  Graphic 2. Training Delivery Mode
  Chart 1. Core Competencies
Part 2: Common Metrics Data: Overall
  Chart 2. Common Metrics Question #1 – Improved Understanding
  Chart 3. Common Metrics Question #2 – Identified Actions
  Chart 4. Common Metrics Question #3 – Presented Clearly
  Chart 5. Common Metrics Question #4 – Satisfied with Training

Table of Contents (continued)
Part 3: Common Metrics Data: Delivery Mode
  Chart 6. Improved Understanding by Delivery Mode
  Chart 7. Identified Actions by Delivery Mode
  Chart 8. Presented Clearly by Delivery Mode
  Chart 9. Satisfied with Training by Delivery Mode
Part 4: Student Field Placements Data
  Chart 10. Learning Objectives Met
  Chart 11. Application
  Chart 12. Relevant to Career
  Chart 13. Working with Vulnerable Populations
  Chart 14. Preceptor

Table of Contents (continued)
Part 5: Comparative Analysis
  Chart 15. Common Metrics Question #1 – Improved Understanding (Overall)
  Chart 16. Common Metrics Question #2 – Identified Actions (Overall)
  Chart 17. Common Metrics Question #3 – Presented Clearly (Overall)
  Chart 18. Common Metrics Question #4 – Satisfied with Training (Overall)
Part 6: Key Findings
  Three-Year Comparison Data
  Hybrid Trainings
  Student Field Placements

Intro to the Common Metrics
Purpose: To collect data on key metrics from all 10 Regional Public Health Training Centers (RPHTCs) and to report national training and field experience evaluation data.
Expectations: All RPHTCs will incorporate the common metrics into their training and field experience evaluation tools beginning July 1, 2016.
Reporting Periods:
  Year 1: Data from 7/2016 – 12/2016
  Year 2: Data from 1/2017 – 6/2017
  Year 3: Data from 7/2017 – 6/2018
Note: Represents Common Metrics data from nine reporting regions. One region did not submit data.

Part 1: Characteristics of Training

Graphic 1. Quick Training Statistics
4,360 hours of training were offered across all regions (range of total training hours across regions: 97 – 1,385 hours).
70,586 training participants attended.*
1,081 unique training courses were offered.
*Note: This figure is not limited to unique individuals and may include participants who attended multiple trainings.
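These summary figures roll up each region's submission. A minimal sketch of that roll-up in Python, assuming hypothetical per-region totals (the chartbook publishes only the aggregates and the range, not the regional breakdown):

```python
# Sketch: rolling per-region training submissions up into the
# chartbook's summary statistics. Region names and per-region
# numbers are hypothetical placeholders, not published figures.
regions = {
    "Region A": {"hours": 1385, "participants": 9800, "courses": 140},
    "Region B": {"hours": 97, "participants": 2100, "courses": 55},
    # ... remaining reporting regions ...
}

total_hours = sum(r["hours"] for r in regions.values())                # reported: 4,360
total_participants = sum(r["participants"] for r in regions.values())  # reported: 70,586
total_courses = sum(r["courses"] for r in regions.values())            # reported: 1,081
hours_range = (min(r["hours"] for r in regions.values()),
               max(r["hours"] for r in regions.values()))              # reported: 97 - 1,385

print(total_hours, total_participants, total_courses, hours_range)
```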

Graphic 2. Training Delivery Mode
57% of all training hours were delivered in a Classroom-based format.
66% of participants attended trainings offered in a Self-paced Distance Learning format.
Archived Learning was the most common training format (502 trainings); 438 trainings took place in a Classroom-based setting.
n = 1,076

Chart 1. Core Competencies
The most frequently addressed core competency across all trainings was Community Dimensions of Practice. Only 3% of courses addressed Financial Planning and Management.
n = 1,074

Part 2: Common Metrics Aggregate Data

Chart 2. Question 1: Improved Understanding
Half of training attendees strongly agreed with the statement “My understanding of the subject improved as a result of participating in this training.”
n = 35,725

Chart 3. Question 2: Actions to Apply Learning
Nearly half of attendees strongly agreed with the statement “I have identified actions I could take to apply information learned in the training.”
n = 35,951

Chart 4. Question 3: Presentation Clarity
Over half of attendees strongly agreed with the statement “The training information was presented in a way I could clearly understand.”
n = 35,251

Chart 5. Question 4: Overall Satisfaction
Over half of attendees strongly agreed with the statement “I was satisfied with the training overall.”
n = 35,581
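Each chart's percentages are shares of the item's valid responses (the n value shown). A minimal sketch of that calculation, using hypothetical response counts (only the n denominators appear in the chartbook):

```python
# Sketch: percentage breakdown for one Common Metrics Likert item.
# Response counts are hypothetical placeholders.
responses = {
    "Strongly agree": 18_000,
    "Agree": 14_000,
    "Neutral": 2_500,
    "Disagree": 900,
    "Strongly disagree": 325,
}

n = sum(responses.values())  # corresponds to the "n =" shown under each chart
pct = {option: round(100 * count / n, 1) for option, count in responses.items()}
overall_agreement = pct["Strongly agree"] + pct["Agree"]  # the "agree/strongly agree" figure
print(n, pct, overall_agreement)
```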

Part 3: Common Metrics By Delivery Mode

Chart 6. Improved Understanding by Delivery Mode
Attendees in Hybrid trainings were the most likely to agree with the statement “My understanding of the subject improved as a result of participating in this training.”
n = 3,329; n = 4,133; n = 11,307; n = 16,956 (one n per delivery mode)

Chart 7. Identified Actions by Delivery Mode
Attendees in Hybrid trainings expressed the highest level of agreement with the statement “I have identified actions I will take to apply the information I learned.”
n = 3,311; n = 4,131; n = 11,251; n = 17,258 (one n per delivery mode)

Chart 8. Presentation Clarity by Delivery Mode
Attendees in Hybrid trainings expressed the highest level of agreement with the statement “The information was presented in a way I could clearly understand.”
n = 3,344; n = 4,139; n = 10,955; n = 16,813 (one n per delivery mode)

Chart 9. Overall Satisfaction by Delivery Mode
Attendees in Hybrid trainings were the most likely to agree with the statement “I was satisfied with this training overall.”
n = 3,368; n = 4,125; n = 11,368; n = 16,720 (one n per delivery mode)
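The by-mode charts repeat the overall calculation within each delivery mode, which is why each chart carries four separate n values. A minimal sketch, assuming respondent-level records tagged with a delivery mode (the rows and mode labels below are illustrative placeholders):

```python
import pandas as pd

# Sketch: agreement rate split by delivery mode, one row per
# survey response. All records below are fabricated placeholders.
df = pd.DataFrame({
    "mode": ["Hybrid", "Classroom-based", "Self-paced", "Classroom-based", "Hybrid"],
    "satisfied": ["Strongly agree", "Agree", "Neutral", "Agree", "Strongly agree"],
})

agree = df["satisfied"].isin(["Agree", "Strongly agree"])
by_mode = agree.groupby(df["mode"]).agg(n="count", pct_agree="mean")
by_mode["pct_agree"] = (100 * by_mode["pct_agree"]).round(1)
print(by_mode)  # the per-mode n values match those printed under each chart
```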

Part 4: Student Field Placements Data

Chart 10. Question 1: Learning Objectives
Over 90% of student field placement participants agreed with the statement “My learning objectives were met.”
n = 164

Chart 11. Question 2: Application
A strong majority (94%) of student field placement participants agreed with the statement “I identified actions to apply the information.”
n = 163

Chart 12. Question 3: Relevant to Career
94% of student field placement participants agreed with the statement “The information was relevant to my career.”
n = 164

Chart 13. Question 4: Vulnerable Populations
86% of student field placement participants agreed that the experience “increased interest in working with vulnerable populations.”
n = 164

Chart 14. Question 1: Preceptor
95% of student field placement preceptors agreed that “student learning objectives were met.”
n = 106

Part 5: Comparative Analysis

Chart 15. Three-Year Comparison Data: “My understanding of the subject improved as a result of participating in this training.”

Chart 16. Three-Year Comparison Data: “I have identified actions I will take to apply information I learned from this training in my work.”

Chart 17. Three-Year Comparison Data: “The information was presented in ways I could clearly understand.”

Chart 18. Three-Year Comparison Data: “I was satisfied with this training overall.”

Part 6: Key Findings

Key Findings: Three-Year Comparison (2016–2018)
Common Metric (CM) #1: Overall participant understanding (agree/strongly agree) was highest in Year 3 (2018), with a shift from “strongly agree” toward “agree” responses.
CM #2: The share of participants who identified actions to apply in the workplace remained the same across all three years (85% agree/strongly agree).

Key Findings: Three-Year Comparison (2016–2018), continued
CM #3: Overall participant clarity remained at 90% or above, with responses leaning strongly toward “strongly agree” over “agree” (a margin ranging from 11% to 20%).
CM #4: Overall participant satisfaction remained the same across all three years (89% agree/strongly agree), with a general shift from “strongly agree” toward “agree” responses.

Key Findings: Three-Year Comparison (2016–2018), continued
The most frequently addressed core competency across all trainings in all three years was Community Dimensions of Practice.
Total training participants increased substantially each year: from 29,525 to 54,983 to 70,586 in Year 3.
The total number of unique trainings also increased: from 532 to 779 to 1,081 in 2018.

Key Findings: Three-Year Comparison (2016–2018), continued
The total number of training hours in 2018 was nearly three times the 2016 total (4,360 versus 1,467).
Self-paced distance learning (archived learning) remains the most common training format across all years.
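The growth statements above are simple ratios of the reported annual totals, as this quick check shows (all figures are taken from this chartbook):

```python
# Growth in training volume, 2016 -> 2018, using totals
# reported elsewhere in this chartbook.
hours_2016, hours_2018 = 1_467, 4_360
participants = [29_525, 54_983, 70_586]  # Years 1-3
trainings = [532, 779, 1_081]            # unique trainings, Years 1-3

print(round(hours_2018 / hours_2016, 2))             # ~2.97x training hours
print(round(participants[-1] / participants[0], 2))  # ~2.39x participants
print(round(trainings[-1] / trainings[0], 2))        # ~2.03x unique trainings
```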

Key Findings: The Rise of Hybrid Trainings
In 2017, attendees of hybrid trainings were the most likely to agree with the statement “The information was presented in a way I could clearly understand,” a slight shift (2%) from 2016, when classroom-based trainings led on this item.
In 2017, hybrid training participants were also the most likely to agree with the statement “I was satisfied with this training overall.” In 2016, classroom-based and hybrid trainings were tied as the most likely to agree, so the 2017 result represents a slight decrease in classroom-based training outcome percentages.
In the most recent year (July 2017 – June 2018), training outcomes for Hybrid trainings were highest across all four Common Metrics, with more than 90% strongly agree/agree on all items.

Key Findings: Student Field Placements (all 2018 student field placements ending by September 2018)
Nearly all field placement participants felt their learning objectives for the placement were met. Similarly, most preceptors agreed that the student field placement learning objectives were met.

Funding Statement
This project is supported by the Health Resources and Services Administration (HRSA) of the U.S. Department of Health and Human Services (HHS) under grant number UB6HP27435.

For More Information: Evaluation Team
Brittany Bickford, MPH, Senior Research and Evaluation Analyst, National Coordinating Center for Public Health Training, National Network of Public Health Institutes, bbickford@nnphi.org
Jennifer Edwards, PhD, GCIS, Principal Research Scientist
Aaron Alford, PhD, MPH, PMP, Director, Research and Evaluation