EH&S Measures and Metrics That Matter

EH&S Measures and Metrics That Matter
Robert Emery, DrPH, CHP, CIH, CSP, RBP, CHMM, CPP, ARM
Assistant Vice President for Safety, Health, Environment and Risk Management, The University of Texas Health Science Center at Houston
Associate Professor of Occupational Health, The University of Texas School of Public Health

Colleges and Universities as Work Settings
Unique places of work due to the potential for simultaneous exposure to all four hazard types: physical, chemical, radiological, and biological. They also serve a diverse "population at risk": students, faculty, staff, visitors, and "others."

Training Gap
There are over 4,500 colleges and universities in the US. Interestingly, none of the EH&S professionals who serve them were formally trained in how universities operate. This lack of understanding results in considerable frustration and confusion; enhanced understanding can improve services and support.

Course Objectives
To articulate the EH&S needs of an institution, we must first understand its characteristics. To accomplish this, we need some basic descriptive institutional data. Once assembled, we can begin to ask some probing questions, such as…

Basic Questions
How big is your campus? How is size measured? What measures are important (e.g., which resonate with resource providers)?
What risks are present? How are these risks managed? Are these risks real or hypothetical? How might you determine that? How does management determine that?

Basic Questions
How many EH&S staff are there? Are others involved with safety aspects? In your opinion, are you over- or understaffed? How would you know? How would others know?
How are you performing? How is your EH&S program's performance measured? In your opinion, are these measures true indicators of performance? What do the clients served really think of your program?

Basic Questions
Within the context of your institution's mission, is your EH&S program viewed as hindering or helping? Is this measured? Is other feedback garnered?
Do clients feel there are real (or perceived) duplications of effort in the EH&S program? What does EH&S do that really irritates clients?

Basic Questions
The age-old question for our profession is: "How many EH&S staff should I have?" Perhaps an equally important question is: what can the college and university EH&S profession realistically hope to obtain from a benchmarking exercise involving staffing metrics? What level of precision can we really expect? At best, we can likely achieve only a reasonable estimate of "industry averages," such as the number of EH&S FTEs for an institution exhibiting certain characteristics.

Sampling of Possible Staffing Predictors and Influencing Factors
Quantifiable: institution size, number of labs, age, level of funding, population, geographic location, deferred maintenance, public/private status, medical/vet schools, disjunct campus
Non-quantifiable: regulatory history, level of regulatory scrutiny, leadership's tolerance of risk, level of administrative arrogance, level of trust/faith in the program, ability of the EH&S program to articulate its needs

Desirable Characteristics of Predictors for Benchmarking
Consistently quantifiable; uniformly defined by a recognized authority; easily obtained; meaningful and relevant to decision makers (provides necessary context).
Consider something as simple as the definition of "number of EH&S staff."

Suggested Definition
"EH&S staff": the technical, managerial, and directorial staff who support the EH&S function.
Including administrative staff is suggested, but it probably doesn't make a big difference.
Staff outside the EH&S unit can be included, but they must devote half time or more to the institutional safety function (at least 0.5 FTE). Examples: a safety person in facilities; student workers (at least 0.5 FTE).
Contractors are included only if their on-site time is half time or more (at least 0.5 FTE). Examples: contract lab survey technicians, yes if at least 0.5 FTE; fire detection testing contractors, likely no.
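The inclusion rule above can be sketched as a simple counting function. This is a hypothetical helper, not part of the presentation; the personnel records and role labels are assumptions for illustration only.

```python
# Hypothetical sketch of the suggested "EH&S staff" counting rule:
# core EH&S staff always count; staff in other units and contractors
# count only when they devote at least 0.5 FTE to the safety function.

def count_ehs_fte(personnel):
    """Sum FTE for people meeting the suggested inclusion criteria.

    `personnel` is a list of dicts with keys:
      role: 'ehs', 'other_unit', or 'contractor'
      fte:  fraction of full time devoted to the safety function
    """
    total = 0.0
    for p in personnel:
        if p["role"] == "ehs":
            total += p["fte"]      # core EH&S staff always count
        elif p["fte"] >= 0.5:      # others need >= 0.5 FTE on safety work
            total += p["fte"]
    return total

staff = [
    {"role": "ehs", "fte": 1.0},          # EH&S technician
    {"role": "other_unit", "fte": 0.5},   # safety person in facilities
    {"role": "contractor", "fte": 0.25},  # fire detection tester: excluded
]
print(count_ehs_fte(staff))  # -> 1.5
```

Making the rule explicit like this is the point of the slide: a benchmarking survey only works if every institution applies the same inclusion threshold.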

Preliminary Results Based on Roundtable Input
Findings indicated that total NASF and lab NASF are the most favorable (statistically significant) and pragmatic predictors. On a two-dimensional graph we can show only two parameters, but the relationship between square footage and staffing is clear…

Predictability of Various Models (based on n = 69)

Model                               R-squared value (%)
Total campus sq ft                  47.69
Lab + non-lab sq ft                 50.46
ln (total campus sq ft)             64.90
ln (lab) + ln (non-lab sq ft)       71.10
Med/vet school                      78.19
General "others" category           78.41
BSL3 or impending BSL4              80.05
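The pattern in the table, that log-transformed square footage predicts staffing better than raw square footage, can be illustrated with a small simulation. This sketch uses synthetic data, not the roundtable dataset, and an assumed log-linear staffing process chosen only to make the comparison visible.

```python
# Illustrative sketch (synthetic data, NOT the roundtable dataset):
# compare R-squared for a raw-scale fit versus a log-log fit when
# staffing scales roughly log-linearly with square footage.

import math
import random

random.seed(1)

def r_squared(x, y):
    """R-squared of a one-predictor least-squares line y ~ a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sxy / sxx
    a = my - b * mx
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

# Assumed data-generating process: ln(FTE) linear in ln(sq ft) plus noise
sqft = [random.uniform(5e5, 1e7) for _ in range(69)]
fte = [math.exp(0.4 * math.log(s) - 2.0 + random.gauss(0, 0.15)) for s in sqft]

r2_raw = r_squared(sqft, fte)                                  # raw scale
r2_log = r_squared([math.log(s) for s in sqft],
                   [math.log(f) for f in fte])                 # log-log scale
print(round(r2_raw, 2), round(r2_log, 2))
```

With this data-generating process the log-log fit typically scores higher, mirroring the jump from 50.46 to 71.10 in the table when the square-footage predictors are log-transformed.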

Current Metrics Model
# EH&S FTE = e^[(0.516 × School) + (0.357 × ln(Lab NASF)) + (0.398 × ln(Nonlab NASF)) + (0.371 × BSL) − 8.618]
R-squared value based on 69 observations = 80%
Definitions for predictor variables:
Lab NASF: lab net assignable square footage
Nonlab NASF: non-lab net assignable square footage (usually obtained by subtracting lab NASF from gross)
School: whether your institution has a medical school listed by the AAMC or a veterinary school listed by the AAVMC; 0 means no, 1 means yes
BSL: whether the institution has a BSL3 or BSL4 facility; 0 means no, 1 means yes
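The model above can be computed directly. This is a sketch using the coefficients as presented; the example campus values are hypothetical.

```python
# Sketch of the slide's staffing model, coefficients as presented.

import math

def predicted_ehs_fte(school, lab_nasf, nonlab_nasf, bsl):
    """Predicted number of EH&S FTE.

    school: 1 if an AAMC medical school or AAVMC vet school, else 0
    lab_nasf, nonlab_nasf: net assignable square footage
    bsl: 1 if a BSL3 or BSL4 facility exists, else 0
    """
    return math.exp(0.516 * school
                    + 0.357 * math.log(lab_nasf)
                    + 0.398 * math.log(nonlab_nasf)
                    + 0.371 * bsl
                    - 8.618)

# Hypothetical campus: med school, 500k lab NASF, 3M non-lab NASF, BSL3 present
print(round(predicted_ehs_fte(1, 5e5, 3e6, 1), 1))  # -> about 18 FTE
```

Because the predictors enter through logarithms, predicted staffing grows sublinearly with square footage, which matches the intuition that doubling a campus does not double its EH&S workload.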

Summary
Data from 69 institutions across the country indicate that four variables can account for 80% of the variability in EH&S staffing: non-lab net assignable square footage, lab net assignable square footage, presence of a medical or veterinary school, and existence of BSL3 operations. These predictors are important because they are recognized and understood by those outside the EH&S profession. With the collection of more data, the precision of the model could likely be improved, to the benefit of the entire profession.

Epilogue
Note that even a predicted staffing number gives no indication of staff proficiency or efficiency. So what should EH&S know? And what should they measure to display what they do?