1 Performance Management Presentation (7) Manage and Administer Worksite Enrichment Program
Lead: Timothy Tosten
Members: Charly Abrew, Joy Postell, John Crawford, Mary Ellen Savarese, Christopher Gaines, David Shea, Carole Harman, Kiana Timmons, Pamela Jenkins
Office of Research Services, National Institutes of Health
January 14, 2005

2 Table of Contents
PM Template ……… 3
Customer Perspective ……… 7
Internal Business Process Perspective ……… 34
Learning and Growth Perspective ……… 45
Financial Perspective ……… 50
Conclusions and Recommendations ……… 58

3

4 PMP Template (Page 2)

5 Relationship Among Performance Objectives

6 High Impact Objectives

7 Customer Perspective

8

9 Survey Background: Purpose
Use customer feedback to improve the work/life programs offered to the NIH community:
- Child Care
- Fitness Centers / Wellness Programs
- Food Services (dining centers, coffee and snack bars, concession stands, vending machines)
- Interpreting Services
- Retail Programs (employee stores, banking services)
Assess importance of services offered, and their contribution to quality of work life.
Assess awareness of availability of the NIH liaison (DoES staff) for work/life programs, and assess satisfaction with interactions with DoES staff.
Solicit comments on ideas for additional services and/or potential improvements to NIH employees’ quality of work life.

10 Survey Background: Methodology
Original web link for the survey was made available for testing on Thursday, July 15, 2004.
Testing completed and survey e-mailed to the NIH Staff-DC Area distribution list on Friday, August 27, 2004.
Survey initially became “unresponsive” when respondents entered comments that were over 255 characters or included a “special character”; the issue was resolved late Monday, August 30, 2004. During this time, over 1,500 surveys were received despite the comment restriction.
Reminder sent on Thursday, September 9, 2004. Respondents immediately began to experience a “database error”. On the same day, the server on which the survey was hosted was down for a few hours due to a network problem (presumably unrelated to the database error); the issue was resolved later the same day. During this time, no surveys were received.
Survey closed on Monday, September 20, 2004.
Survey analysis completed on Monday, October 18, 2004.
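The “unresponsive survey” symptom described on this slide is typical of a fixed-width (255-character) comment column combined with unescaped input. As a minimal, purely illustrative sketch (the limit and the filtering rule are assumptions, not the survey’s actual code or schema), defensive handling of such a field might look like:

```python
# Illustrative guard against the two failure modes described on the slide:
# a comment field limited to 255 characters and "special characters" that
# broke the survey backend. The constant and filtering rule are assumed
# for illustration only.
COMMENT_LIMIT = 255

def sanitize_comment(text: str) -> str:
    """Drop non-printable characters, then truncate to the column limit."""
    cleaned = "".join(ch for ch in text if ch.isprintable())
    return cleaned[:COMMENT_LIMIT]

print(len(sanitize_comment("x" * 300)))  # 255
```

Validating length and character set before the database write would have degraded long comments gracefully instead of hanging the survey.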

11 Survey Background: Distribution
Number of surveys distributed: 26,731*
Number of respondents: 4,889
Response rate: 18%
*Census population estimates obtained from DFP effective May 15, 2004. Census number includes NIH employees, contractors, postdocs, visiting fellows, and volunteers on campus and in the local Bethesda area, but does not include students.
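The 18% figure follows directly from the two counts on this slide; a quick arithmetic check (Python used purely for illustration):

```python
# Response-rate arithmetic, using the figures reported on the slide.
distributed = 26_731   # surveys distributed (census estimate)
respondents = 4_889    # completed responses

response_rate = respondents / distributed * 100
print(f"Response rate: {response_rate:.1f}%")  # 18.3%, reported as 18%
```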

12 FY04 Importance Ratings
[Chart: mean importance rating per service, on a scale from Very Unimportant to Very Important; group sizes range from N = 227 to N = 3,414.]
Note: ** Groups significantly different (p < .01)

13 FY04 Quality of Work Life Ratings
N = 4,362; Mean = 4.11; Median = 4
“The work site enrichment programs (e.g., dining centers, concession stands, credit union, gift stores, child care, fitness centers) add to the quality of my work life here at NIH.” (Scale: Strongly Disagree to Strongly Agree)
Note: 527 respondents skipped this question

14 FY04 Quality of Work Life Ratings
Did you know you have an NIH liaison for all of these programs? (N = 4,392)
[Chart: Yes 15%, No 85%.]
Note: 497 respondents skipped this question

15 FY04 Quality of Work Life Ratings
Do you know how to contact an NIH liaison for these programs if you need to? (N = 4,392)
[Chart: Yes 11%, No 89%.]
Note: 497 respondents skipped this question

16 FY04 Quality of Work Life Ratings
Have you contacted someone from the Division of Employee Services about any of the programs discussed in this survey during the past 6 months? (N = 4,382)
[Chart: Yes 5%, No 92%, other responses 3%.]
Note: 507 respondents skipped this question

17 FY04 Satisfaction Ratings: Interaction with Division of Employee Services Staff
[Chart: mean satisfaction ratings on four dimensions; N = 217 to 222 per item.]
Note: Only answered by those who have contacted someone from the Division of Employee Services during the past six months.

18 Customer Survey Summary: Importance of Services
Respondents were asked to rate the importance of services on a 10-point scale ranging from (1) Very Unimportant to (10) Very Important.
Respondents who have used a service within the past 6 months versus those who have not:
There are significant differences between respondents who have used the services recently and those who have not. In all cases, respondents who have used the services recently find them significantly more important than those who have not.
For those who have used the service, importance ratings range from a high of 9.04 (Fitness Center) and 9.01 (Child Care Services) to a low of 6.74 (Gift Store) and 7.16 (Vending Machines). All are above the midpoint of the scale.
For those who have not used the service, importance ratings range from a high of 5.61 (Fitness Center) and 5.16 (Dining Services) to a low of 4.08 (Vending Machines) and 4.13 (Gift Stores).
The fitness center is rated highest in importance by both groups, while vending machines and gift stores are rated least important, regardless of use.

19 Customer Survey Summary (cont.): Importance of Services (cont.)
On-campus versus off-campus respondents:
On- and off-campus respondents’ ratings differed significantly from each other. In general, on-campus respondents found services to be more important than off-campus respondents did, with two exceptions: off-campus respondents indicated that fitness centers were significantly more important to them than on-campus respondents did, and vending machine importance did not differ significantly between the groups.
NIH employees versus other respondents:
NIH employees and other respondents did not differ significantly, with two exceptions: NIH employees found concession stands to be more important to them than other respondents did, and other respondents found child care services to be more important to them than NIH employees did.

20 Customer Survey Summary (cont.): Quality of Work Life Ratings
Respondents were asked to rate the extent to which the services offered by DoES contribute to the quality of work life at NIH, on a scale ranging from (1) Strongly Disagree to (5) Strongly Agree. The mean rating of 4.11 indicates that respondents agree that the services contribute to the quality of their work life.
Only 15% of respondents know that they have an NIH liaison for the services. Only 11% of respondents know how to contact the NIH liaison. Only 5% of respondents contacted someone from DoES during the past 6 months.
Respondents (N = 225) who had contacted someone were asked to rate their satisfaction with the interaction on four dimensions, using a 10-point scale ranging from (1) Unsatisfactory to (10) Outstanding. The following mean ratings were obtained: Availability (7.43), Responsiveness (7.46), Competence (7.68), Handling of Problems (7.09).
There were no differences between on- and off-campus respondents, or between NIH employees and others.

21 Recommendations & Actions
Communicate the survey results to important stakeholder groups: ORS Senior Management, DoES staff, survey respondent pool.
DoES ACTIONS:
Present results to the ORS Director and staff.
Dedicate a 15-minute portion of a DoES staff meeting to the results.
Create a webpage for broadcasting our results.

22 Recommendations & Actions (cont.)
Communicate DoES services and capabilities to customers.
Provide survey results to the survey respondent pool.
Consider ways to provide information on services to the NIH community (e.g., website, newsletters, e-mails, orientation for new employees, etc.).
DoES ACTIONS:
Attend NIH Orientation Sessions every two weeks.
Publish a series highlighting our results (News2Use, Catalyst).
Devise other “new” means of getting the word out.
Create a webpage for broadcasting our results.

23 Recommendations & Actions (cont.)
Review comments in the presentation for specific issues that can be tackled.
DoES ACTIONS:
Each service area will read through comments and determine which appear most frequently.
By April 2005, each area will develop an action plan based on comments.
Review specific questions with the lowest mean ratings to determine priorities for improvement.
DoES ACTIONS:
By April 2005, each service area will take its lowest-rated services and develop an action plan.

24 Recommendations & Actions (cont.)
Determine possible reasons for differences between NIH on-campus and off-campus service use and perceptions. If the reasons for differences cannot be changed, consider ways to manage perceptions through education, awareness, etc. (DoES Senior Management brainstorming session; focus group including employee representatives.)
DoES ACTIONS:
Each service area will study this pattern.
By April 2005, each area will develop an action plan.
Conduct a follow-up survey (within the next 2 years) to check on improvements.
DoES ACTIONS:
DoES will work with OQM in early 2006 to develop a follow-up survey.

25 C1. Improve Quality of Worklife Initiative
Customer Feedback Boxes (Implementation Date: February 2005)
Purpose and expected outcome: To actively solicit customer feedback at service locations.
Measurements of success:
C1a. Results of customer satisfaction survey
C1b. Usage/demand for services (customer counts) for each service
I2a. Percent of customer suggestions/complaints and vendor inputs with follow-through

26 C1b: Customer Counts Per Program
Total customers served in FY04 = 2,478,958, a 12% increase from FY03 (2,185,797).
Note: Interpreting Services shows total number of requests, not total people served.
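The year-over-year claim on this slide can be sanity-checked from its own two totals; recomputing gives roughly 13.4%, so the quoted 12% appears to be an approximation (or computed on a different basis):

```python
# Year-over-year growth check using the two totals quoted on the slide.
fy03_customers = 2_185_797
fy04_customers = 2_478_958

growth_pct = (fy04_customers - fy03_customers) / fy03_customers * 100
print(f"FY03 -> FY04 growth: {growth_pct:.1f}%")  # ~13.4%
```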

27 C1b: Usage/Demand for Child Care Services
[Chart: Available Spaces, Total = 290; Children on the Wait List, Total = 943.]

28 C1b: Usage/Demand for Food Services: Total Number of Customers Served

29 C1b: Usage/Demand for Food Services: Percentage of Actual Customer Participation by Dining Center
Notes: Based on ORF/DFP census data for the actual buildings and those surrounding them. Average customer participation for FY %

30 C1b: Usage/Demand for Food Services: Percentage of Total Customers by Dining Center

31 C1b: Usage/Demand for Interpreting Services Note: Interpreting Services usage has increased by 52% over the past 5 years.

32 C1b: Usage/Demand for Interpreting Services

33 C1b: Usage/Demand for Fitness Membership

34 Internal Business Process Perspective

35 Internal Business Process Perspective

36 I1. Hold Vendors Accountable Initiative
Vendor Auditing System (Implemented October 2004)
Purpose and expected outcome: Create a system to evaluate vendors during inspections. While the criteria for evaluating vendors will vary across programs, the scoring scale will be the same, allowing evaluation across programs. The theory is that diligent and systematic vendor auditing will improve compliance with contracts and ultimately increase customer satisfaction.
Measurements of success:
I1a. Results of audits (difference between actual performance and standards in contract/use agreement)
I1b. Percent of follow-through by vendors (based on inspections)

37 I1. Hold Vendors Accountable Initiative
Customer Comment Log (Implemented August 2004)
Purpose and expected outcome: Track customer comments, suggestions, and complaints to ensure that they are being addressed in a timely manner. Keeping a log in Excel will allow statistics on DoES staff response time to issues to be gathered and reported.
Measurements of success:
C1a. Results of customer satisfaction survey
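The comment log described on this slide was kept in Excel; as an illustration only (the entries, field layout, and categories below are hypothetical, not the actual DoES log), the response-time statistics it enables could be computed like this:

```python
from datetime import date
from statistics import mean

# Hypothetical comment-log entries: (received, resolved, category).
# The field layout is an assumption for illustration.
log = [
    (date(2004, 9, 1), date(2004, 9, 3), "food services"),
    (date(2004, 9, 2), date(2004, 9, 10), "fitness"),
    (date(2004, 9, 5), date(2004, 9, 6), "vending"),
]

# Days from receipt to resolution for each entry.
days_to_resolve = [(resolved - received).days for received, resolved, _ in log]
# Fraction of issues closed in under a week.
within_a_week = sum(1 for d in days_to_resolve if d < 7) / len(days_to_resolve)

print(f"Mean days to resolve: {mean(days_to_resolve):.1f}")
print(f"Resolved in under a week: {within_a_week:.0%}")
```

The same two statistics (mean resolution time and share resolved within a week) are the kind reported later in this deck under I2a.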

38 Internal Business Process Perspective (cont.)

39 I2a. Percent of customer suggestions/complaints and vendor inputs with follow-through
Note: 78.6% of issues were resolved in less than a week.

40 IB3b. Advertising and Promotion Volume

41 IB3a. Percent of customers that can put our name with our program

42 I3. Educate Employees to Services and Service Standards Initiative
Include DoES message in HR welcome package (Implemented October 2004)
Purpose and expected outcome: Including an advertisement in the HR package distributed during orientation would be an effective way of informing new employees of DoES services.
Measurements of success:
I3a. Percent of customers that can put our name with our program

43 Internal Business Process Perspective: What the data tells us
We have made a good start in increasing our vendor monitoring.
We are working more closely with our vendors to hold them more accountable.
We are marketing our services, but there are still a number of people who do not know who we are and what we do.

44 Internal Business Process Perspective: Actions
We need to incorporate standard performance-based information into all of our contracts and use agreements.
We need to do more to market our services to the NIH community.

45 Learning and Growth Perspective

46 Learning and Growth Perspective

47 L2a. Results of Previous Internal Climate Survey
[Chart: ratings on a scale from Unsatisfactory to Outstanding; N = 9.]

48 L2b. Results of Previous Vendor Survey
[Chart: ratings on a scale from Unsatisfactory to Outstanding; N = 112.]

49 Learning and Growth Perspective: What we have done since the surveys
We instituted a newsletter for our vendors.
We are meeting on a more regular basis with our vendors and their staff.
We brought in consultants to work on team building with DoES staff, both as a Division and in teams.
We have restructured staff meetings to be more efficient and effective.
We did not have another climate or vendor survey in FY04; we plan to have one in FY05.

50 Financial Perspective

51 Financial Perspective (cont.)

52 F1. Fulfillment of Obligation
Vendor Service Log (Implementing February 2005)
Purpose and expected outcome: Ensure service issues are being addressed, and track how long they take to resolve. The vendor is in the best position to detect facility deficiencies and to confirm when they are corrected.
Measurements of success:
F1a. Speed of service tickets

53 F1a. Unit Measures
DS1: Number of children/families served
DS2: Number of customers
DS3: Number of customers
DS4: Based on hours provided per year

54 F1a. Unit Measures

Unit Measure | FY 2004 Budget | FY 2004 Actual | FY 2005 Request | FY 2006 Request | FY 2007 Forecast
Manage child care services, programs, contracts and use agreements (# of children/families served) | 1,400 | 1,878 | 2,103 | 2,203 | …
Manage food services programs, contracts and use agreements (# of customers) | 1,727,000 | 2,022,684 | 2,083,364 | 2,145,864 | 2,210,239
Manage retail and fitness services, programs, contracts and use agreements (# of customers) | 436,… | …,973 | 450,000 | 465,000 | 480,000
Manage interpreting services, programs, and contracts (# of interpreting service hours) | 12,584 | 13,423 | 14,094 | 14,798 | 15,537

55 F3. Influence ORS Investments
Amenities Guidelines (Implemented October 2004)
Purpose and expected outcome: Provide guidelines for designers so that amenities are properly planned for in construction projects; prevent costly change orders and tenant dissatisfaction.
Measurements of success:
F3a. Number of change orders due to amenities, and total cost
F3b. Construction dollars spent on submission changes (impacting amenities) as a percent of total construction dollars

56 Financial Perspective: What the data and initiatives tell us
The new unit measures are working much better and truly reflect what we do.
We did a good job of influencing what the final amenities plan was going to look like, which allows us to get amenities planning into all space planning.

57 Financial Perspective: Actions
Ensure the Amenities Plan is used as it was intended to be.
Continue to forecast our budget by learning as best we can what future needs will be.

58 Conclusions

59 Conclusions from FY04 PM Process
In 2004, we mainly concentrated on our “High Impact” objectives.
Demand for our services is increasing yearly.
We are increasing our oversight of, and partnerships with, our vendors.
We still need to do a better job of formally tracking and monitoring vendor performance.
We need to use the new feedback boxes and vendor maintenance logs to improve our performance.
We need to devise new ways to let our customers know that we are here for them.