
1 Frontline Service Delivery Monitoring (FSDM) Programme
Portfolio Committee: 4 November 2015

2 FSDM BACKGROUND AND OBJECTIVE
The FSDM initiative is aimed at strengthening the M&E practices of field-level managers and their supporting decision makers in head offices. The intention is not to cover all facilities but to demonstrate the value of on-site monitoring for selected types of facilities/sectors. We anticipate that monitoring findings may not be acted on; hence we present the findings to the top management of departments and re-visit some facilities to track progress. The programme is implemented jointly with all 9 Offices of the Premier to:
Demonstrate to OTPs and sector departments the value of on-site verification of reported results.
Demonstrate the value of collecting monitoring information from different sources: users, staff and independent monitors.
Demonstrate how to use evidence collected at facility level to catalyse improvements, through improvements monitoring.
The objective is to catalyse service delivery improvements: we work with departments to use the findings as an input into service delivery improvement and planning.

3 SCOPE OF FRONTLINE FACILITIES
There are approximately … frontline facilities in the country, of which … are schools, 4100 are health facilities (or more than 8000 if mobile/satellite clinics are included), 1140 are police stations (2014), 9415 are SASSA sites (2015) and 408 are Home Affairs offices (2015).
What does DPME do? Since 2011, DPME has done field monitoring of 764 facilities, focusing on 8 types of facilities (out of a potential …). Each facility was visited twice: once to do the monitoring and once to provide a feedback report to the management of the facility. The monitoring teams comprise DPME and OTP staff; some provinces, e.g. Gauteng, also use Community Development Workers to monitor. DPSA is the policy champion for service delivery standards and for supporting service delivery improvements.

4 IMPLEMENTATION OF THE FSDM PROGRAMME
DPME: (i) design and maintain the monitoring tools and protocols, analyse findings, and report to Cabinet and national sector departments; (ii) jointly conduct the monitoring visits and the improvements monitoring visits.
Offices of the Premier:
Joint monitoring with DPME.
Present findings to provincial HODs and MECs.
Monitor adherence to agreed improvements.
WHAT ARE THE STEPS?
Assess the outcomes and quality of service delivery improvement programmes at facility level: the output is score cards and improvement plans.
Communicate: provide feedback (report and meeting) on the findings.
Facilitate/advise on improvements: facilitate the improvements monitoring meeting.
Assess improvements (every year): monitor improvements and report on findings.

5 FSDM IMPLEMENTATION METHODOLOGY
Data is collected using a set of paper-based tools/questionnaires from three information sources: citizens and staff (through interviews) and the monitors themselves (through observation).
Baseline assessment: the initial unannounced monitoring visit to the targeted service delivery sites. The baseline data collected and compiled describes the situation, with proposed recommendations, prior to the development/implementation of the improvement plan, i.e. a summary report.
Feedback meeting: communication of the findings generated through the baseline assessment to the relevant stakeholders and departments. The feedback process is aimed at verifying and presenting the findings and agreeing on the recommendations with activities, budget allocation and timelines, i.e. an improvement plan aligned to other sector/departmental initiatives.
Improvements monitoring: follow-up engagement with selected facilities to track progress on the committed action items/activities as per the agreed improvement plan. This process takes place in two phases: (i) improvement meeting and (ii) re-scoring.

6 THE FSDM PROGRAMME TARGETS THE FOLLOWING FACILITIES
Police Stations: SAPS
Schools (Primary & High Schools): Basic Education
Hospitals, Clinics, Community Health Centres: Health
SASSA Local Offices & Pay Points: Social Development
Home Affairs Local Offices: Home Affairs
Magistrate Courts: Justice
Municipal Customer Care Centres (MCCC): Local Government (COGTA)
Drivers Licensing and Testing Centres (DLTC): Transport

7 Key Performance Areas (KPAs) monitored TO ASSESS QUALITY OF SERVICE DELIVERY
Location and Accessibility: accessible distance; physical access into the facility; physical premises fit for purpose; adequacy of minimum equipment and staff to provide the service.
Dignified Treatment: courteous, dignified and respectful services; language of choice; knowledgeable and responsive officials; easily recognisable frontline staff; information about application processes or service requests; awareness of service charters and standards.
Cleanliness & Comfort: cleanliness and maintenance of the facility; suitable waiting area; child-friendly services (courts only); accessible, clean and functional ablution facilities.
Visibility & Signage: signage to the facility; signage within the facility; signage in the local language; service offering information.
Safety: minimum safety and security measures; safety procedures; safety of records.
Complaints and Compliments / Citizen Experience: awareness of complaints lodging mechanisms; complaints monitoring and tracking system; citizen satisfaction.
Queue Management & Waiting Times: queue management systems; waiting times; special provision for users.
Opening & Closing Times / Service Availability and Efficiency: operational hours; adherence to operational hours.
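To make the structure above concrete, here is a minimal illustrative sketch in Python (the names and data model are hypothetical, not DPME's actual tooling) of how KPAs and their performance areas (PAs) could be represented and scored on the 1-4 scale:

```python
# Illustrative only: a hypothetical representation of some FSDM KPAs
# and their performance areas; not DPME's actual data model.
FSDM_KPAS = {
    "Location and Accessibility": [
        "Accessible distance", "Physical access into facility",
        "Premises fit for purpose", "Adequate minimum equipment and staff",
    ],
    "Cleanliness & Comfort": [
        "Cleanliness and maintenance", "Suitable waiting area",
        "Child-friendly services (courts only)", "Functional ablution facilities",
    ],
    # ... remaining KPAs omitted for brevity
}

def kpa_score(ratings: list[int]) -> float:
    """A KPA score as the mean of its PA ratings, each on the 1-4 FSDM scale."""
    assert all(1 <= r <= 4 for r in ratings), "ratings use the 1-4 scale"
    return sum(ratings) / len(ratings)

print(kpa_score([3, 2, 4, 3]))  # 3.0
```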

8 FSDM IMPLEMENTATION METHODOLOGY CONT...
PERFORMANCE RATINGS (FSDM and MPAT)
Rating scale: Poor = 1, Fair = 2, Good = 3, Very Good = 4
Level 1: Non-compliance with legal/regulatory requirements
Level 2: Partial compliance with legal/regulatory requirements
Level 3: Full compliance with legal/regulatory requirements
Level 4: Full compliance and doing things smartly
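As a small worked illustration (hypothetical code, not part of the programme's tooling), the four-point scale maps directly onto the level descriptions, so a numeric score card entry can be translated into its compliance description:

```python
from enum import IntEnum

class FsdmRating(IntEnum):
    """The four-point FSDM/MPAT performance scale."""
    POOR = 1
    FAIR = 2
    GOOD = 3
    VERY_GOOD = 4

DESCRIPTIONS = {
    FsdmRating.POOR: "Non-compliance with legal/regulatory requirements",
    FsdmRating.FAIR: "Partial compliance with legal/regulatory requirements",
    FsdmRating.GOOD: "Full compliance with legal/regulatory requirements",
    FsdmRating.VERY_GOOD: "Full compliance and doing things smartly",
}

# A score card entry of 3 reads back as full compliance.
print(DESCRIPTIONS[FsdmRating(3)])
```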

9 BASELINE MONITORING
FACILITIES MONITORED 1 JUNE 2011 TO 31 MARCH 2015 (678) AND 1 APRIL 2014 TO 31 MARCH 2015 (123)
[Table: facilities monitored per province (EC*, FS, GP, KZN*, LP, MP, NC, NW*, WC*) and per sector (DLTC, Education, Health, Home Affairs, Justice, MCCC, SAPS, SASSA) for both periods; the per-province figures are not recoverable from this transcript. Sector totals for 2011-2015: DLTC 52, Education 128, Health 158, Home Affairs 61, Justice 57, MCCC 60, SAPS 85, SASSA 77; total 678.]
Since 2011, 678 facilities have been monitored: 52 DLTCs, 128 schools, 158 health facilities, 61 Home Affairs offices, 57 courts, 60 MCCCs, 85 police stations and 77 SASSA facilities. Although this sample of 678 represents a small percentage of the total number of facilities in the country, departments are encouraged to increase their own on-site monitoring presence so as to deepen their understanding of conditions at frontline facilities. In 2014/15, 123 facilities were assessed across all nine provinces.
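As a quick arithmetic check (an illustrative snippet, not from the source), the per-sector counts stated above sum to the reported 678 total:

```python
# Per-sector baseline monitoring counts, June 2011 to March 2015 (from the slide text).
sector_counts = {
    "DLTC": 52, "Schools": 128, "Health": 158, "Home Affairs": 61,
    "Courts": 57, "MCCC": 60, "Police Stations": 85, "SASSA": 77,
}
total = sum(sector_counts.values())
assert total == 678, f"expected 678, got {total}"
print(total)  # 678
```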

10 PROGRESS TO DATE
Baseline monitoring: 678 frontline facilities monitored from June 2011 to March 2015 by joint DPME-OTP monitoring teams. For each facility, a score card and report is produced and presented annually to the facility management, the responsible national department and Cabinet. 90 new facilities will be monitored annually by DPME.
Improvements facilitation and improvements monitoring: of the 678 facilities monitored, 123 were selected for improvements monitoring in 2014/15. The outcomes of the improvements monitoring were presented to facility management, national sector departments and Cabinet, and published in September 2015.
This being the fifth year of implementing the programme, the focus is increasingly on facilitating and advocating for managing and sustaining improvements, ensuring that interventions are equipped to continue delivering benefits into the future.
The programme has contributed to a stronger strategic focus by responsible departments on the performance of frontline facilities. Most departments that we monitor are now improving their own planning and monitoring of facilities. It has also assisted in reviewing and improving front-office operations, customer care and service delivery, particularly around the customer interface and access.

11 BASELINE MONITORING SUMMARY OF FINDINGS FOR 2014/15: TYPE OF FACILITY (2)
Of the 123 facilities monitored, Health, Home Affairs and Courts achieved scores closest to the desired 3, although none of the 8 types of facilities has yet achieved an average of 3. Complaints Management remains the weakest KPA at an average of 1.9; Dignified Treatment is the highest at an average of 3.1.

12 BASELINE MONITORING SUMMARY OF FINDINGS FOR 2014/15: TYPE OF FACILITY (3)
PROVISION FOR ACCESS TO SERVICES
Kimberley Magistrate Court (NC): special lower counter for disability access.
Umzimkhulu Home Affairs (KZN): special lower counter for disability access.
Bethlehem Home Affairs (FS): provision for physical access to the facility.
PROVISION FOR EXTERNAL SIGNAGE: WAY-FINDING SIGNS
Siwali Primary School (EC), Ntabankulu CHC (EC), Mount Frere Magistrate Court (EC)

13 IMPROVEMENT MONITORING RESULTS (1)
A total of 678 facilities have been assessed since 2011, of which 123 were selected for improvements monitoring in 2014/15. Of these 123 facilities, 65% (80) improved their scores, 33% (41) regressed and 2% (2) remained the same.
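A small illustrative calculation (hypothetical snippet) reproduces the reported percentages from the raw counts:

```python
# Outcomes of improvements monitoring for the 123 facilities (from the slide text).
counts = {"improved": 80, "regressed": 41, "unchanged": 2}
total = sum(counts.values())  # 123
for outcome, n in counts.items():
    print(f"{outcome}: {n}/{total} = {n / total:.0%}")
# improved: 80/123 = 65%
# regressed: 41/123 = 33%
# unchanged: 2/123 = 2%
```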

14 IMPROVEMENT MONITORING RESULTS (2)

15 IMPROVEMENT MONITORING RESULTS (3)
Complaints management improves year on year, but remains the KPA with the lowest average score at 2.1 for 2014/15.

16 IMPROVEMENT MONITORING RESULTS (4)
IMPROVEMENTS SUSTAINED: Wolmaransstad SASSA (NW): cleanliness and comfort, with a systematic arrangement of waiting areas and consulting stations that makes the best possible use of the available floor space; adequate waiting areas; queue management that enables a clear flow process; and provision for internal signage in the three dominant languages, for a nappy-changing station and for user-friendly toilets.

17 IMPROVEMENT MONITORING RESULTS (4)

18 IMPROVEMENT MONITORING RESULTS (4)
Each facility has a detailed score card that reflects the improvement trends over time, for actioning by the responsible department. A score card example appears below. For 2015/16, the facilities will be monitored again to assess improvements (120 facilities).

19 KEY LESSONS AND COMMON CHALLENGES
Lesson 1: Frontline performance is increasingly becoming a strategic issue
The FSDM initiative aims to focus government on the strategic importance of having healthy institutions at the frontline. There has been a noticeable improvement in the focus of senior management and leadership on the frontline; departmental Strategic Plans, Annual Performance Plans and Budget speeches reflect this. Departments understand that a dysfunctional frontline facility is a strategic matter.

20 KEY LESSONS AND COMMON CHALLENGES
Lesson 2: Inadequate investment in managing improvement initiatives at facility level
An experienced Lean Management practitioner said: "Government has projects to improve staff attitudes, but they should rather invest in fixing processes - good processes will result in good staff attitudes and happy clients." A culture of continuous operations improvement is a requirement for sustaining operational excellence in government departments and at facility level. Complex change initiatives at facility level often fail because head offices and facility staff do not have the skills required to introduce and implement change initiatives, and are not allowed time to do so; as a result, we are likely to bring about short-term improvements rather than systemic changes. Initiatives such as Project Khaedu are aimed at deploying problem-solving capacity, but anecdotal evidence shows that most of the officials deployed on the ground to assist do not have the operations management and problem-solving skills necessary to facilitate and implement change.

21 KEY LESSONS AND COMMON CHALLENGES
Lesson 3: More in-depth assessments of complaints handling
Every year, the FSDM annual findings reports have highlighted that complaints management in most facilities continues to be a challenge. Given this continuing weakness, DPME, under the Presidential Hotline project, is developing a Complaints Handling Assessment Framework. This framework identifies eight standards that all organisations should adhere to when developing and maintaining a complaints and enquiry handling system: (1) Leadership and Accountability, (2) Processes and Procedures, (3) Resources, (4) Acknowledgement, Interrogation and Investigation, (5) Resolution, (6) Accessibility, (7) Continuous Improvement and (8) Collaboration. Through a set of questions, the framework tests the extent to which these standards have been applied in a government department. This simple assessment framework will be made available to all departments and provinces; it can be used to conduct regular self-assessments of the state of complaints handling against the eight standards, and can then inform more targeted improvements of complaints management systems.

22 KEY LESSONS AND COMMON CHALLENGES
Lesson 3 (continued): More in-depth assessments of complaints handling
A generic complaints-handling process:
1. Receive and capture all complaints
2. Acknowledge all complaints quickly
3. Assess the complaint and prioritise
4. Plan the investigation
5. Investigate the complaint (simple complaints: resolve directly)
6. Respond to the complainant
7. Follow up any customer service concerns
8. Consider whether there are any systemic issues to be addressed
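To illustrate how such a process might be tracked in software (a minimal hypothetical sketch, not an actual DPME or departmental system), each complaint can carry a status that advances through the steps above, with simple complaints skipping the investigation stage:

```python
from enum import Enum, auto

class ComplaintStatus(Enum):
    """Stages of the generic complaints-handling process above."""
    RECEIVED = auto()
    ACKNOWLEDGED = auto()
    ASSESSED = auto()
    UNDER_INVESTIGATION = auto()  # skipped for simple complaints
    RESOLVED = auto()
    RESPONDED = auto()
    CLOSED = auto()

def next_status(status: ComplaintStatus, simple: bool) -> ComplaintStatus:
    """Advance a complaint one step; simple complaints skip investigation."""
    if status is ComplaintStatus.ASSESSED and simple:
        return ComplaintStatus.RESOLVED
    order = list(ComplaintStatus)
    return order[min(order.index(status) + 1, len(order) - 1)]

# Example: a simple complaint goes RECEIVED -> ACKNOWLEDGED -> ASSESSED
# -> RESOLVED -> RESPONDED -> CLOSED, bypassing investigation.
s = ComplaintStatus.RECEIVED
while s is not ComplaintStatus.CLOSED:
    s = next_status(s, simple=True)
    print(s.name)
```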

23 KEY LESSONS AND COMMON CHALLENGES
Lesson 4: The need for measurable service standards at facility level
In many cases, measurable service standards for the quality of service are absent at facility level. Benefits of measurable service standards: (i) they signal to citizens the minimum level of service expected from service areas, and serve as the basis for recourse by citizens if the standards are not met; (ii) they also serve to direct effort and resources towards achieving minimum service standards, and provide a focus for planning measurable improvements in key service delivery processes. It appears that some guidance is needed to assist departments in setting norms and standards that are (i) targeted, (ii) appropriate, (iii) relevant and (iv) measurable, so that the measures for compliance (a rating of 3) for each of the eight quality assessment areas can be clarified.

24 RATINGS METHODOLOGY – EXAMPLE OF THE PROPOSED METHODOLOGY
Cleanliness and comfort: new or reworded measure proposals, with rating guidance
Measure 1 (Monitor): The facility's grounds and outside areas are clean and well kept.
Poor = 1 (unmet): The outside areas are littered, plants are overgrown, and the grass is not mown and neat.
Fair = 2 (partially met): There are some pieces of litter or the outside area isn't tidy.
Good = 3 (met): The outside areas are free from litter and the grass and plants are well kept and neat.
Very good = 4 (exceeded): The facility and its surroundings are clean and free from litter.
Measure 2 (Citizen): Are the facility and outside areas clean?
Poor = 1: No, the grounds are not neat and tidy.
Fair = 2: Yes, but the grounds are tidy only sometimes.
Good = 3: Yes, the grounds are always neat and tidy.
Very good = 4: Yes, the grounds are always neat and tidy, and the area outside the facility is also tidy.
Measure 3 (Monitor): The inside of the facility is clean and well kept.
Poor = 1: No, the public areas are littered, floors are dirty and/or there are spills on the floor and/or they smell bad.
Fair = 2: There are some pieces of litter or the inside area isn't tidy.
Good = 3: The public areas are free from litter and dirt.
Very good = 4: The public areas are free from litter and dirt, and any litter and spills are cleaned up within a short space of time.
Measure 4 (Citizen): Is the facility clean inside?
Poor = 1: No, the facility is not clean inside.
Fair = 2: Yes, but there are some pieces of litter or the inside area isn't tidy all the time.
Good = 3: Yes, the facility is clean inside.
Very good = 4: Yes, the facility is clean inside, and even if there is a spill it gets cleaned up quickly.
The same scoring system links the monitor and citizen measures and provides defensible results.
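As a hypothetical illustration of how the paired monitor and citizen measures could feed a single defensible score (the averaging rule shown is an assumption; the source does not specify the aggregation method), the four ratings for this performance area could be combined as follows:

```python
# Hypothetical: combine monitor and citizen ratings (1-4 scale) for the
# "Cleanliness and comfort" performance area; the averaging rule is an
# assumption, not the documented FSDM methodology.
ratings = {
    "grounds_monitor": 3,   # outside areas free from litter, well kept
    "grounds_citizen": 3,   # grounds always neat and tidy
    "inside_monitor": 2,    # some litter, inside not tidy
    "inside_citizen": 2,    # litter present some of the time
}
pa_score = sum(ratings.values()) / len(ratings)
print(f"Cleanliness and comfort score: {pa_score:.1f}")  # 2.5
```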

25 Thank you

