Monitoring and Evaluation for FETPs


1 Monitoring and Evaluation for FETPs
Donna S. Jones, MD, MPH
Division of Public Health Systems and Workforce Development, Center for Global Health

2 Division M&E Goal for FETPs
Improve monitoring and evaluation of field epidemiology training programs to support program improvement, quality and sustainability – and thus to strengthen public health systems.

3 Levels of FETP Monitoring/Evaluation
Critical Outcomes: the intended results of an FELTP implemented in an MOH
Multi-site FETP Evaluation: in development
Facilitated Self-Assessment (Scorecard): in use
Accreditation: in development
Program Monitoring: standard information used to track program implementation and outputs, including tracking completion and quality of individual trainee competencies
Routine data reporting to Division

4 Critical Public Health Outcomes
Increased capacity of the workforce
Strengthened surveillance systems
Improved preparedness and response
Expanded collaboration and networking within and across national boundaries
Enhanced effectiveness of policies and practice

Questions to assess progress toward the critical outcomes (or the desired and perceived program impact in strengthening systems):

Workforce: What do you consider to be the workforce gaps that the FETP is designed to address? How well are the graduates addressing this gap, both in the number of persons in positions and the skills they bring to those positions? Can you point to specific examples? Do you have a numeric or other target that you want to achieve for trained epidemiologists/program graduates? Are there other ways that the program or graduates address other workforce gaps (training lower-level epidemiologists, for example)?

Surveillance: What do you consider the role of the program itself and of the graduates in strengthening, expanding, or improving the surveillance system at the national or other levels? Do you think the program and graduates are achieving success in those roles? Can you give specific examples of surveillance system improvements, better use of surveillance data, or new surveillance systems or functionality that can be attributed to the program and/or the graduates?

Preparedness and response: How critical is the program for responding to public health emergencies/outbreaks? Do trainees serve as first responders? Is their work considered critical to the response, or are they doing studies of secondary use? Has it been an expectation that the program or graduates will contribute to building up preparedness and response for the country? If so, how successful have they been? Do you feel that the country (or sub-national areas) is better prepared to detect and respond to emergencies and outbreaks since the program began? Can you give specific examples?

Collaboration: What role has the program played in creating partnerships and expanding the number of organizations/other ministries/networks that collaborate with the MOH on health issues? Do you think this is an important role for the program? Has it been able to make progress in this? Please give examples. Do the graduates, in their positions in the MOH, stay involved in these types of collaborations and networks?

Policy: What is the expectation of how the program and graduates should contribute to improved health policies and the implementation of health policies?

5 Multi-site Evaluation
Last done in 1998 by Battelle
Meeting last year to draft quality indicators
Funded this year with end-of-year funds
Up to 10 sites
Expert consultants meeting this week

6 Planned multi-site evaluation: draft quality measures
Mentored Public Health Activities in Applied Epidemiology
Ratio of time spent on mentored public health activities in applied epidemiology vs. classroom time (a computation sketch follows this list)
Access to and use of public health surveillance data
Utilization of public health surveillance data for action
Access to participate in a variety of relevant outbreak investigations
Access to participate in relevant epidemiologic assessments of other public health priorities
Utilization of field activity results as a basis for public health action
Number of abstracts and manuscripts accepted to meetings and journals
Quality of abstracts
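The first measure above is a simple arithmetic ratio. As a minimal sketch, assuming a hypothetical activity log (the field names and categories here are illustrative, not prescribed by the program), it could be computed like this:

```python
from dataclasses import dataclass

@dataclass
class Activity:
    kind: str      # "mentored_field" or "classroom" (illustrative categories)
    hours: float   # time spent on this activity

def mentored_to_classroom_ratio(log: list[Activity]) -> float:
    """Hours of mentored field work per hour of classroom instruction."""
    field_hours = sum(a.hours for a in log if a.kind == "mentored_field")
    class_hours = sum(a.hours for a in log if a.kind == "classroom")
    if class_hours == 0:
        raise ValueError("no classroom hours recorded")
    return field_hours / class_hours

# Example: 300 mentored field hours against 120 classroom hours -> 2.5
log = [Activity("mentored_field", 300.0), Activity("classroom", 120.0)]
print(round(mentored_to_classroom_ratio(log), 1))  # 2.5
```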

7 Draft quality measures – planned multi-site evaluation – cont.
Service and Value to Ministry of Health or other primary government public health institution (MOH*)
Impact of field activity results: e.g., policy/intervention created or modified as a result of FETP activities
Requests for assistance by MOH* and national programs
Workforce Strengthening
Graduates in relevant positions of the country health system over time
Supervision and Mentoring
Sufficient program staff resources for supervision and mentoring
Guidelines and orientation for supervision and mentoring
Periodic systematic evaluation of supervision and mentoring
Trainees completing the required activities within the required time

8 Facilitated Self-Assessment Tool
FETP Facilitated Self-Assessment Tool “Scorecard”

9 Overview
Developed to assist countries to look at strengths and weaknesses
Help programs focus on priority areas for improvement
Consider a common vision of a successful FETP
Work collaboratively with other programs
Brief background and evolution of ET
Overview of functionality
Accomplishments
Challenges
Next steps

10 Prior problems
Simple counting insufficient
Results long in coming
Results not useful
Program not actively engaged in process

11 Major Categories
Competency-based Curriculum
Field Activities
Leadership Development
Program Management
Sustainability
There are 5 major categories, or areas, that incorporate about 25 indicators.

12 Matrix Tool for FETP Assessment: Capabilities - Indicators
1. Competency-based Training
A. Competency-based curriculum
B. Resident/Officer Assessment
C. Mentor Support
D. Mentor Assessment
E. Coursework Faculty
F. Field Sites/Work Sites
G. Graduates
Here are the indicators under the first category. It includes the presence of a competency-based curriculum, the assessment of officers, mentor support, etcetera. In the next slide I will show how these are placed in scenarios on the scorecard.

13 Matrix Tool for FETP Assessment: Capabilities - Indicators (cont.)
2. Public Health Work/Field Activities
A. Outbreak Investigations
B. Surveillance
C. Public Health Studies
D. Communications
The next category highlights the public health work and looks at outbreak investigations, surveillance, planned studies, and communications; just as you saw before, scenarios have been developed for these as well.

14 Matrix Tool for FETP Assessment: Capabilities - Indicators (cont.)
3. Public Health Leadership Development
A. Public health leadership
B. Ministry of Health retention/Career progression
4. Management
A. Policies and procedures
B. Resident/Officer Selection
C. Staffing
D. Office logistics/infrastructure
E. Course work logistics
F. Field work logistics
5. Sustainability
A. Support by Ministry of Health
B. Sustainable leadership for program
C. Graduate network
D. Planning
E. Partnerships
F. Advisory Board/Steering Committee
G. Advocacy for Program
The leadership category looks at leadership development and career progression.

15 Matrix Tool for FETP Assessment: Capabilities - Indicators (cont.)
The management category looks at how well the FETP is run: policies and procedures, the selection process, staffing, and other indicators having to do with logistics.

16 Matrix Tool for FETP Assessment: Capabilities - Indicators (cont.)
And finally sustainability, which looks at ministry support, the strategic planning process, partnerships, etcetera.

17 2. Public Health Work/Field Activities: Scorecard Scenarios
Level of achievement runs from 1 to 4, with 4 the advanced level (a minimal data-structure sketch follows this matrix).

(A) Outbreak Detection and Response
Level 1: Resident/officers do not participate.
Level 2: Program resident/officers participate as observers/assistants in outbreak investigations.
Level 3: Program resident/officers are first responders for local outbreaks. The laboratory has participated in outbreaks. Investigation results are presented to relevant public health decision makers at the site of the outbreak. Investigations are reviewed for quality.
Level 4: Program resident/officers are first responders for outbreaks of national importance. The laboratory is routinely involved in outbreak investigations. Outbreak recommendations are used for disease control and prevention and policy. Outbreak investigations and reports meet a quality standard.

(B) Surveillance
Level 1: No or limited access by resident/officers to surveillance data.
Level 2: Some program resident/officers have access to surveillance data and play a role in review and reporting of data.
Level 3: All program resident/officers have access to and have a role in surveillance data use. Officers report surveillance work to public health decision makers. Surveillance analyses and reports are reviewed for quality.
Level 4: Level 3, AND recommendations and conclusions are used for public health action or to improve surveillance systems and public health programs. Surveillance analyses and reports meet a quality standard.

(C) Public Health Studies
Level 1: No protocol-based epidemiologic studies done by resident/officers.
Level 2: Program resident/officers conduct protocol-based epidemiologic studies.
Level 3: Program resident/officers conduct protocol-based epidemiologic studies on problems of importance for the MOH. The results are presented locally. The protocols are submitted for ethical review as required. Protocol and report are reviewed for quality.
Level 4: Documentation of ethical review submission, feedback, and approval. Protocol, study, and report meet quality standards. Study findings are used to improve public health programs.

(D) Public Health Communications
Level 1: Officers do not generally present their findings at national or subnational level meetings.
Level 2: Resident/officers participate in presentations of surveillance and other epidemiology data at national or subnational level meetings.
Level 3: Resident/officer work is presented at national, regional, and international conferences. More than 60% of resident/officers have at least one presentation selected as an oral at an international conference. At least 2 presentations per year go to conferences outside TEPHINET.
Level 4: Resident/officers publish work in peer-reviewed journals, with at least 2 published manuscripts per 10 resident/officers. They participate in presenting to the media for public information and guidance (if appropriate or expected).
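As a minimal sketch of how such a scorecard could be stored, and of the two numeric thresholds quoted in indicator (D), with all names hypothetical (only the >60% oral-presentation share and the 2-manuscripts-per-10-officers threshold come from the matrix above):

```python
# One scorecard indicator stored as a mapping from level (1-4) to a short
# paraphrase of the scenario text for the surveillance row above.
SURVEILLANCE_LEVELS = {
    1: "No or limited access to surveillance data",
    2: "Some officers have access and a role in review and reporting",
    3: "All officers have access and a role; work reported to decision "
       "makers and reviewed for quality",
    4: "Level 3, and recommendations drive public health action; analyses "
       "meet a quality standard",
}

def communications_level4_checks(n_officers: int,
                                 n_with_oral: int,
                                 n_manuscripts: int) -> bool:
    """Apply the two quoted thresholds from indicator (D): more than 60%
    of resident/officers with an oral presentation at an international
    conference, and at least 2 published manuscripts per 10 officers."""
    oral_ok = n_with_oral / n_officers > 0.60
    pubs_ok = n_manuscripts / n_officers >= 2 / 10
    return oral_ok and pubs_ok

# 13 of 20 officers (65%) with orals, 5 manuscripts (2.5 per 10): passes.
print(communications_level4_checks(20, 13, 5))  # True
```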

18 Implementing Assessment
Country program interest
External and internal teams identified
Arrange for 4-5 day visit (logistics)
Program compiles requested documentation
The process of implementing the assessment begins with an expression of interest by the country, communicated to CDC or to TEPHINET. Following this, an external team leader and an internal country team leader are identified and become the key lead persons of the assessment; usually the country FETP program coordinator is the country lead. After this an external team has to be identified, usually made up of representatives from CDC, TEPHINET, and other programs that have previously had an assessment. Funds to support the assessment have to be identified for a 4-5 day visit. Prior to the visit the country program has to make two major types of preparation: 1) organizing the agenda and logistics for the week in-country, and 2) preparing the documentation that the external team will need to review during the visit. This is time intensive, and I'm sure you will hear more about it in the country presentations.

19 Implementing Assessment
External team interviews key groups (fellows, graduates, former directors, supervisors, mentors)
Program and team review diagnostic tool together: agree on level of achievement, documentation reviewed
Agreement on key items to be worked on in the short, medium, and long term
On this slide are the steps in the assessment that begin when the team arrives in-country. There is a series of focus group meetings with key groups, including current trainees, former program staff, supervisors, mentors, academic partners, etcetera. There is a half-day workshop toward the middle of the week where program staff and the external team meet to review the level of achievement for each indicator. During the process, the program indicates the timeframe in which it intends to work on each indicator.

20 Typical Assessment Week
     Mon          Tue         Wed         Thu
am   Orientation  Interviews  Interviews  Documentation
pm   Interviews   Interviews  Workshop    Preliminary Report
Focus group interviews
Scorecard workshop (coordinating staff +)
Preliminary report left in country (briefing)
Here you can see the typical schedule the team followed during the assessments in South America. We arrived generally late Sunday and started with an orientation on Monday. The next couple of days were filled with focus group interviews; typically this was done in one location, with groups of people brought to us in somewhat of an assembly-line fashion. The questions were fairly open-ended, covering areas of the scorecard but not in a rigid fashion. On Wednesday afternoon we held the scorecard workshop; the core participants were the FETP coordination team, the international team, and in some instances trainees from current cohorts. We reviewed documentation and wrote a preliminary report, which we left with the country prior to leaving.

21 Thoughts on why it works
Focused, intense effort; easily understood process
Billed as a "self-assessment," NOT an evaluation
The process is itself networking/advocacy
We leave a report, a "roadmap" for improvement

22 Accreditation
Process in development with TEPHINET
Standards being drafted
Focus on core processes such as:
Program infrastructure
Program management
Standard procedures
Routine evaluation procedures for residents, faculty, curriculum, and program

23 Draft Accreditation Standards
Staffing (director, staff, supervisors/mentors)
The program has a host-country director/coordinator who is assigned full-time to the FETP.
The program has adequate technical, administrative, and clerical staff for the effective administration of the program and support of the FETP's trainees.
The program has mentors trained in the goals of the program and in field epidemiology methods who provide on-the-job supervision to the trainees.
All mentors/technical supervisors are adequately qualified, e.g., graduates of the program or of a similar applied epidemiology program, and this qualification is documented.

24 Draft Accreditation Standards
Administrative and logistic support
The program has dedicated office space in which the staff and trainees can work. This includes having basic office equipment such as telephone, computers, email, fax, and internet available for use.
Trainees have ready access to reference material in print or electronic format, including the capability to search electronic medical literature databases.
The program has access to necessary supplies and logistical support for field investigations.
The program has access to laboratory services in support of investigations as appropriate.


26 Program Monitoring
What should you be doing routinely?
Are you doing it?

27 Minimum requirements for FETP core functions:
Expected competencies are documented.
Every competency has a trackable method to determine its achievement, and the method includes a measure of the quality of the work (a tracking sketch follows this list).
Trainees have regular access to trained mentors who can provide sufficient and adequate supervision for the public health work.
Supervisors and trainees meet routinely (weekly? monthly?) to review their work and other topics of interest.
Trainee assessment occurs routinely and is documented. Progress is reviewed with the trainee and supervisor at least every 6 months.
The program reviews the curriculum and training program each year, and changes/improvements are documented.
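As a minimal tracking sketch, assuming a hypothetical record layout (none of these names are prescribed by the program), the competency and 6-month review requirements could be tracked like this:

```python
from dataclasses import dataclass, field
from datetime import date, timedelta
from typing import Optional

@dataclass
class Competency:
    name: str
    assessment_method: str   # how achievement of the competency is determined
    quality_measure: str     # how quality of the work is judged
    achieved_on: Optional[date] = None

@dataclass
class TraineeRecord:
    trainee: str
    competencies: list[Competency] = field(default_factory=list)
    last_review: Optional[date] = None

    def review_overdue(self, today: date) -> bool:
        """Flag a trainee whose progress review is more than ~6 months old."""
        return (self.last_review is None
                or today - self.last_review > timedelta(days=183))

rec = TraineeRecord(
    "A. Trainee",
    [Competency("Outbreak investigation",
                assessment_method="lead one field investigation",
                quality_measure="report reviewed against a checklist")],
    last_review=date(2011, 1, 10),
)
print(rec.review_overdue(date(2011, 9, 1)))  # True: review is overdue
```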

28 Possible Routine Reports from Residents and Supervisors
Monthly report of activities, provided to the field site supervisor and technical supervisor, with cc to the program director and RAs
Quarterly report by the resident of progress toward completion of required field activities, including a self-assessment and an evaluation of the supervisor
Quarterly performance evaluation by the field site supervisor
Quarterly (or 6-monthly) progress assessment and development of the next quarter's workplan with FETP staff supervisors and the RA
Documentation for required activities (this will be program dependent)

29 FETP Quarterly Program Monitoring
Quarterly resident assessments
  Quarterly report submitted
  Quarterly supervisor assessment received
  Quarterly assessment conducted with participant
  Improvement plan (or revised workplan)
Regular meetings (weekly/monthly)
  Meetings held regularly
  Appropriate MOH staff in attendance
Support to Ministry of Health
  List requests for assistance and the program response
  Recommendations from field activities presented to the appropriate level of the MOH

30 Suggested accreditation standard
The program must monitor and track each of the following areas: trainee performance, mentor/supervisor/faculty performance, field assignment performance, graduate performance, program quality.

31 Annual program review
Quarterly monitoring information
Review of number, type, and adequacy of outbreak investigations
Analysis and use of surveillance data
Improvements to surveillance systems
Uptake of recommendations (actions/policies, etc.)
Proportion of residents on track to complete competencies
Communication products (bulletins, abstracts, presentations, manuscripts)
Review of training gaps identified in field work, and plans to address them
Advisory committee activities
Review of field sites (support, supervision, routine work, access to data, projects undertaken, use of findings)

32 Routine Reporting to Division
Before: a yearly annual-report request
Now: in process, but expect routine (monthly or quarterly) updates on core information, e.g. (a sketch of such an update follows this list):
New cohorts
Graduates
Outbreaks
Presentations
Publications
Other resident activities
Other trainings
Etc.
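As a minimal sketch of what such a routine update might look like as structured data, with all field names assumed rather than prescribed by the Division:

```python
import json
from dataclasses import dataclass, field, asdict

@dataclass
class QuarterlyUpdate:
    program: str
    quarter: str
    new_cohorts: int = 0
    graduates: int = 0
    outbreaks_investigated: int = 0
    presentations: int = 0
    publications: int = 0
    other_activities: list[str] = field(default_factory=list)

update = QuarterlyUpdate(
    program="Example FETP", quarter="Q3",
    new_cohorts=1, graduates=8, outbreaks_investigated=4,
    presentations=6, publications=2,
    other_activities=["basic-level surveillance training"],
)
print(json.dumps(asdict(update), indent=2))  # ready to submit or archive
```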

33 Questions?

34 Quality: System, Program, Individual

