Plenary – Data Lifecycle, Review & Quality


1 Plenary – Data Lifecycle, Review & Quality
2017 PEPFAR Data and Systems Applied Learning Summit
September 13, 2017

2 Welcome & Introductions

3 Agenda
1. The PEPFAR Data Lifecycle – 10 minutes
3. Types of Data Checks – 15 minutes
4. Tools for Reviewing Data Quality
5. Developing Processes Around Data Checks
6. Summary and Q&A

4 Session Learning Objectives
Understand the PEPFAR Data Reporting Cycle for MER
Be familiar with various types of data checks
Know where to find standardized guidance and tools related to Data Quality & Review
Understand processes and best practices to ensure access to high-quality data

5 Why is it so Difficult to PEPFAR?
The PEPFAR Data Lifecycle

6 PEPFAR Quarterly Data Management Cycle

7 DATIM Data Submission – Journey of the Data
Submission flow: Implementing Partner (IP Data Collection → IP Data Entry and Submitter) → Agency (Activity Manager) → Inter-Agency (PEPFAR Coordinator, OU Accepter) → OGAC HQ (OGAC). Data can be returned for correction at each approval step.

8 PEPFAR Data Calendar

9 Each Quarter… Quarterly MER Data Entry and Review: Keeping it Real
We wait for the reporting period to close
Partners aggregate 3 months' worth of data from site-level MOH reports, IP systems, and other sources
Partners enter hundreds of thousands of data values across thousands of sites
Agencies review hundreds of IM submissions, then submit upwards to Interagency and on to HQ
Country teams work to de-duplicate results across all partners where applicable (sometimes thousands of values)

10 But… Quarterly MER Data Entry and Review: Keeping it Real
We can’t see or analyze any data in DATIM until our partners officially submit it!

11 And… Quarterly MER Data Entry and Review: Keeping it Real
It often takes partners longer than 30 days to collect results & we only have 45 days to submit data…

12 Which means… Quarterly MER Data Entry and Review: Keeping it Real
We are all under intense pressure to enter, review, and submit our data as quickly as possible—even under the best of circumstances…

13 But Wait! Quarterly MER Data Entry and Review: Keeping it Real
The reporting period with the most indicators (Q4) requires the quickest turnaround for review and publication (World AIDS Day)
When HQ changes the indicators, it's difficult to harmonize with existing registers and approved reporting tools in country
And when I really need my Q2 data available for COP planning, it might be ready…

14 And once data entry closes and Panorama gets refreshed…
Quarterly MER Data Entry and Review: Keeping it Real

15 Quarterly MER Data Entry and Review: Keeping it Real

16 So… Quarterly MER Data Entry and Review: Keeping it Real
How can I get an earlier jump on reviewing quarterly results?
How do I run data quality and completeness checks?
For real though, where do I find HTS_TST_POS and yield calculations in DATIM?

17 PEPFAR Data System Improvements
Quarterly MER Data Entry and Review: Keeping it Real
Removing the black box: Adding the ability to view unsubmitted data in DATIM
All access data: Generating nightly datasets for review during "crunch time"
Building standard views/favorites in DATIM: HTS breakdowns by modality, yield calculations, and other key indicator tables to share with partners
Sharing what works: DATIM and Excel-based data quality review tools and processes

18 Types of Data Checks

19 Types of Data Checks
Data checks can be done at various levels:
OU level, PSNU level, Site level
Implementing Mechanism level, IM by PSNU level, IM by Site level
Higher-level checks (e.g., IM level or PSNU level) can be quicker, but they can hide underlying site-level issues. The more granular the level of check, the easier it is to take action to correct a problem.

Data checks can be done at a number of different levels: you could look at data for an indicator at the OU level, by Implementing Mechanism, by examining each site for each implementing partner, and so on. Performing checks at higher levels (e.g., OU, PSNU, IM, or PSNU x IM) may be quicker than drilling down to a lower level (e.g., Site or Site x IM), but checks at higher levels may hide issues. For example, imagine you are comparing TX_NEW and TX_CURR at the PSNU level. You see that this quarter TX_NEW = 1,357 and TX_CURR = 27,032. This seems reasonable given your program, so you move on to check other indicators. Had you compared TX_NEW and TX_CURR at the Site level (or Site x IM level), however, you would have noticed 14 sites that accidentally reported larger numbers for TX_NEW than for TX_CURR. Additionally, checks at a more granular level make corrective action easier. Imagine you compared the OU-level PMTCT_STAT Total Numerator with the sum of your age disaggregates for PMTCT_STAT and found the disaggregate sum much smaller than the numerator – before your partners can correct this, someone (you or the partner) must drill down to the Site x IM level to identify the specific instances where the numerator doesn't match the disaggregates.

20 Types of Data Checks
Disaggregate Completeness
MER Logic Checks
Checks Across Time Periods
Consistency/Variability
Programmatic Checks
Miscellaneous Checks

21 Types of Data Checks – Disaggregate Completeness
Disaggregate Completeness Checks – Check the completeness of disaggregates (e.g., age/sex, HIV status) compared to the overall total for that indicator (e.g., Total Numerator).

Disaggregate completeness is one of the most common kinds of data checks. For example, if the total numerator for VMMC_CIRC is 10,000, you want to check how close the sum of the disaggs is to 10,000. A number of DATIM favorites have been created to assess completeness, such as the one shown here for VMMC.
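As a rough illustration, a completeness check like this can be scripted once data are exported. The record layout and field names below are hypothetical – they are not the actual DATIM/Genie export schema – and the 95% threshold is an arbitrary example, not official guidance:

```python
# Hypothetical records from a DATIM export; field names are illustrative only.
records = [
    {"site": "Site A", "indicator": "VMMC_CIRC", "disagg": "Total Numerator", "value": 10000},
    {"site": "Site A", "indicator": "VMMC_CIRC", "disagg": "15-19, Male", "value": 4200},
    {"site": "Site A", "indicator": "VMMC_CIRC", "disagg": "20-24, Male", "value": 5300},
]

def disagg_completeness(rows, indicator, threshold=0.95):
    """Return (ratio, flagged): the sum of age/sex disaggs as a share of the
    Total Numerator, and whether that share falls below the threshold."""
    total = sum(r["value"] for r in rows
                if r["indicator"] == indicator and r["disagg"] == "Total Numerator")
    disagg_sum = sum(r["value"] for r in rows
                     if r["indicator"] == indicator and r["disagg"] != "Total Numerator")
    if total == 0:
        return None, False
    ratio = disagg_sum / total
    return ratio, ratio < threshold

ratio, flagged = disagg_completeness(records, "VMMC_CIRC")
# Here the disaggs sum to 9,500 against a Total Numerator of 10,000 (ratio 0.95).
```

The same pattern works for any indicator where disaggregates should roll up to a reported total.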

22 Types of Data Checks – MER Logic Checks
MER Logic Checks – Check the relationships within a MER indicator, or between two MER indicators, to see whether the logic from MER Guidance is followed. Examples of potential problems include:
TX_NEW > TX_CURR
TX_NEW was reported but TX_CURR was not reported
VMMC_CIRC was reported, but no results were reported for the VMMC service delivery modality of HTS_TST
PMTCT_STAT Numerator > PMTCT_STAT Denominator
PMTCT_ART_NEW ≠ TX_NEW (Pregnant disaggregate)

MER Logic Checks are based on MER Guidance – they assess the relationships within a MER indicator, or between two related MER indicators, to see whether the logic from MER Guidance is violated. Some are firm rules: you can never have TX_NEW greater than TX_CURR, because by definition TX_CURR is the number of people currently on treatment – both those newly enrolled this quarter (TX_NEW) and those enrolled in previous quarters. Other rules may indicate data quality problems, but there may also be reasonable programmatic explanations for why a rule was violated. It often depends on a) the level at which you're doing the check and b) the contextual situation of your program. For example, MER guidance (and good programming) suggests that if circumcisions are being performed (VMMC_CIRC), then testing should also be offered through the VMMC service delivery modality. But suppose two partners work in the same circumcision facility – one performing and reporting circumcisions (VMMC_CIRC) and a second doing the testing (HTS_TST VMMC modality). In that case, seeing VMMC_CIRC reported with no data in the HTS_TST VMMC modality would not be a problem at the Site x IM level. At the PSNU level, however, you would definitely expect to see HTS_TST data in the VMMC modality wherever VMMC_CIRC is reported.
Get audience to provide other possible examples of a business logic check
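For teams scripting checks outside Excel, the first rule above reduces to a per-site comparison. The data structure here is a made-up stand-in for a real export, not actual program data:

```python
# Illustrative site-level values; not real program data.
site_values = {
    "Kopanong": {"TX_NEW": 120, "TX_CURR": 2500},
    "Hillside": {"TX_NEW": 340, "TX_CURR": 210},  # violates TX_NEW <= TX_CURR
}

# Flag any site where newly enrolled exceeds currently on treatment.
violations = [site for site, v in site_values.items()
              if v["TX_NEW"] > v["TX_CURR"]]
# violations -> ["Hillside"]
```

The same pattern extends to the other rules listed above, such as numerator-versus-denominator comparisons.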

23 Types of Data Checks – Checks Across Time Periods
Checks Across Time Periods – Potential problems could include:
IM had targets at a site but did not report results
IM reported results at a site but did not have targets
IM reported results at a site last quarter but not this quarter
IM reported results at a site this quarter but not last quarter

There could be intentional programmatic shifts to or away from sites by different IMs – in which case this is not a problem. However, these patterns can also indicate data reporting issues.
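Comparing who reported across two quarters is essentially a set difference. A minimal sketch, with invented site x IM pairs:

```python
# Hypothetical (site, IM) pairs that reported results in each quarter.
q3_reporters = {("Site A", "IM 12345"), ("Site B", "IM 12345")}
q4_reporters = {("Site B", "IM 12345"), ("Site C", "IM 12345")}

stopped = q3_reporters - q4_reporters  # reported last quarter but not this one
started = q4_reporters - q3_reporters  # reported this quarter but not last
```

Either list is a prompt for follow-up, not proof of an error: a site may genuinely have been handed over between IMs.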

24 Types of Data Checks – Consistency/Variability
Consistency/Variability Checks – Assess the consistency (or inconsistency) of data across time or within an indicator.
Example 1: Was there an unexpectedly large percentage change from last quarter's results?
In Q3 an IM reported 175 TX_NEW at Kopanong site; in Q4 the IM reports 625 TX_NEW (a 257% increase from Q3). This could have a programmatic explanation OR could be a typo (it should have been 265).
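The quarter-over-quarter comparison in Example 1 is just a percent-change calculation with a flagging threshold; the 100% cutoff below is an arbitrary illustration, not official guidance:

```python
def pct_change(prev, curr):
    """Percent change from prev to curr; None when prev is zero or missing."""
    return (curr - prev) / prev * 100 if prev else None

change = pct_change(175, 625)  # Kopanong TX_NEW, Q3 -> Q4
flagged = change is not None and abs(change) > 100  # illustrative threshold
# change is roughly +257%, so this result would be flagged for review.
```

A flagged value is a question for the partner, not an automatic correction: large swings can be real.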

25 Types of Data Checks – Consistency/Variability (Part II)
Example 2: Was there a questionable amount of consistency from a partner? This suggests that a formula might have been applied to estimate results.
The Project Stoney IM provides quality improvement technical assistance at 50 treatment facilities. At every site, it reported that 40% of patients currently on treatment were males aged 15+ – the same age/sex distribution at all 50 sites.

Formulas could be applied in any situation where a percentage is calculated – for example, Viral Load Suppression or Treatment Retention.
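Suspicious uniformity can be flagged just as mechanically: if a share is identical across every site for an IM, a formula may have been applied rather than real counts reported. A minimal sketch with invented figures:

```python
# Hypothetical share of TX_CURR that is male 15+ at each site for one IM.
site_shares = {"Site 1": 0.40, "Site 2": 0.40, "Site 3": 0.40}

# Identical values across multiple sites rarely occur with real counts.
suspiciously_uniform = len(site_shares) > 1 and len(set(site_shares.values())) == 1
```

In practice you might loosen the test to "nearly identical" (e.g., within a small tolerance), since rounding in real data will break exact equality.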

26 Types of Data Checks – Programmatic Checks
Programmatic Checks – Given what you know about your program, are there results that are very unexpected? Are there outliers?
Unexpectedly high/low testing yield (e.g., Index testing modality yield very low)
Too small (or large) a percentage of TX_NEW or TX_CURR are peds (or another group)

Another example of an outlier would be an unexpectedly low testing yield for a high-risk population, such as female sex workers in an urban area.

27 Types of Data Checks – Miscellaneous Checks
DSD reported instead of TA (or the opposite)
Implementing Mechanism reported results in the wrong geographic area (may have accidentally reported data in a site with a similar name)
Other examples? These are not exhaustive lists!

28 Tools for Reviewing Data Quality

29 Tool #1: DATIM Favorites

30 Tool #1: DATIM Favorites
Focus on data completeness
Available to DATIM users at all levels of the organizational hierarchy
Available in DATIM at the beginning of every quarter's data entry period

The DATIM Favorites produced by OGAC are primarily focused on data completeness. New favorites are available at the start of each quarter when the data entry period opens. You can also create, save, and share your own favorites focused on other aspects of data review.

31 Tool #1: DATIM Favorites
Naming Convention: The DATIM Favorites produced by OGAC each quarter use a standard naming convention, so you can use the search box to find them even if you haven't seen an updated favorites list yet – just follow the naming convention in your search.

32 Tool #1: DATIM Favorites
This is just an example of a favorite in DATIM

33 Tool #2: DATIM Favorites / Excel Hybrid

34 Data Cleaning Tool Option #2: Using DATIM Favorites with Excel
What is it? An Excel template with built-in data checks that uses exports from DATIM favorites. Excel tools can check for issues like:
Completeness review
Linkage-to-treatment proxy measures
Age/sex disaggs linkage to care
Yield estimates – by modality, by SNU, etc.
% achievement review
Much, much more

Many of you have already developed Excel tools and templates that you use in combination with DATIM Pivot Table Favorites. You save favorites, export those pivot tables each time, and paste them into Excel templates with saved formulas and conditional formatting that flag potential programmatic or data quality issues. These sorts of tools can help assess a number of things – the list here is certainly not exhaustive, but it samples the kinds of checks you can easily run. Some countries, such as Mozambique and Ethiopia, have developed really great templates. You can also reach out to your SI Advisor for example templates to adapt for your program.

35 Data Cleaning Tool Option #2: Using DATIM Favorites with Excel
Example with Treatment Data Checks – completeness review by district with data from 12345: PALS Partner

This is an illustrative example of part of a tool: a large DATIM pivot table pasted into an Excel file, in this case looking at Q2 & Q3 TX_NEW and TX_CURR. Some best practices to note: the Excel template itself contains the name of, and the link to, the favorite in the top corner, along with specific notes about what (if anything) needs to be adjusted in the favorite before exporting data. As you scroll to the right in the Excel file, you'll see the section where formulas have been added to run checks and assessments on the pivot table data.

36 Data Cleaning Tool Option #2: Using DATIM Favorites with Excel
Example with Treatment Data Checks – completeness review by district with data from 12345: PALS Partner

Scrolling to the right past the pasted pivot table data, you'll see this team has added a series of checks and conditional formatting. In this example, formulas trigger a 1 when TX_CURR > TX_NEW, when TX_NEW age/sex disaggs are <= the TX_NEW Numerator, and when TX_CURR age/sex disaggs are <= the TX_CURR Numerator. There are also columns calculating the degree of variance in results from one quarter to the next to help flag major outliers, and NET_NEW is calculated for programmatic review.

37 Data Cleaning Tool Option #2: Using DATIM Favorites with Excel
Pros: Countries can develop their own templates to meet their own priorities; many DATIM Favorites are already established.
Cons: Easy to make copy/paste or formula errors; labor intensive to export many pivots.
Many countries have developed their own tools – share ideas, templates, and best practices with colleagues across PEPFAR.

The great thing about this method is that it's extremely flexible: you can customize pivot tables, use pre-existing pivot favorites, and build your own Excel templates for whatever checks your team prioritizes. But it is very easy to make errors with Excel templates – pasting into the wrong cells, or mistakes in formulas or conditional formatting, happen easily when you're working quickly on tight timelines. As a rule of thumb with error-prone tools like these, have a second person take a quick look at your work as a quality check (QC). If you don't have a tool like this and would like to develop one (or you have one you'd like to refine), your two best resources are 1) your SI Advisor and 2) each other. Use this week to talk to colleagues from other countries, learn their best practices, and share ideas and templates. Mozambique and Ethiopia are two great examples of countries that have developed templates.

38 Tool #3: Site x IM Genie Exports from DATIM

39 Genie Exports from DATIM
Q4 UPDATES:
Ability to export unapproved data from Genie
New format (like ICPI's Site x IM Fact View Datasets)
Choose new or current format
Build templates in Excel or use statistical software

Many countries export data using Genie for analysis, and you can use Genie exports to do your own checks. There are three important updates to the DATIM Genie export tool. First, starting in Q4, users can export unapproved data – meaning you can review your partners' data without having to push it up and down the ladder of approvals. Second, you'll have a choice of export format: either the current format you're used to seeing, or a new format modeled after ICPI's Site x IM Fact View Datasets. Third, in the past, users who wanted to export large amounts of data often had to do many exports to get everything; now the Site x IM export lets you export an entire country dataset in a single export.

40 Current GENIE Format vs. NEW Site x IM Format
Results or targets from multiple periods | Results and targets for FY17 only
Formatted using DATIM Data Elements | Data elements (e.g., age, sex, modality) in separate columns for easier use
No calculated indicators | Includes calculated indicators to make analytics easier (e.g., PMTCT_STAT_NEWLY_IDENTIFIED_POSITIVE)
No MCAD | Most Complete Age-Sex Disagg (MCAD)
No standardized disagg | Standardized HTS disagg to make analyzing HTS_TST easier
New data available almost immediately | ~24-hour delay before new/altered data are available

Two formats are available for export: the current Genie export format and the NEW Site x IM format modeled after the ICPI Fact View Dataset (newly launched for Q4).

41 Current GENIE Format vs. NEW Site x IM Format
All results & targets since FY15 are available | Results and targets for FY17 only
Formatted using DATIM Data Elements | Data elements (e.g., age, sex, modality) in separate columns for easier use
No calculated indicators | Includes calculated indicators to make analytics easier (e.g., PMTCT_STAT_NEWLY_IDENTIFIED_POSITIVE)
No MCAD | Most Complete Age-Sex Disagg (MCAD)*
No standardized disagg | Standardized HTS disagg to make analyzing HTS_TST easier
New data available almost immediately | ~24-hour delay before new/altered data are available

*MCAD combines both fine and coarse age bands as appropriate – an easy way to combine fine and coarse data for analysis when neither is complete.

For more information about the Site x IM format, see the User's Guide and Release Notes posted on the Genie site. For additional guidance on using the Site x IM exports effectively, users can also see the training materials for the ICPI Fact View Datasets (these extracts mirror the Fact View structure); the Fact View trainings are available in the ICPI Data Store in the MER > Training folder.

42 New Q4 Genie Site x IM Exports
Pros: Completely customizable methods of assessing data; structured to make manipulation in Excel easier; you can create reusable templates & pivot tables that refresh easily.
Con: Learning a new data structure & developing new templates…who has time for that?!

Once you export a Site x IM Genie dataset, you can create pivot tables and templates that are easily refreshed by replacing the original data with data from a more recent Genie export. As a best practice, always include a timestamp (the date/time the data was pulled from Genie) in your file and/or in the file name.
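The timestamp best practice can be automated when saving an export; the file-name pattern here is just an example, not a required convention:

```python
from datetime import datetime

# Embed the pull date/time in the saved file name so every template refresh
# records exactly when its data came out of Genie.
stamp = datetime.now().strftime("%Y-%m-%d_%H%M")
filename = f"genie_site_by_im_export_{stamp}.csv"
```

Sorting a folder of such files by name then also sorts them chronologically, which makes it easy to spot the most recent pull.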

43 Tool #4: Data Review Tool (DRT)

44 Data Review Tool (DRT)
The DRT is an Excel-based tool created by ICPI for data cleaning after the initial data entry period closes. Here is a screenshot of what the DRT looks like. It contains multiple data quality checks on different indicators; you can see how many sites violated a check, along with the total volume of results for those sites.

*Business Logic Checks are what we previously referred to as the MER Logic Checks bucket.

45 DRT
The DRT comes prepopulated with a number of key data quality checks already done. Only the site/IM combinations that did not pass a check are shown, and you can drill down to see exactly where the potential problems occurred.

46 DRT
Pros: A pre-populated tool that runs a large number of data quality checks for you – a lower level of effort on your part. The appendix serves as a good list of example data checks. Can be used to identify consistent reporting errors from particular partners.
Con: Released too late to be truly useful! We are working on a solution to make these accessible earlier.

Currently the DRTs are produced after the initial data entry period closes each quarter, while most data cleaning and review happens before that point. We are working on solutions to make this tool (or something like it) available earlier in the process – ideally within DATIM itself – and hope to implement some of these changes early in FY18. The DRT can still be used to identify data issues that can be resolved during the data cleaning/deduplication periods each quarter. Previous DRTs can also be reviewed to identify persistent reporting issues that partners make each quarter: for example, if the Partner Stoney IM frequently violated a data quality check on HTS_TST in Q3, you could reach out to that partner to explain the relevant MER reporting requirements for HTS_TST before they finish entering their Q4 data.

47 DRT
Available on pepfar.net in the ICPI Data Store:
Home > HQ > Interagency Collaborative for Program Improvement (ICPI) > Shared Documents > ICPI Data Store > ICPI Approved Tools > Data Review Tool

You can find the Data Review Tool (along with all other ICPI tools) in the ICPI Data Store on pepfar.net. The first tab of the tool contains a user's guide that explains how to use and interpret it.

48 Developing Processes Around Data Checks

49 Developing Processes Around Data Checks
Each country team should have SOPs in place for assessing data and for following up with partners to make corrections/adjustments.

50 Developing Processes Around Data Checks
What Checks? – What checks for each indicator? At what level (PSNU, Site x IM)? Prioritize checks (high/medium/low).
Who Checks? – Who is responsible for each check? Divide labor by indicator, geographic area, partner, etc. USG & partner.
When? – Deadlines/frequencies for each kind of check.
How? – What tools should be used? DATIM Favorites, Excel with DATIM exports, Genie exports, DRT.
Feedback – How is feedback provided to partners? Who provides feedback? Who follows up?

Teams need to prioritize which checks they're going to focus on. Which checks are most critical for your program? At what level will you perform different checks (e.g., PSNU x IM or Site x IM)? Remember that different checks can be done at different levels depending on your need.

51 Developing Processes Around Data Checks
Who is responsible for each check? This should not be limited to SI staff or to a single person – clearly define who is responsible for each check, and think about how you want to divide the work. Also remember that these checks are not limited to USG! Yes, USG needs to be doing these checks, but you can also turn to your partners and set expectations for the checks they need to do before they submit data. In some cases, USG teams ask their partners to take screenshots of what they're doing in DATIM or to provide documentation of the checks they have done, or they provide partners with templates for the data review they should be doing.


53 Developing Processes Around Data Checks
What tools should be used for different checks? Remember that different tools (DATIM Favorites, Excel with DATIM exports, Genie exports, the DRT) can be used for different kinds of checks.

54 Developing Processes Around Data Checks
You need a clearly defined process for how feedback will be provided to implementing partners and who will provide it. Critically, make sure it is clear who follows up to ensure that feedback has been addressed by partners. For follow-up, think through each of these steps again: what checks are you following up on, who does the follow-up, when must it be done, and how (using what tools)?

55 Developing Processes Around Data Checks
DEDUPLICATION

On top of everything else, you then have to figure out how deduplication efforts fit into this process – one more layer of complication.

56 Developing Processes Around Data Checks
Document your process!

Finally, make sure you document your process. Documentation is a critical part of any process.

57 Summary and conclusion

58 Summary
During today's session we discussed:
The PEPFAR Data Reporting Cycle for MER (data collection, submission, approvals, review, and cleaning)
The importance of data review processes and best practices to ensure access to high-quality data
Various types of data checks
Where to find standardized guidance and tools related to Data Quality & Review

59 Questions?
Notes: Provide an opportunity for learners to ask any questions they may have about the content before moving on to further learning resources and key contacts. You may also encourage learners to ask questions throughout the course, but decide how you want to handle this given the time you have to present and the amount of content you need to cover.

60 Key Contacts
For additional information about the data life cycle, data checks, tools to help check data, and developing/strengthening your data review process: CONTACT YOUR SI ADVISOR.

Your SI Advisor is the best person to reach out to with questions about the data life cycle, data checks, the different tools for reviewing data, and strengthening your data review processes. They can assist you or connect you with other assistance.

61 Thank You!
Speaker Notes: Don't forget to thank everyone for their time, participation, and attention during the session! Leave on a high note!

62 Logistics Needs for this Course
Room setup/configuration (theatre style, semi-circle, classroom, etc.): Any configuration
Computer needs for learners (certain programs): None
Clicker? If the presenter wants one
White board / flip chart / markers?
Specific materials for interactive exercise (post-its, pens, markers, tape, etc.)?
Notes: This slide is intended to capture logistics information for the delivery of this course.

63 Software/Program/Technical Needs for this Course
Software (what program(s) should learners have on their computers, e.g., GIS software, Excel?): None, though learners wanting to use all the tools mentioned in this plenary would need Excel.
Access (what logins to PEPFAR-supported systems, datasets, etc., should learners have?): None, though learners wanting to use all the tools mentioned would need DATIM accounts, and a pepfar.net account to access the DRT.
Additional technical requirements for participation: None
Notes: This slide is intended to capture software/program/technical information for the delivery of this course.

64 Trainer Profile
Background: One of the goals of the 2017 PALS is to empower country teams to take back what they've learned at the Summit to their home teams. This slide captures the skillset a person would need to deliver this training in the future.
Level of familiarity with subject matter: The trainer should be familiar with the MER reporting cycle, MER reporting guidance, and managing partner reporting.
Recommended years of experience with the PEPFAR program: 1+
Recommended certifications: N/A
Specific skills required: Proficiency with DATIM and Excel is helpful for answering more advanced questions about the tools (DATIM favorites, DATIM/Excel hybrids, and the Site x IM Genie extract).
Access needs to PEPFAR systems: This session can be taught using only this PowerPoint.

Note: This plenary was followed by an optional evening session in which participants broke into 5 groups to discuss a topic of interest in greater detail. The optional session was less formal, so no PowerPoint presentations or materials are available. Breakout topics included:
Possible MER data quality checks (understanding different data quality checks in greater detail, plus a general overview of resources for MER data cleaning)
Data cleaning using just DATIM favorites
Data cleaning using DATIM favorites paired with Excel templates
Data cleaning using the new Site x IM Genie extracts
Data cleaning using the Data Review Tool

