Presentation transcript:

5 An mHealth M&E Maturity Model

6 What are the main questions being asked at each stage?
Feasibility: Will the system work as intended in a given context?
Usability: Can the system be used by the intended end-user?
Efficacy: What is the benefit attributed to the system when delivered optimally?
Effectiveness: What can the system achieve in uncontrolled, real-world settings?
Implementation science: How can uptake, integration and sustainability of systems be improved for a given context, including policies and practices?

7 A Step-Wise Planning Process for Digital Project Evaluation

8 A Step-Wise Planning Process for Digital Project Evaluation

9 Emphasis on M&E Strategies to Support Anticipated Claims
Illustrative claims by stage of maturity
Early
  Technology: Prototypes are functional and usable.
  Intervention: Implementation protocols are utilized as intended by users.
Mid
  Technology: Technology withstands testing under optimal field circumstances.
  Health: Health improvements demonstrated on a small scale, under optimal circumstances, warranting further testing.
Advanced
  Health services delivery at moderate-scale implementation in a non-research setting is determined to be: high quality, cost-effective.

10 Monitoring and Evaluation are Interlinked Processes

11

12 Which Design is Appropriate for Your Project at THIS TIME?

13

14 Other important Evaluation “planning” sections
Developing a Conceptual Framework
Selecting a Study Design
Developing the M&E Plan
Thinking through HR, Resources, Timeline

15 Monitoring – What is Different about Digital?

16 Components of Digital Health Monitoring
Functionality – Does the system operate as intended?
Stability – Does the system consistently operate as intended?
Fidelity – Do the realities of field implementation alter the functionality and stability of the system, changing the intervention from that which was intended?
Quality – Is the content and delivery of the intervention of high enough quality to yield intended outcomes?
Performance – How well, and how consistently, are the users delivering the intervention?

17 Functionality – What to Monitor:
Does the system meet the requirements outlined in the SRS (software requirements specification)?
Does the system meet the needs of the health intervention?
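To make the SRS check concrete, a minimal sketch of verifying one functional requirement automatically; the requirement itself, the field names and the message format are illustrative assumptions, not from the deck.

# Hypothetical functionality check: assume one SRS requirement says a reminder
# SMS must be generated for every client whose visit falls within the next 3 days.
from datetime import date, timedelta

def due_for_reminder(visit_date, today, window_days=3):
    """Return True when the visit falls within the reminder window."""
    return today <= visit_date <= today + timedelta(days=window_days)

def build_reminders(clients, today):
    """Produce one reminder message per client who is due (illustrative format)."""
    return [
        {"phone": c["phone"], "text": f"Reminder: visit on {c['visit_date']}"}
        for c in clients
        if due_for_reminder(c["visit_date"], today)
    ]

# A tiny self-check against the stated requirement.
clients = [
    {"phone": "+255700000001", "visit_date": date(2024, 5, 3)},
    {"phone": "+255700000002", "visit_date": date(2024, 6, 20)},
]
reminders = build_reminders(clients, today=date(2024, 5, 1))
assert len(reminders) == 1 and reminders[0]["phone"] == "+255700000001"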

18 Stability – What to Monitor:
What is the failure rate of SMS messages from the server side?
If there is a system UI, how often are there unexpected application closes, crashes or forced quits?
How responsive is the digital health system under both normal and anticipated peak conditions for data loads?
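As an illustration, a minimal sketch of how two of these stability indicators might be tallied, assuming a hypothetical server-side message log and app event stream (formats and field names are assumptions).

# Hypothetical stability indicators computed from a server-side message log.
from collections import Counter

sms_log = [
    {"id": 1, "status": "delivered"},
    {"id": 2, "status": "failed"},
    {"id": 3, "status": "delivered"},
    {"id": 4, "status": "failed"},
]
app_events = ["open", "crash", "open", "force_quit", "open"]

status_counts = Counter(e["status"] for e in sms_log)
failure_rate = status_counts["failed"] / len(sms_log)                 # server-side SMS failure rate
crashes = sum(1 for e in app_events if e in ("crash", "force_quit"))  # unexpected closes

print(f"SMS failure rate: {failure_rate:.0%}")
print(f"Crashes / forced quits: {crashes}")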

19 Fidelity – What to Monitor:
Fidelity combines technical functionality and stability with the external factors that come into play during live field implementation:
Technical: Does the server experience uptime interruptions?
External: Is the device functional, charged, and neither lost nor stolen?
User: Can workers operate the digital health system as intended, outside the context of training?
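A minimal sketch of one technical fidelity indicator, assuming periodic heartbeat checks against the server; the check interval and data are illustrative.

# Hypothetical fidelity indicator: server uptime from periodic heartbeat checks
# (True = server responded). Data below stands in for, e.g., hourly pings.
heartbeats = [True, True, False, True, True, True, False, True, True, True]

uptime_pct = 100 * sum(heartbeats) / len(heartbeats)
interruptions = sum(1 for prev, cur in zip(heartbeats, heartbeats[1:])
                    if prev and not cur)  # transitions from up to down

print(f"Uptime: {uptime_pct:.0f}%  Interruptions: {interruptions}")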

20 Quality – What to Monitor:
User capability: Are users entering information correctly? Are there any knowledge gaps?
Intervention content: Are the services, messages and other content being delivered of high quality?
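A minimal sketch of a data-entry correctness check, assuming hypothetical fields and plausibility rules; in practice the rules would come from the intervention's data dictionary.

# Hypothetical data-entry quality check: flag records whose values fall outside
# plausible ranges, as a proxy for "are users entering information correctly?".
RULES = {
    "age_years": lambda v: 0 <= v <= 110,
    "weight_kg": lambda v: 1 <= v <= 200,
    "phone":     lambda v: str(v).startswith("+") and len(str(v)) >= 10,
}

def entry_errors(record):
    """Return the fields in a record that violate a plausibility rule."""
    return [field for field, ok in RULES.items()
            if field in record and not ok(record[field])]

records = [
    {"age_years": 34, "weight_kg": 62, "phone": "+255700000001"},
    {"age_years": 340, "weight_kg": 62, "phone": "0700000002"},  # two entry errors
]
for i, rec in enumerate(records):
    errs = entry_errors(rec)
    if errs:
        print(f"record {i}: check fields {errs}")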

21 Performance – What to Monitor:

22 Data Quality Assessment Process
STEP 1: Alignment of available data with program claims
  Identification of program claims
  Indicators
  Data sources
  Alignment of indicators with data sources
  Summary of recommendations (1)
STEP 2: Data source mapping
  Understand how the data are collected, collated, analyzed and used for decision-making
  Visual illustration of the flow of data at each level of the health system
  Summary of recommendations (2)
STEP 3: Data management protocol and data quality
  Data collection
  Data storage
  Data analytics/dashboard
  Data management
  Data use
  Summary of recommendations (3)
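A minimal sketch of the Step 1 alignment check, with hypothetical claims, indicators and data sources; it flags any claim whose indicators lack a mapped data source.

# Hypothetical Step 1 check: every program claim should have at least one
# indicator, and every indicator at least one data source. Names are illustrative.
claims_to_indicators = {
    "CHWs complete more timely home visits": ["visits_within_48h_pct"],
    "Clients receive reminder messages":     ["reminders_sent", "reminders_delivered_pct"],
}
indicator_sources = {
    "visits_within_48h_pct": ["app_visit_log"],
    "reminders_sent":        ["sms_gateway_log"],
    # "reminders_delivered_pct" has no mapped source yet
}

for claim, indicators in claims_to_indicators.items():
    missing = [i for i in indicators if not indicator_sources.get(i)]
    status = "aligned" if not missing else f"missing data source for {missing}"
    print(f"{claim}: {status}")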

23 Data Quality Assessment Process

24 Data Quality Assessment Process

25 mERA: mHealth Evaluation, Reporting and Assessment Guidelines

26 Guidelines to complement PRISMA / CONSORT
A pragmatic approach that promotes high-quality reporting of mHealth innovation research across varied study designs, to facilitate evidence synthesis and the development of guidance.
Domain 1: Research Methodology Reporting
Domain 2: Essential mHealth (Technology, Functionality, Delivery) Reporting

Domain        Description                                   No. of criteria
Domain 1.1    General Reporting and Methodology Criteria    23
Domain 1.2    Quantitative Criteria                         4
Domain 1.3    Qualitative Criteria                          3
Domain 2      mHealth Criteria                              14

