NPT: Potential uses in Quantitative Research. Tracy Finch, Institute of Health and Society, Newcastle University.

Objectives
 To describe the development of a structured research instrument (questionnaire) to measure constructs of the NPM/T in relation to a particular intervention (e-health)
 To consider wider issues for the use of NPT in quantitative research

Understanding Normalisation in relation to e-health
 ‘Understanding the implementation and integration of e-health services’. Mair, May, Finch, Murray, O’Donnell, Wallace et al. NHS SDO funded study (April 2006 – Jan 09).
 Focused on barriers and facilitators to the use of e-health systems by health professionals within the NHS
 One work package (out of 4) involved developing a structured research instrument to assess ‘readiness’ for e-health
 The NPM and NPT framed data collection and analysis throughout the project (began with the NPM, finished with NPT)

WP3: ‘Technology Adoption Readiness Scale’ (TARS)
 Objective: To develop a structured, predictive instrument to test the contextual readiness of a health care setting for uptake and routine use of a specific e-health system by health professionals
 Contextual readiness: a combination of individual and organisational factors
 Perspective: professionals

TARS Study Structure
 Phase 1: Instrument development – included an online survey of experts
 Phase 2: Generic instrument – ‘factors’ rated in importance
 Phase 3: Specific instrument – ‘questions’ about the specific e-health system being used

TARS Phase 1: Item development
 Aim: to develop an initial item set, using expert review
 Generate potential items by translating construct statements into plain language and single dimensions

Item development: Example
 Step 1 – Factor/issue: Concerns about security, confidentiality and standards
 Step 2 – Question: Does X affect your confidence in the knowledge available to you? (Relational Integration)
 Step 3 – Question for the e-health study: How important are the following factors in affecting the use of e-health systems in the everyday work of health professionals: …how much the e-health system affects the users’ confidence in their ability to conduct their work safely and efficiently (scale: Not at all important – Extremely important)

Phase 1: Expert Survey
Survey:
 27 items for inclusion (IW, RI, SW, CI)
 5-point scale of ‘importance’ (+ ‘don’t know’)
 Free-text space for ‘any factors missed’
 Conducted electronically
Sample:
 Authors of reviews relevant to the field
Recruitment:
 Invitation

Phase 1: Results
Response:
 63 completed surveys (25% of 252 invitations presumed received)
Data analysis:
 Descriptive (mean ratings, correlations)
Findings:
 Importance of items generally highly endorsed
 Most correlations low-to-moderate (little redundancy), and higher within than across NPM constructs
 Free-text responses useful in identifying further factors to include
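As a rough illustration of this kind of descriptive analysis, the sketch below computes mean importance ratings and pairwise Pearson correlations over a handful of invented items and responses. The item names, construct labels and ratings are hypothetical, not the study's data.

```python
# Minimal sketch of the Phase 1 descriptive analysis: mean importance
# ratings per item, plus pairwise inter-item correlations. All item
# names and ratings below are invented for illustration only.
from statistics import mean


def pearson(x, y):
    """Pearson correlation coefficient between two equal-length lists."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den


# Each list holds one hypothetical item's 1-5 importance ratings,
# one rating per expert respondent.
ratings = {
    "IW1_workability":  [5, 4, 5, 4, 5, 3],
    "RI1_confidence":   [4, 4, 5, 3, 5, 3],
    "SW1_role_clarity": [3, 5, 2, 4, 3, 5],
}

# Mean rating per item: uniformly high means suggest broad endorsement.
means = {item: mean(vals) for item, vals in ratings.items()}

# Pairwise correlations: low-to-moderate values suggest little redundancy
# between items, the pattern reported for the expert survey.
items = list(ratings)
corrs = {
    (a, b): pearson(ratings[a], ratings[b])
    for i, a in enumerate(items) for b in items[i + 1:]
}

for item, m in means.items():
    print(f"{item}: mean = {m:.2f}")
for (a, b), r in corrs.items():
    print(f"r({a}, {b}) = {r:.2f}")
```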

Phase 2: Development of TARS Generic
Aimed to:
 Refine the factor set using the expert survey data
 Test ‘relative importance’ from the perspective of professionals
Revisions to the item set (31 items):
 Some items re-worded, dropped or combined
 New items added from free-text responses (Phase 1)
 New items for NPT: Coherence, Cognitive Participation, Reflexive Monitoring

Phase 2: Testing TARS Generic
 Survey of health professionals’ perspectives on the importance of factors affecting uptake and use of e-health
 Sample: Regional NHS Hospitals Trust (potentially 10,000+ respondents), with extensive use of e-health
 Recruitment: via site contact (Technical Director)
 Response: extremely low (51 responses)
 Analysis: not particularly useful, but suggestive of different patterns of response between experts and professionals

Phase 3: TARS Specific
Aimed to:
 Develop a version containing questions framed for an individual’s assessment of a particular e-health system
 Test through data collection at 2 different sites:
Site 1: Community nurses using PDAs (relatively new)
Site 2: Established use of several e-health systems as the basis of work (algorithms, information resources, etc.)

TARS Specific Instrument
 Demographics (e.g. professional role, system(s) used, length of time using)
 Comfort with using computer-based technology
 TARS items: 30 items rated from ‘agree strongly’ to ‘disagree strongly’ (7-point) (items on hand-out)
 ‘Normalisation’ questions:
 Whether the system was ‘not at all’, ‘partly’, or ‘completely’ in routine use
 Perceived likelihood of it becoming routine (5-point scale)
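A hypothetical sketch of how one respondent's record and per-construct summaries might be represented. The item codes, construct groupings, scoring direction and values are invented for illustration; they do not reproduce the instrument's actual layout.

```python
# Hypothetical representation of a single TARS Specific response and a
# per-construct summary. Item codes (CI1, RI1, ...) and all values are
# illustrative assumptions, not the instrument's real item set.
from statistics import mean

# 7-point agreement scale: 1 = 'disagree strongly' ... 7 = 'agree strongly'.
response = {
    "role": "Community Staff Nurse",
    "comfort_with_technology": 6,   # separate single-item rating
    "items": {                      # item code -> rating
        "CI1": 6, "CI2": 5,         # e.g. Collective Action items
        "RI1": 4, "RI2": 5,         # e.g. Relational Integration items
        "CO1": 7,                   # e.g. Coherence item
    },
    "routinisation": "partly",      # 'not at all' / 'partly' / 'completely'
}


def construct_scores(items):
    """Mean rating per construct, taking the alphabetic prefix of each
    item code (e.g. 'CI' from 'CI1') as the construct label."""
    by_construct = {}
    for code, rating in items.items():
        by_construct.setdefault(code.rstrip("0123456789"), []).append(rating)
    return {c: mean(vals) for c, vals in by_construct.items()}


print(construct_scores(response["items"]))
# → {'CI': 5.5, 'RI': 4.5, 'CO': 7}
```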

Phase 3: Sample (figures are % (n) unless stated)

Site 1:
 Response: 46/243 (19%)
 Age: 72% aged 45+; 29% aged <35
 Sex: all female
 Working role: Community Enrolled Nurse 0 (0); Community Staff Nurse 28 (13); District Nursing Sister/Charge Nurse 61 (28); Practice Development Nurse 9 (4); Senior Nurse 2 (1)
 Perceived level of routinisation of e-health: Not at all 0 (0); Partly 68 (30); Completely 32 (14)

Site 2:
 Response: 231/1351 (17%)
 Age: 40% aged 45+; 29% aged <35
 Sex: 86% female
 Working role: Call handlers 47 (109); Nurse advisors 24 (56); Team leaders 9 (21); Health Information advisors 3 (7); Other 16 (38)
 Perceived level of routinisation of e-health: Not at all 1 (2); Partly 17 (35); Completely 83 (174)

Key results: Normalisation Perceptions
 Non-parametric (cross-tab) analysis
 Groups perceiving e-health as ‘completely’ rather than ‘partly’ routine differed in the expected direction:
 on 12/30 items at Site 1 (CI=4; RI=4; IW=1; Co; CP; RM)
 on 9/30 items at Site 2 (CI=3; RI=3; SW=1; Co; RM)
 At Site 2, a comparison of call handlers with nursing & related staff indicated differences on 4 items
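The cross-tab comparison can be illustrated with a minimal sketch: a Pearson chi-square statistic on a 2×2 table (routinisation group × dichotomised item response). The counts are invented, and a real analysis of tables this size might instead use Fisher's exact test; the study's actual tables and test choice are not reproduced here.

```python
# Illustrative cross-tabulation analysis: do respondents who rate e-health
# as 'completely' routine answer an item differently from those who rate
# it 'partly' routine? Counts below are invented for illustration.

def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 table of observed counts,
    given as [[a, b], [c, d]] (rows = groups, columns = agree/disagree)."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    stat = 0.0
    for obs, row, col in ((a, row1, col1), (b, row1, col2),
                          (c, row2, col1), (d, row2, col2)):
        expected = row * col / n
        stat += (obs - expected) ** 2 / expected
    return stat


# Hypothetical counts: the 'completely routine' group agrees with a
# CI item more often than the 'partly routine' group.
observed = [[25, 5],    # completely: agree, disagree
            [15, 15]]   # partly:     agree, disagree

stat = chi_square_2x2(observed)
# With df = 1, compare against the 3.84 critical value for p = .05.
print(f"chi-square = {stat:.2f}")
```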

Summary of Results of TARS
 Development of an NPM/T-based questionnaire for assessment of an individual’s perceptions of factors relating to the normalisation of e-health
 Operationalisation of NPM/T constructs into plain-language questions
 Support for the NPM/T in terms of constructs – patterns of relations between items
 Potential of items representing NPM/T constructs for discriminating between levels of perceived normalisation of e-health

TARS – Limitations
 Low response rates – insufficient for scale development work (statistical properties)
 Constraints of ‘real’ environments:
 ‘Readiness’ assessment is dependent on timing and site characteristics
 Lack of access to participants/control over survey reminders, etc.
 TARS should be used/tested in further studies, in sites where the predictive utility of TARS can be assessed prospectively

Using NPM/T in Quantitative Research: Wider issues
 Potential benefits
 Challenges:
1. Translating theory into plain language
2. Addressing multiple perspectives
3. Standardisation vs specification
4. Operationalising ‘normalisation’
 Summary

Using NPM/T Quantitatively: Potential Benefits
 The ‘how much?’ question: structured surveys have the potential to collect data efficiently, and on a large scale
 The ‘what is likely to happen?’ question: surveys, used prospectively, may have some predictive utility with respect to outcomes
 Potentially useful in comparative research
 Surveys are appealing to practitioners and researchers – facilitating take-up of the theory!

Challenge 1: Translating constructs into plain language
Example: How important are the following factors in affecting the normalisation of e-health…
“…the extent to which organizational effort is allocated to an e-health system in proportion to the work that the system is intended to do” (CI)
Problem: Multi-dimensional constructs are difficult to capture in single questions/statements
Possible solutions?
 Clear definitions of terms (e.g. effort), or
 Establish understanding of terms of reference (e.g. agreement on what the system is intended to do) and use several questions to build understanding

Challenge 2: Addressing multiple perspectives
 Questions not to be framed around ‘intention’ – instead, they should reflect judgements about others/the organisation
 Which stakeholder groups should be included? How do we combine/weight their ratings?
 Need for customising questions (or question sets) for different stakeholder groups

Challenge 3: Standardisation vs specification
Quantitative validation of NPT would be facilitated by:
 Focused effort on scale development in appropriate settings and with adequate resources, and
 Comparative analysis of quantitative research using the NPT survey across different settings
However…
 Can we develop a useful ‘generic’ NPT-based structured survey instrument that is usable across settings?
 (And if we do, are we denying the complexity that the NPT embraces?)

Challenge 4: How to operationalise ‘normalisation’?
 Does the NPM/T yet define ‘normalisation’ adequately for quantitative measurement of it as an ‘outcome’?
 Are ‘perceptions’ of how much an intervention/technology/practice has become ‘part of everyday work/life’ sufficient to test the constructs of the model? i.e. are ‘objective’ measures also needed?

Summary & Final Thoughts
 Quantitative use of NPT brings challenges, but the potential benefits are huge
 TARS represents a useful starting point in developing quantitative use of the NPM/T
 Need for more focused effort on scale development and validation (MRC Methodology Programme grant planned)
 Other quantitative studies are underway (e.g. May, Rapley et al., ‘BSPAR Survey’; Newton et al., Midwifery, Melbourne)