Information Assessment Method What is it? How does it help research in health sciences informatics? Roland Grad MD MSc Pierre Pluye MD PhD Vera Granikov.


1 Information Assessment Method What is it? How does it help research in health sciences informatics? Roland Grad MD MSc, Pierre Pluye MD PhD, Vera Granikov MLIS, Janique Johnson-Lafleur MA

2 Roland Grad MD. Consultant: financial relationship with Lexi-Comp. No financial relationship with any other electronic resources.

3 Objectives
1. Describe the Information Assessment Method (IAM)
2. Articulate why our changing knowledge infrastructure needs a validated tool like IAM
3. Assess how IAM was used to explore connections between the push and pull of clinical information

4 Sir William Osler: “The hardest conviction to get into the mind of a beginner is that the education upon which he is engaged is not a college course, not a medical course, but a life course, for which the work of a few years under teachers is but a preparation.”

5 http://www.flickr.com/photos/25388219@N02/3274332856/

6 [Diagram] Optimal care vs. actual care: clinician, patient & family, evidence, the CME 'system', and the health care system

7 Haynes, R.B. ACP Journal Club 2006

8 Research / Practice. Electronic resources: evaluation? The Information Assessment Method

9 Information Assessment Method (IAM)

10 Main purpose of IAM: systematic & comprehensive assessment of clinical information
- relevance
- cognitive impact
- use
- patient health outcomes

11 IAM: Potential applications
1. Compare electronic knowledge resources
2. Maintain quality of electronic knowledge resources
3. Document brief self-directed e-learning
Ecological Momentary Assessment

12

13 Three versions of IAM
1. PUSH context, e.g. rating an email alert (>585,000 POEM ratings from >3,700 MDs)
2. PULL context, mobile device, database, e.g. rating synopses of clinical research
3. PULL context, web, e-Book, e.g. rating highlights retrieved from book chapters

14 InfoPOEMs: Patient-Oriented Evidence that Matters. Synopses of research-based evidence, filtered and graded, to alert family physicians to the latest developments.

15 1. READ: clinical question, bottom line, reference, study design, funding, allocation, setting, synopsis. 2. CLICK: 'Earn 0.1 credit'

16 3. RATE: 'IMPACT?' Check all that apply. 4. COMMENT & discussion (optional). 5. 'Submit'

17 What is cognitive impact?

18 Information Assessment Method (IAM) CONCEPTUAL / THEORETICAL FOUNDATION

19 Saracevic T and Kantor PB. Studying the value of library and information services. Part I. Establishing a theoretical framework. Journal of the American Society for Information Science 1997;48:527.

20 IAM: Conceptual framework http://iam2009.pbworks.com Pluye, Grad et al. Journal of Evaluation in Clinical Practice, in press

21

22

23

24 Information Assessment Method (IAM) Developmental Procedures

25 History and development of IAM: major steps
1. Pilot study: separate qualitative & quantitative studies
2. Residents study: cohort & multiple case
3. Systematic mixed studies review
4. PUSH mixed methods study: usage & validity
5. PULL mixed methods study: usage & validity
6. PUSH-PULL feedback to information providers
An 8-year research program in real-world clinical settings, with physicians, residents, pharmacists, and nurse practitioners. Implications for information science, CME, and KT.

26 WHAT IS MIXED METHODS RESEARCH? A combination of quantitative and qualitative methods to answer complex research questions. Helpful in medical informatics (crucial socio-technical issues). First handbook in 2003.

27 Qualitative (QUAL) and quantitative (QUAN)

28 WHY DO WE MIX METHODS?
- QUAL data improved by QUAN data (e.g., an IAM-derived interview guide stimulates memory of searches for clinical information)
- QUAN data improved by QUAL data (e.g., refinement of IAM cognitive impact items)
- To interpret QUAN results (see example)

29 The Canadian KTpush project: 41 FPs in practice; a 1-year mixed methods study with PUSH and PULL components. PUSH: 194 email alerts / 4,500 ratings. PULL: 1,767 rated searches / 3,300 rated information hits.

30 Research questions
1. What is the frequency of known-item searches by clinicians? (To what extent does PUSH lead to PULL?) A known-item search is the retrieval of an email alert that was previously read: a dyad.
2. How does PUSH lead to PULL? Do physicians retrieve email alerts purposefully, or by serendipity?

31 PUSH Context

32

33 PUSH context
 | N | Mean (SD) | Range
No. rated InfoPOEMs® / MD | 41 MD | 111 (51.5) | 11-189
No. days / MD | 41 MD | 314.2 (54.4) | 111-341

34 Relevance vs. positive cognitive impact (PUSH context)
Cognitive impact* (n=3,530) | Relevant (partially or totally), n=2,902 (%) | Not relevant, n=628 (%)
My practice was (will be) changed and improved | 866 (29.8) | 56 (8.9)
I learned something new | 2,076 (71.5) | 467 (74.4)
This information confirmed I did (will do) the right thing | 1,292 (44.5) | 127 (20.2)
I was reassured | 1,478 (50.9) | 159 (25.3)
I am reminded of something I already knew | 844 (29.1) | 98 (15.6)
I am motivated to learn more | 1,393 (48.0) | 177 (28.2)
* Excluding 1,018 InfoPOEM® ratings of 'No Impact'
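The slide reports row counts and percentages but no test of association. As an illustration only (the source reports no such test), a Pearson chi-square for one row, "my practice was (will be) changed and improved" (866 of 2,902 relevant vs. 56 of 628 not relevant ratings), can be computed directly from these counts with the Python standard library:

```python
# Illustrative 2x2 chi-square (not part of the original slides): does the
# "practice changed and improved" impact depend on rated relevance?
# Counts taken from slide 34.

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square for a 2x2 table [[a, b], [c, d]], no continuity correction."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

a, c = 866, 56                # impact reported: relevant row, not-relevant row
b, d = 2902 - 866, 628 - 56   # impact not reported
chi2 = chi_square_2x2(a, b, c, d)
print(round(chi2, 1))  # well above the df=1 critical value of 3.84
```

The statistic far exceeds the 1-degree-of-freedom critical value of 3.84, consistent with relevance and reported practice change being strongly associated in these data.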

35 Relevance vs. negative cognitive impact (PUSH context)
Cognitive impact (n=3,530) | Relevant (partially or totally), n=2,902 (%) | Not relevant, n=628 (%)
I am dissatisfied, as this information has no impact on my practice | 53 (1.8) | 73 (11.6)
I am dissatisfied, as there is a problem with this information | 151 (5.2) | 107 (17.0)
I disagree with this information | 25 (0.9) | 12 (1.9)
I think this information is potentially harmful | 45 (1.6) | 20 (3.2)

36 Relevance vs. use of InfoPOEMs® (PUSH context)
Relevant for at least one of your patients? | # Ratings (%) | For thinking | To justify or maintain | To modify
Totally relevant | 1,671 (36.7) | 1,275 (76.3) | 992 (59.4) | 655 (39.2)
Partially relevant | 1,456 (32.0) | 1,141 (78.4) | 597 (41.0) | 410 (28.2)
Not relevant | 1,421 (31.3) | 0 (N/A) | 0 (N/A) | 0 (N/A)
Total | 4,548 | 2,416 | 1,589 | 1,065
Ratings (%): column percentage. Thinking (%), Justify or maintain (%), Modify (%): row percentages.

37 Positive cognitive impact vs. use of InfoPOEMs® for a specific patient (PUSH context)
Cognitive impact* (n=3,530) | For thinking about this patient, n=2,285 (%) | To justify or maintain management, n=1,521 (%) | To modify management, n=1,051 (%)
My practice was (will be) changed and improved | 694 (30.4) | 364 (23.9) | 671 (63.8)
I learned something new | 1,689 (73.9) | 899 (59.1) | 932 (88.7)
This information confirmed I did (will do) the right thing | 1,015 (44.4) | 1,098 (72.2) | 331 (31.5)
I was reassured | 1,214 (53.1) | 1,082 (71.1) | 530 (50.4)
I am reminded of something I already knew | 671 (29.4) | 636 (41.8) | 233 (22.2)
I am motivated to learn more | 1,176 (51.5) | 681 (44.8) | 690 (65.7)
* Excluding 1,018 InfoPOEM® ratings of 'No Impact'

38 PULL

39 Response rate: 83% (PULL context)

40 Searches and hits per MD (PULL context)
 | N | Mean (SD) | Range
Rated searches / MD | 40 MD | 44.2 (30.8) | 6-148
Rated hits / MD | 40 MD | 82.5 (63.7) | 9-294
Rated hits / rated search | 1,767 rated searches, 3,300 rated hits | 1.9 (1.7) | 1-22

41

42 Task analysis: reasons for searches in family / general practice (PULL context)
Address a clinical question/problem/decision-making about a specific patient | 1,310 (74%)
Look up something I had forgotten | 672 (38%)
Share information with a patient/caregiver | 624 (35%)
Exchange information with other health professionals | 520 (29%)
Search in general or for curiosity | 496 (28%)
Fulfill an educational or research objective | 434 (25%)
Plan, manage, coordinate, delegate or monitor tasks with other health professionals | 197 (11%)

43 Number of reasons given for searches PULL Context

44 Types of cognitive impact (PULL context): 7,275 reported cognitive impacts linked to 3,300 rated hits
This information confirmed I did (will do) the right thing | 1,516 (46%)
I was reassured | 1,468 (45%)
I learned something new | 1,246 (38%)
I recalled something | 1,136 (34%)
My practice was (will be) changed and improved | 963 (29%)
No Impact | 780 (24%)
Negative Impact (a composite of four types) | 166 (5%)
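Because raters could report more than one impact type per hit, the counts on this slide sum to 7,275 impacts across only 3,300 rated hits, and each percentage is the count divided by 3,300 (so the percentages total more than 100%). A quick sanity check over the slide's numbers, illustrative only:

```python
# Sanity check (not part of the original slides): the impact counts on
# slide 44 should sum to the 7,275 reported impacts, and each percentage
# should be the count divided by the 3,300 rated hits.
impacts = {
    "confirmed the right thing": 1516,
    "reassured": 1468,
    "learned something new": 1246,
    "recalled something": 1136,
    "practice changed and improved": 963,
    "no impact": 780,
    "negative impact (composite)": 166,
}
total_hits = 3300
assert sum(impacts.values()) == 7275  # matches the slide's stated total
for label, count in impacts.items():
    print(f"{label}: {100 * count / total_hits:.0f}%")  # agrees with the slide within rounding
```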

45 166 of 3,300 rated hits (5%) contained reports of negative impact (PULL context)
I was dissatisfied, as this information had no impact on my practice | 79 (2.4%)
I was dissatisfied, as there was a problem with this information | 67 (2.0%)
I think this information is potentially harmful | 13 (0.4%)
I disagree with this information | 7 (0.2%)

46 1,708 rated hits (52%) were (or will be) used for a specific patient (PULL context)

47 How does achieving the search objective influence cognitive impact? (PULL context)
3,300 hits in 1,767 searches. Objective met: n=2,482 hits (75.2%) in 1,336 searches (75.6%)
Cognitive impact (n=3,300) | Objective met, n=2,482 (%) | Objective not met, n=818 (%)
My practice was (will be) changed and improved | 899 (36.2) | 64 (7.8)
I learned something new | 1,104 (44.5) | 142 (17.4)
This information confirmed I did (will do) the right thing | 1,378 (55.5) | 138 (16.9)
I was reassured | 1,351 (54.4) | 117 (14.3)
I recalled something | 1,021 (41.1) | 115 (14.1)
I am dissatisfied, as this information has no impact on my practice | 10 (0.4) | 69 (8.4)
I am dissatisfied, as there is a problem with this information | 21 (0.9) | 46 (5.6)
I disagree with this information | 6 (0.2) | 1 (0.1)
I think this information is potentially harmful | 11 (0.4) | 2 (0.2)
This item of information had no impact at all on me or my practice | 288 (11.6) | 492 (60.2)

48 How is cognitive impact associated with the use of information for a specific patient? (PULL context)
Cognitive impact (n=3,300) | Used for a specific patient, n=1,708 (51.8%) | Not used for a specific patient, n=1,592 (48.2%)
My practice was (will be) changed and improved | 709 (41.5) | 254 (16.0)
I learned something new | 756 (44.3) | 490 (30.8)
This information confirmed I did (will do) the right thing | 1,081 (63.3) | 435 (27.3)
I was reassured | 1,027 (60.1) | 441 (27.7)
I recalled something | 803 (47.0) | 333 (20.9)
I am dissatisfied, as this information has no impact on my practice | 23 (1.4) | 56 (3.5)
I am dissatisfied, as there is a problem with this information | 22 (1.3) | 45 (2.8)
I disagree with this information | 2 (0.1) | 5 (0.3)
I think this information is potentially harmful | 5 (0.3) | 8 (0.5)
This item of information had no impact at all on me or my practice | 106 (6.2) | 674 (42.3)

49 From PUSH to PULL

50 DYADS (n=21). PUSH: 4,937 email alerts (InfoPOEMs®) read. PULL: 1,205 retrieved InfoPOEMs®.

51 DYADS (n=21*)
Purposeful: critical thinking dyads (n=4); known-item dyads (n=6)
Serendipitous: recognized when re-read (n=3); not recognized when re-read (n=7)
* One participant did not clearly remember one dyad.

52 Information Assessment Method: What is it? How does it help research in health sciences informatics?
Tool = IAM. Design = mixed methods, combining the strengths of qualitative and quantitative evaluation.
QUAN: 21 'push-pull dyads' tracked. QUAL: IAM-guided interviews to explore the dyads and understand why these rare events happened.
The effect of email alerts on clinical information retrieval was not mediated via 'known-item' searches.

53 IAM http://iam2009.pbworks.com

54

55 Purposeful known-item dyads “Because I wanted to have a copy […] for teaching purposes […] Well I knew it existed [the InfoPOEM®]. When I first read it, I did not write where the article was from to be able to retrieve it. So I had to retrieve it [the InfoPOEM®] to find which journal it was in.”

56 Purposeful - Critical thinking dyads “I read that InfoPOEM®, and I was very surprised. So I went looking for more information on Pap smears, how accurate they were and more evidence-based material. I did that through Evidence Plus®, and through Google. And then I went back to review that InfoPOEM®, to make sure I understood what I had read.”

57 Does the Information Assessment Method help to preserve long-term memory of email alerts?

58 Instructions

59 Example

60 Average time interval between the date of reading an InfoPOEM® and the memory test
 | Read and Rated (n=240) | Read Not Rated (n=240)
Average number of days between reading and memory test | 113.3 | 43.5
Minimum number of days | 60 | 14
Maximum number of days | 172 | 164

61 Ability of 24 family doctors to remember whether an InfoPOEM® was previously read (or never read)
Three categories of delivered InfoPOEMs® | Read and Rated | Read Not Rated | Never Read
Number correctly remembered / total | 166 / 240 (69%) | 181 / 240 (75%) | 165 / 240 (69%)
Range of correct answers per MD | 3-10 / 10 | 5-10 / 10 | 4-9 / 10

62 Effect of recency of reading on the probability of an incorrect answer to the question "Have you read this InfoPOEM®?"
Logistic regression model: Incorrect Answer = b0 + b1*Rated + b2*Time interval
 | Parameter estimate (SD) | P value | Odds ratio [95% CI]
Rated (1) vs. Read Not Rated (0) | -0.27 (0.32) | 0.39 | 0.76 [0.41, 1.42]
Time interval | 0.008 (0.003) | 0.01 | 1.01 [1.00, 1.02]
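As a reading aid (not from the source), the reported odds ratios and confidence intervals follow from the coefficient estimates by exponentiation, with a Wald interval of b ± 1.96·SE. A minimal sketch, using the coefficients from this slide:

```python
# Illustrative check (not part of the original slides): in logistic
# regression the odds ratio is exp(coefficient), and a Wald 95% CI is
# [exp(b - 1.96*SE), exp(b + 1.96*SE)].
import math

def odds_ratio_ci(b, se, z=1.96):
    """Return (odds ratio, CI lower, CI upper) for coefficient b with standard error se."""
    return math.exp(b), math.exp(b - z * se), math.exp(b + z * se)

# Coefficients (SD) from slide 62
or_rated, lo_r, hi_r = odds_ratio_ci(-0.27, 0.32)
or_time, lo_t, hi_t = odds_ratio_ci(0.008, 0.003)
print(f"Rated: OR {or_rated:.2f} [{lo_r:.2f}, {hi_r:.2f}]")
print(f"Time interval (per day): OR {or_time:.2f} [{lo_t:.2f}, {hi_t:.2f}]")
```

These reproduce the slide's odds ratios, and the interval bounds agree to within rounding of the reported two-decimal coefficients.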

63 Copyright ©2009 Canadian Medical Association or its licensors Straus, S. E. et al. CMAJ 2009;181:165-168 Figure 1: The knowledge-to-action framework

64 'I was dissatisfied, as there was a problem with this information' (PULL context)
Too much information | 1 (0.0%)
Not enough information | 31 (0.9%)
Information poorly written | 2 (0.1%)
Information too technical | 1 (0.0%)
Other problem (please specify) | 41 (1.2%)

65 Do physicians manually update downloadable software on a PDA? About 25% of FPs never or rarely updated on their own. Many software updates were never installed. Only a small number of FPs made all requested updates on their own.

66 Semi-automated updating

67 Manual updating

68 IAM, CME and the EMR: the future
IAM and CME: we need more effective CME, using the most effective methods; learner-centred; setting-sensitive ('point-of-care' learning)
EMR, Infobuttons and IAM

69 Information Assessment Method* http://iam2009.pbwiki.com/FrontPage# *United States patent pending: http://www.techtransfer.mcgill.ca/industry/show_ato.php? 109

70 When are searches rated?

71

72 PUSH

73 Patient health Outcomes

74

75 Known vs. unknown item searching
Known-item search: the user is aware of a discrete information object. Unknown-item or subject search: the user searches in a general topic area.
A study conducted in a public library assessed the information-seeking behavior and search strategies of users of an OPAC (online public access catalogue). 32 participants (23 female, 9 male), ages 8 to 65, were selected at random and asked whether their OPAC searches could be observed and documented.
Results: 8 (88%) of known-item searches were successful, compared with 20 (50%) of unknown-item or subject searches.
Slone D. Encounters with the library OPAC: Online searching in public libraries. J Am Soc Inf Sci 2000;51(8):757-73.

76

77 Validity: Information Assessment Method (IAM)

78

79

80 Portfolio of practice-based learning and practice improvement

81

82 CPhA-McGill study: IAM Questionnaire (CIHR-FRSQ)

83

84 Submit

