Quantitative Research


Quantitative Research
Richard Peacock, Clinical Librarian, Archway Healthcare Library
Ziba Nadimi, Outreach and Information Skills Librarian, Camden Primary Care Trust

Definition A study that aims to quantify attitudes or behaviours, measure the variables on which they hinge, compare groups, and identify correlations. It is most often conducted via a survey of a sample that must be representative, so that the results can be extrapolated to the entire population studied. It requires the development of standardised, codifiable measurement instruments (structured questionnaires). (Ipsos, a market research company)

Quantitative Data Collection Requires a specific protocol, specified in advance of data collection. The sample should be large; the larger the sample, the better.

Data Analysis Statistical analysis describes trends, compares groups, and relates variables, and allows you to compare your results with past research.

The Anatomy of a Research Paper Introduction Methods Results Discussion

The Anatomy of a Research Paper cont. The introduction summarises the background to the study. The methods section explains how the study was conducted, which is essential when critically appraising a paper. The results section reports findings objectively, without speculation or interpretation. In the discussion, the authors interpret the findings in light of the study design and other research.

Levels of Evidence

Systematic Review Identifies an intervention for a specific disease or other health-care problem and determines whether or not this intervention works. The authors locate, appraise and synthesise evidence from as many relevant scientific studies as possible, summarise conclusions about effectiveness, and provide a collation of the known evidence on a given topic. Statistical methods (meta-analysis) may or may not be used to analyse and summarise the results.

Advantages of Systematic Reviews Adhere to a strict design, therefore minimising the chance of bias. Provide a scientific rather than subjective summarisation of the literature. Large amounts of information can be assimilated quickly by healthcare providers, researchers and policymakers. Compare the results of different studies to establish the generalisability of findings and consistency of results.

Disadvantages of Systematic Reviews Experts may select only supportive evidence. Papers with more interesting results are more likely to be published (publication bias). SRs may be biased by the exclusion of relevant studies or the inclusion of inadequate ones. SRs include an element of judgement, whatever method is used; e.g. authors of tobacco-industry-affiliated reviews are more likely to conclude that passive smoking is not harmful.

Randomised Controlled Trials (RCTs) Two or more interventions are compared by being randomly allocated to participants. One arm receives a control intervention or no intervention. Where possible, the trial should be single-, double- or triple-blinded.
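The idea of random allocation can be sketched in a few lines of Python. This is a hypothetical illustration only (real trials use concealed, often blocked or stratified, randomisation schemes); the function name and participant labels are invented for the example:

```python
import random

def randomise(participants, arms=("treatment", "control"), seed=42):
    """Simple randomisation: each participant is independently assigned to an arm."""
    rng = random.Random(seed)  # fixed seed only so the sketch is reproducible
    return {p: rng.choice(arms) for p in participants}

allocation = randomise([f"P{i}" for i in range(10)])
```

Because allocation depends only on chance, known and unknown confounders tend to balance out between the arms as the sample grows.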

Blinding in RCTs Preventing those involved in a trial from knowing to which comparison group (experimental or control) a particular participant belongs, minimising the risk of bias. Participants, caregivers, outcome assessors and analysts can all be blinded. Blinding of certain groups is not always possible, e.g. surgeons in surgical trials. Single-, double- and triple-blind designs are in common use.

Cohort Studies An observational study in which a defined group of people (the cohort) is followed over time, and outcomes are compared between those who were and were not exposed to a particular intervention. A retrospective cohort study identifies subjects from past records and follows them to the present; a prospective cohort study assembles participants and follows them into the future.

Case Control Studies Compares people with a specific disease or outcome of interest (cases) to people from the same population without that disease or outcome (controls). Seeks associations between the outcome and prior exposure to particular risk factors, e.g. one group may have been exposed to a particular substance that the other was not. They are usually retrospective.
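The association in a case-control study is usually summarised as an odds ratio. A minimal sketch, using a hypothetical 2x2 table (30 of 100 cases exposed versus 10 of 100 controls; the numbers are invented for illustration):

```python
def odds_ratio(exposed_cases, unexposed_cases, exposed_controls, unexposed_controls):
    """Odds ratio = (odds of exposure among cases) / (odds of exposure among controls)."""
    return (exposed_cases / unexposed_cases) / (exposed_controls / unexposed_controls)

# Hypothetical data: cases were exposed far more often than controls
result = odds_ratio(30, 70, 10, 90)  # (30/70) / (10/90), roughly 3.9
```

An odds ratio above 1 suggests the exposure is associated with the outcome; a value of exactly 1 means no association.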

Case Series A study reporting observations on a series of individuals, usually all receiving the same intervention, with no control group.

Odds-Ratio Diagram (Blobbogram)

Odds-Ratio Diagram Cont. Included in CDSR and other good systematic reviews. Presents complicated results and concepts in a clear visual format. For each individual trial, the odds-ratio result is represented by a box. The vertical line marks an odds ratio of one, known as the "line of no effect". The horizontal line through each box is the confidence interval for that result.

Odds-Ratio Diagram Cont. Confidence interval: the range in which we are 95% (or 99%) confident that the "real" result of the study lies when it is extrapolated to the whole of the population sampled in the study. The diamond represents the meta-analysis: the statistical method of combining the results of different RCTs of the same intervention.
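The confidence interval for an odds ratio is conventionally computed on the log scale. A sketch using hypothetical 2x2 counts (a, b = exposed/unexposed cases; c, d = exposed/unexposed controls; not data from any real trial):

```python
import math

def or_with_ci(a, b, c, d, z=1.96):  # z = 1.96 gives a 95% interval
    """Odds ratio and 95% CI via the normal approximation on log(OR)."""
    log_or = math.log((a * d) / (b * c))
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of log(OR)
    return math.exp(log_or), (math.exp(log_or - z * se), math.exp(log_or + z * se))

or_, (lo, hi) = or_with_ci(30, 70, 10, 90)
# The whole interval lies above 1 (does not cross the line of no effect)
```

Because the interval excludes 1, this hypothetical result would plot entirely to one side of the line of no effect.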

Odds-Ratio Diagram Cont. Results to the left of the line of no effect mean less of the outcome in the experimental group; results to the right mean more of the outcome in the experimental group. It is important to note whether the outcome is good or bad.

Odds-Ratio Diagram Cont. If the CI crosses the line of no effect, the result is inconclusive. A longer CI indicates a smaller study (less confidence in the result); a shorter CI indicates a bigger study (more confidence in the result).
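The diamond can be illustrated by fixed-effect (inverse-variance) pooling of log odds ratios: larger studies have smaller variance and so receive more weight. A sketch over three hypothetical 2x2 tables (all counts invented; real meta-analyses use dedicated software):

```python
import math

def pooled_odds_ratio(tables):
    """Fixed-effect meta-analysis: inverse-variance weighted mean of log odds ratios."""
    weighted_sum = total_weight = 0.0
    for a, b, c, d in tables:  # a, b = exposed/unexposed cases; c, d = controls
        log_or = math.log((a * d) / (b * c))
        variance = 1 / a + 1 / b + 1 / c + 1 / d  # variance of log(OR)
        weighted_sum += log_or / variance          # weight = 1 / variance
        total_weight += 1 / variance
    return math.exp(weighted_sum / total_weight)

# Three hypothetical trials pointing in the same direction
pooled = pooled_odds_ratio([(30, 70, 10, 90), (25, 75, 15, 85), (40, 60, 20, 80)])
```

Pooling narrows the combined confidence interval relative to any single trial, which is why the diamond is usually tighter than the individual boxes above it.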

P Values P stands for probability (ranging from zero to one): the probability that a result at least as extreme as the one observed could have occurred by chance if in reality the null hypothesis were true. The null hypothesis: the factor of interest (e.g. treatment) has no impact on the outcome (e.g. risk of death). A p value of less than 0.05 means the likelihood of the results being due to chance is less than 1 in 20: "statistically significant".
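The meaning of a p value can be illustrated by simulating the null hypothesis directly: shuffle the outcomes between two groups so any difference is due to chance alone, then count how often the shuffled difference is at least as large as the observed one. The trial numbers below are hypothetical:

```python
import random

def simulated_p_value(n_treat, n_ctrl, events_treat, events_ctrl, n_sims=20000, seed=1):
    """Approximate a two-sided p value by permutation under the null hypothesis."""
    observed = abs(events_treat / n_treat - events_ctrl / n_ctrl)
    # Pool all outcomes: under the null, group labels are arbitrary
    outcomes = [1] * (events_treat + events_ctrl) + [0] * (
        n_treat + n_ctrl - events_treat - events_ctrl
    )
    rng = random.Random(seed)
    extreme = 0
    for _ in range(n_sims):
        rng.shuffle(outcomes)
        diff = abs(sum(outcomes[:n_treat]) / n_treat - sum(outcomes[n_treat:]) / n_ctrl)
        if diff >= observed:
            extreme += 1
    return extreme / n_sims

# Hypothetical trial: 10/100 deaths on treatment vs 30/100 on control
p = simulated_p_value(100, 100, 10, 30)
```

A difference this large almost never appears when the labels are shuffled at random, so the p value is far below 0.05; if the two groups had identical event rates, nearly every shuffle would match or exceed the observed (zero) difference and the p value would be close to 1.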

References
McGovern, D.P.B. et al., Evidence-based Medicine, BIOS Scientific Publishers Ltd., 2001
Greenhalgh, T., How to Read a Paper, 3rd ed., BMJ Publishing Group, 2006
Kelsey, K.D., (Lecture 2) Quantitative and Qualitative Approaches to Research, PowerPoint presentation
Ward, L., Critical Reading Made Easy: Effectiveness and Experience, University Hospitals of Leicester NHS Trust, PowerPoint presentation
Jackson, N., Conducting Systematic Reviews of Public Health and Health Promotion Interventions, Cochrane Health Promotion and Public Health Field, PowerPoint presentation
The Cochrane Library's Glossary of Terms, Wiley InterScience
Hayward, A., Critical Appraisal of Analytical Studies, PowerPoint presentation