Developing a framework for evaluating qualitative research Liz Spencer, Jane Ritchie, Jane Lewis, Lucy Dillon NatCen Team 24 June 2004.

Liz Spencer, New perspectives, 2004 (member of the NatCen team)

The context
Within the Cabinet Office
– responsibility for encouraging ‘excellence in government research and evaluation’
– commitment to the ‘contribution of research and research synthesis to evidence-based policy and practice’
– concern about lack of agreed standards for ‘what constitutes high quality qualitative methods of policy evaluation’

The brief: aims and objectives
identify a set of standards / produce a provisional set of criteria against which qualitative policy evaluation studies can be critically appraised, to help determine whether particular qualitative studies / studies reported in the literature meet the agreed standards of validity, reliability, relevance, and robustness to be included in the evidence base for effective policy making
produce guidance on how the standards / criteria can be applied in practice

The brief: suggested approach
literature review (debates about qualitative methods and criteria, other frameworks / sets of guidelines, research reports)
interviews with academics, research practitioners, funders, commissioners and users / policy makers
NatCen proposed
– a workshop
– trial application

Issues arising from the brief
the meaning / scope of ‘policy evaluation studies’
the notion of standards / criteria in qualitative research
the call for application guidelines
the assumption that reliability, validity, relevance and robustness are key considerations / components
the relationship between evidence and policy making
epistemological assumptions and implications

‘Qualitative policy evaluation’
evaluation versus research?
– different aims, timescales, and ways of assessing
particular contribution of qualitative approaches to evaluation
adopted inclusive interpretation of policy evaluation
– policy research, studies of practice, assessment of interventions / initiatives / programmes
– background / context setting, development, implementation, outcomes
broadened scope to include ‘empirical’ qualitative research

Evaluation traditions
how to address different aims of evaluation?
– generation of information to aid decision making
– participation
– enlightenment
– reform
– emancipation
focussed on evaluation which utilises qualitative research methods, where the aim is to produce defensible knowledge claims and the quality of research still matters

The relationship between evidence and policy
tension between evidence-based practice / policy and qualitative research, based on questionable assumptions that
– there is a hierarchy of methods
– qualitative research is a last resort / soft option
– explicit procedures are superior to informed judgements
– aggregation and synthesis are superior to mapping
– policy can be forged from ‘brute facts’ / evidence gives the answer
reinforced decision to concentrate on quality of qualitative findings (issues of timeliness, feasibility, and political will left to policy makers)

The whole idea of qualitative standards or criteria
Many different ‘positions’
– rejection of criteria for philosophical or methodological reasons
– proposal of ‘alternative’ criteria (unrelated to notions of rigour or credibility)
– proposal of ‘parallel’ criteria (addressing notions of rigour or credibility)
– adoption of traditional ‘scientific’ criteria (to be applied rather differently)

The idea of criteria (contd.)
concern about rigid checklists
concern about ‘tick box’ mentality
avoided the term ‘criteria’
adopted series of flexible, open-ended questions around guiding principles and quality issues
retained centrality of experience and judgement

Addressing philosophical debates
philosophical assumptions crucial to acceptance / rejection of ‘criteria’ and to bases of quality assessment
confusing / confrontational labelling of positions
adopted ‘elemental’ approach for scope of framework
– ‘reality’ mediated through human constructions
– shared meanings
– neutrality as guiding ideal
– reflexive practice
– no methodological hierarchy
– flexible but rigorous conduct

Addressing the ‘holy trinity’
no escape from ‘validity’, ‘reliability’ and ‘objectivity’
identified underlying themes:
– internal validity (procedural / methodological; interpretive / accuracy or credibility of findings; relational / outcomes in relation to participants)
– external validity (relevance; generalisability; auditability; contextual detail)
– reliability (replication; consistency; auditability)
– objectivity (neutral / value free; auditability; reflexivity)
– soundness / well-foundedness vs goodness / worthwhileness

Identification of some underlying central concerns and principles
defensibility of approach
rigour of conduct (research practice and relationship to those being researched / evaluated)
credibility of claims
contribution and wider impact

Determining content
series of readings on central concerns and principles relating to all stages of inquiry
reflecting recurrent themes within the literature and amongst interviewees
reflecting the context of the commission
reflecting the locus, experience and skills of the research team

The structure of the framework
Three tiers:
– 4 central principles
– 18 appraisal questions (indicative, discretionary, and avoiding yes / no answers; no scoring)
– series of quality indicators (illustrative rather than exhaustive or prescriptive; no scoring)

Guidance on application
framework intended to be used flexibly
discretion and judgement remain central
importance of further trials / assessment
contribution to an evolving process

Interpretation of the brief
flexible response achieved
– by stressing Cabinet Office objectives for the framework to have wider credibility / utility
– by analysing recurrent themes in the literature
– by reviewing other sets of guidelines / frameworks
– by reflecting views of participants (during interviews and at the workshop)
– because of the National Centre’s role and position

A full copy of the framework and accompanying report can be found at: