Methodological and ethical issues encountered in evaluating the Wohl Reach Out Lab
Roberts Zivtins & Dr. Annalisa Alexander, 28/02/19

Research context: the Wohl Reach Out Lab
Imperial College London’s dedicated outreach space, founded in 2010 by Professor Robert Winston. 16,126 pupils have attended 1,227 activities since the WROL opened. The focus is on practical work for students who would not otherwise have access to ‘hands-on’ science. The programmes are typical of university-based interventions: summer schools, school visits, STEM Potential, and so on.

It matters who visits the WROL: students are selected on various criteria, depending on the programme, such as coming from a POLAR quintile 1 area (lowest rates of progression to higher education), being eligible for free school meals (FSM), being a carer, or being in the first generation of their family to attend university.

How to evaluate outreach programmes?
[Diagram: student pre-intervention → intervention → student post-intervention]
Scriven describes ‘black box’ evaluation, which fundamentally follows the experimental model: start at position A, do something, then measure at position B; if the measure has changed, you infer that the intervention caused the change, much as in a drug trial (Scriven 1994). “Randomised controlled trials are the most rigorous way of determining whether a cause-effect relation exists between treatment and outcome” (Sibbald & Roland 1998).
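To make the experimental logic concrete, the comparison an RCT licenses can be sketched in a few lines of Python. This is a minimal illustration on simulated data: the sample size, the 0–10 score scale, and the effect size are all invented, and nothing here is WROL data.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(seed=42)
    n = 100  # pupils per arm (invented)

    # Simulated post-intervention science-capital scores on an arbitrary 0-10 scale:
    # the control arm centres on 5.0, the treatment arm on 5.6 (an invented effect).
    control = rng.normal(loc=5.0, scale=1.5, size=n)
    treatment = rng.normal(loc=5.6, scale=1.5, size=n)

    # With random allocation, the arms differ in expectation only in the treatment,
    # so a two-sample comparison of outcomes estimates the causal effect.
    t_stat, p_value = stats.ttest_ind(treatment, control)
    print(f"mean difference: {treatment.mean() - control.mean():.2f}")
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

Random allocation is what carries the causal weight here: it is the only thing licensing a causal reading of the mean difference.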

The problem with attributing causality
[Diagram: student pre-intervention → intervention → student post-intervention → widened participation in science]
The problem with using this model for evaluation is that where an intervention happens a long time before the intended outcome, we cannot attribute causality (e.g. Banerjee 2017). The RCT requirement that “all intervention groups are treated identically except for the experimental treatment” (Sibbald & Roland 1998) cannot hold across the years between an outreach visit and a decision about university, during which schools, families, and attainment all continue to act on the student.
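A toy simulation illustrates why. Suppose, purely hypothetically, that pupils from more supportive schools are both more likely to get an outreach visit and more likely to go on to study science anyway; every coefficient below is invented to show the mechanism, not drawn from any real dataset.

    import numpy as np

    rng = np.random.default_rng(seed=0)
    n = 10_000

    # Hypothetical confounder: supportive schools...
    school_support = rng.normal(size=n)
    # ...are also more likely to arrange an outreach visit in the first place.
    visited = (school_support + rng.normal(size=n)) > 0.5

    # Invented 'true' effect of the visit on a latent inclination to study science.
    true_effect = 0.2
    latent = true_effect * visited + 1.0 * school_support + rng.normal(size=n)
    studies_science = latent > 1.0  # eventual binary outcome, years later

    naive = studies_science[visited].mean() - studies_science[~visited].mean()
    print(f"invented effect of the visit (latent scale): {true_effect}")
    print(f"naive visited-vs-not gap in participation: {naive:.3f}")

The naive gap is several times larger than the visit’s actual contribution, because it mostly measures school support rather than the intervention: the groups were never ‘treated identically except for the experimental treatment’.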

Using an appropriate intermediary measure
[Diagram: student science capital pre-intervention → intervention → student science capital post-intervention]
Science capital is highly predictive of students’ intention to study science and of their likely future engagement with science, and it is informed by a significant body of theoretical literature (Archer et al. 2015; DeWitt et al. 2016).

‘Does the programme work?’
[Diagram: student science capital pre-intervention → intervention → student science capital post-intervention]
This is addressed through a student questionnaire administered before and after the intervention: does visiting the WROL increase students’ levels of science capital?
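As a sketch of what that pre/post analysis might look like, the comparison below uses a paired t-test in Python. The file name and column names are hypothetical placeholders, not the actual WROL instrument.

    import pandas as pd
    from scipy import stats

    # Hypothetical file: one row per pupil, science-capital score before and after.
    scores = pd.read_csv("wrol_science_capital.csv")

    change = scores["capital_post"] - scores["capital_pre"]
    print(f"mean change in science capital: {change.mean():.2f}")

    # Paired t-test: did the same pupils' scores shift between administrations?
    t_stat, p_value = stats.ttest_rel(scores["capital_post"], scores["capital_pre"])
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

    # Without a comparison group this shows change, not cause: maturation,
    # practice effects, or concurrent school events could all move scores too.

As the closing comment notes, a pre/post shift on its own demonstrates change rather than cause, which is precisely the gap the contextual work on the next slide is meant to address.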

Re-focusing away from outcomes alone: ‘How does the programme work?’
[Diagram: student science capital pre-intervention → student science capital post-intervention]
How might the WROL build science capital in students? This is addressed partly through interviews with academic leaders, but also through observation of the sessions themselves, looking at the fine detail of the pedagogy. Pawson & Tilley discuss ‘mechanisms’ as the generative factors that bring about any change. Kazi argues that critical realism recognises the world as an ‘open system’ and helps account for uncontrollable variables in evaluation, underlining the importance of context. Are there opportunities for the WROL to develop science capital that schools do not have? (Pawson & Tilley 1997; Kazi 2003; Deaton & Cartwright 2018)

Ethical approval
Ethical approval was sought through ICREC, Imperial College’s research ethics committee. Institutional processes focus on clinical and medical trials, creating tension between ethical and logistical concerns: response rate versus parental consent. Are these issues compounded by widening participation factors? Can schools give consent in loco parentis? Because ICREC typically deals with medical studies, it is very strict on consent (Macenaite & Kosta 2017; BERA 2018).

Concluding remarks
A traditional experimental model for evaluating outreach cannot fully account for the complexity of human behaviour and social systems. Long-term outcomes are not easily attributable to outreach. The use of theory, proximal outcomes, and proper contextualisation can go some way towards countering these shortcomings. Investigating how programmes work is vital to supplement our understanding of whether they work. Institutional ethical approval can be difficult to obtain, but it is possible.

References
Archer, L. et al. (2015) ‘Science Capital: A Conceptual, Methodological, and Empirical Argument for Extending Bourdieusian Notions of Capital Beyond the Arts’, Journal of Research in Science Teaching, 52, pp. 922–948.
Banerjee, P. (2017) ‘Is informal education the answer to increasing and widening participation in STEM education?’, Review of Education, 5(2), pp. 202–224.
British Educational Research Association [BERA] (2018) Ethical Guidelines for Educational Research, 4th edn. London. Available at: https://www.bera.ac.uk/researchers-resources/publications/ethical-guidelines-for-educational-research-2018
Deaton, A. & Cartwright, N. (2018) ‘Understanding and misunderstanding randomized controlled trials’, Social Science and Medicine, 210, pp. 2–21.
DeWitt, J., Archer, L. & Mau, A. (2016) ‘Dimensions of science capital: exploring its potential for understanding students’ science participation’, International Journal of Science Education, 38(16), pp. 2431–2449.
Kazi, M. (2003) ‘Realist Evaluation for Practice’, British Journal of Social Work, 33, pp. 803–818.
Macenaite, M. & Kosta, E. (2017) ‘Consent for processing children’s personal data in the EU: following in US footsteps?’, Information and Communications Technology Law, 26(2), pp. 146–197.
Pawson, R. & Tilley, N. (1997) Realistic Evaluation. 1st edn. London: SAGE Publications.
Sibbald, B. & Roland, M. (1998) ‘Why are randomised controlled trials important?’, BMJ, 316, p. 201.