1
Methodological and ethical issues encountered in evaluating the Wohl Reach Out Lab
Roberts Zivtins & Dr. Annalisa Alexander, 28/02/19
2
Research Context: the Wohl Reach Out Lab
Imperial College London's dedicated outreach space, founded in 2010 by Professor Robert Winston.
16,126 pupils have attended 1,227 activities since the WROL opened.
The focus is on practical work for students who would not otherwise have access to 'hands-on' science.
Programmes are typical of university-based interventions: summer schools, school visits, STEM potential, etc.
3
Who visits the WROL matters: students are selected on various criteria (dependent on the programme), e.g. POLAR1 students, free school meals (FSM) eligibility, carers, and the first generation in their family to attend university.
4
How to evaluate outreach programmes?
[Diagram: student pre-intervention → intervention → student post-intervention]
Scriven describes 'black box' evaluation, which fundamentally follows the experimental model: start at position A, do something, then measure at position B, and infer that the intervention caused the change. Cf. a drug trial (Scriven 1994).
5
How to evaluate outreach programmes?
The same experimental logic underpins the clinical gold standard: 'Randomised controlled trials are the most rigorous way of determining whether a cause-effect relation exists between treatment and outcome' (Sibbald & Roland 1998).
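Purely as an illustration of the RCT logic behind this quote (not an analysis carried out for the WROL), the sketch below randomly assigns a hypothetical pool of students to intervention and control groups and compares outcomes between the groups; the data, group sizes and two-sample t-test are assumptions made for the example.

```python
# Illustrative sketch of RCT logic only: random assignment to intervention
# or control, then a between-group comparison of outcomes.
# All data here are invented; this is not an analysis performed at the WROL.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=1)

# Hypothetical pool of 40 students, randomly split into two equal groups
students = np.arange(40)
rng.shuffle(students)
intervention_group, control_group = students[:20], students[20:]

# Hypothetical outcome scores measured after the programme,
# with a pretend treatment effect added for the intervention group
outcome = rng.normal(loc=3.0, scale=0.5, size=40)
outcome[intervention_group] += 0.3

# Two-sample t-test: do the randomly assigned groups differ in outcome?
t_stat, p_value = stats.ttest_ind(outcome[intervention_group], outcome[control_group])
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

The point of the contrast is that it is the random assignment, not the pre/post measurement itself, that licenses the causal claim in an RCT.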
6
The problem with attributing causality
[Diagram: student pre-intervention → intervention → student post-intervention → widened participation in science]
The problem with using this model for evaluation is that where an intervention happens a long time before the intended outcome, we cannot attribute causality (e.g. Banerjee 2017).
7
The problem with attributing causality
The RCT requirement that 'all intervention groups are treated identically except for the experimental treatment' (Sibbald & Roland 1998) cannot realistically be met when years of other influences separate an outreach intervention from the intended outcome (e.g. Banerjee 2017).
8
Using an appropriate intermediary measure
[Diagram: student science capital pre-intervention → intervention → student science capital post-intervention]
Science capital is highly predictive of students' intention to study science and their likely future engagement with science, and is informed by a significant body of theoretical literature (Archer et al. 2015; DeWitt et al. 2016).
9
'Does the programme work?'
Does visiting the WROL increase students' levels of science capital?
This is addressed through a student questionnaire administered pre- and post-intervention.
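As an illustrative sketch only (the presentation does not specify the statistical analysis used), a pre/post questionnaire comparison of this kind might be summarised with a paired test and an effect size; the scores, the paired t-test and Cohen's d below are assumptions for the example.

```python
# Minimal sketch of a pre/post comparison of science capital scores.
# Hypothetical data and analysis choices (paired t-test, Cohen's d);
# not the evaluation actually reported for the WROL.
import numpy as np
from scipy import stats

# Hypothetical questionnaire scores for the same students before and after a visit
pre_scores = np.array([3.1, 2.8, 3.5, 2.9, 3.2, 3.0, 2.7, 3.4])
post_scores = np.array([3.4, 3.0, 3.6, 3.1, 3.5, 3.1, 2.9, 3.6])

# Paired t-test: are post-intervention scores higher on average?
t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)

# Cohen's d for paired samples: mean change divided by the SD of the changes
diff = post_scores - pre_scores
cohens_d = diff.mean() / diff.std(ddof=1)

print(f"mean change = {diff.mean():.2f}, t = {t_stat:.2f}, p = {p_value:.3f}, d = {cohens_d:.2f}")
```

Even a clear pre/post gain of this kind is a proximal outcome: it speaks to science capital around the time of the visit, not to long-term participation in science.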
10
Re-focus away from outcomes alone? 'How does the programme work?'
How might the WROL build science capital in students? Are there opportunities for the WROL to develop science capital that schools don't have?
This is addressed partly through interviews with academic leaders, and partly through observation of the sessions themselves, looking at the fine-grained pedagogy.
Pawson & Tilley describe 'mechanisms' as the generative factors that bring about any changes; Kazi argues that critical realism recognises the world as an 'open system' and helps account for uncontrollable variables in evaluation, underlining the importance of context (Pawson & Tilley 1997; Kazi 2003; Deaton & Cartwright 2018).
11
Ethical approval
Ethical approval was sought through ICREC, whose institutional processes focus on clinical/medical trials.
There is a tension between ethical and logistical issues: response rate vs. parental consent.
Are these issues compounded by widening participation (WP) factors? Can schools give consent in loco parentis?
ICREC typically deals with medical studies, so it is very strict on consent (Macenaite & Kosta 2017; BERA 2018).
12
Concluding remarks
A traditional experimental model for evaluating outreach cannot fully account for the complexity of human behaviour and social systems.
Long-term outcomes are not easily attributable to outreach.
The use of theory, proximal outcomes and proper contextualisation can go some way towards countering these shortcomings.
Investigating how programmes work is vital to supplement our understanding.
Institutional ethical approval can be difficult, but it is possible.
13
References
Archer, L. et al. (2015) 'Science Capital: A Conceptual, Methodological, and Empirical Argument for Extending Bourdieusian Notions of Capital Beyond the Arts', Journal of Research in Science Teaching, 52, pp. 922–948.
Banerjee, P. (2017) 'Is informal education the answer to increasing and widening participation in STEM education?', Review of Education, 5(2), pp. 202–224.
British Educational Research Association [BERA] (2018) Ethical Guidelines for Educational Research. Fourth edition. London: BERA.
Deaton, A. & Cartwright, N. (2018) 'Understanding and misunderstanding randomized controlled trials', Social Science & Medicine, 210, pp. 2–21.
DeWitt, J., Archer, L. & Mau, A. (2016) 'Dimensions of science capital: exploring its potential for understanding students' science participation', International Journal of Science Education, 38(16).
Kazi, M. (2003) 'Realist Evaluation for Practice', British Journal of Social Work, 33, pp. 803–818.
Macenaite, M. & Kosta, E. (2017) 'Consent for processing children's personal data in the EU: following in US footsteps?', Information & Communications Technology Law, 26(2), pp. 146–197.
Pawson, R. & Tilley, N. (1997) Realistic Evaluation. 1st edition. London: SAGE Publications.
Sibbald, B. & Roland, M. (1998) 'Why are randomised controlled trials important?', BMJ, 316.