1
Implementing a Randomised Controlled Trial for the evaluation of Probation supervision
Presenter: Dominic Pearson
Collaborators: David Torgerson, Cynthia McDougall, Roger Bowles
2
Overview of Presentation
– Citizenship programme
– Commissioned evaluation
– Local implementation constraints
– Solutions adopted
– Results of deployment
– Conclusions
– Methodological questions
3
Citizenship programme
– Designed in County Durham Probation Area by operational staff and managers, supported by a consultant (Professor Clive Hollin)
– Structured one-to-one work
– A planned response to assessed crime-related needs, using specific modules (e.g. alcohol)
– Aims to raise awareness, to motivate change, and to link in with external resources
4
Commissioned evaluation
– Commissioner required a regional evaluation: 3 Areas, with different profiles
– Aims:
  – Identify whether the programme reduces reconvictions
  – Identify whether the programme promotes engagement with relevant community provision
  – Identify the benefit-cost relationship
5
General barriers to an RCT in the National Probation Service
– Ethical resistance to random allocation:
  – Judiciary
  – Probation managers
  – Probation practitioners
– Programme management concerns:
  – Difficult for practitioners to manage cases in two different ways
  – Related quality-control issues
  – Staff training: a randomised, phased allocation of the programme means more training events
6
Specific local barriers to an RCT
– Area A had already implemented the programme at the time of commissioning
– Area B had resource issues linked to the workload climate
– Area C had performance management issues in some sectors
7
Solutions adopted
– Area A (had already implemented): retrospective design
– Area B (resource/workload issues): opted for a single Area-wide launch with senior management support ('big bang'); retrospective design
– Area C (offices with performance issues): opted for a 'stepped wedge' design, with offices randomly allocated in sequence to begin programme deployment:
  – Office 1 → April 2007
  – Office 2 → June 2007
  – Office 3 → August 2007
  – Office 4 → October 2007
  – Office 5 → December 2007
  – Office 6 → February 2008
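The stepped-wedge allocation described above can be sketched in code. This is a minimal illustration only, not the trial's actual randomisation procedure: the office names and the fixed seed are assumptions added for the example.

```python
import random

# Offices to be phased into programme delivery (names assumed for illustration).
offices = ["Office A", "Office B", "Office C",
           "Office D", "Office E", "Office F"]

# Bimonthly start dates matching the deployment schedule on this slide.
start_dates = ["April 2007", "June 2007", "August 2007",
               "October 2007", "December 2007", "February 2008"]

rng = random.Random(42)                      # fixed seed so the draw is reproducible
order = rng.sample(offices, k=len(offices))  # random sequence of offices

# Pair each start date with the randomly drawn office for that step.
schedule = dict(zip(start_dates, order))
for date, office in schedule.items():
    print(f"{office} begins deployment in {date}")
```

In a stepped-wedge design every office eventually delivers the programme; only the order in which offices cross over is randomised, which is what `rng.sample` provides here.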
8
Results of deployment: take-up rates
– Area A: 75% (3072 / 4078)
– Area B: 26% (2499 / 9749)
– Area C: 44% (188 / 426)
  – Office 1: 40% (26 / 65)
  – Office 2: 51% (50 / 98)
  – Office 3: 46% (66 / 144)
  – Office 4: 40% (14 / 35)
  – Office 5: 31% (21 / 67)
  – Office 6: 65% (11 / 17)
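The office-level figures can be checked against the Area C total as a quick arithmetic consistency check (this is an illustration added here, not part of the original analysis):

```python
# Office-level take-up counts for Area C (take-up / eligible), from this slide.
offices = {
    "Office 1": (26, 65),
    "Office 2": (50, 98),
    "Office 3": (66, 144),
    "Office 4": (14, 35),
    "Office 5": (21, 67),
    "Office 6": (11, 17),
}

total_taken = sum(t for t, _ in offices.values())
total_eligible = sum(e for _, e in offices.values())

# The six offices should sum to the Area C total of 188 / 426 (44%).
print(total_taken, total_eligible,
      round(100 * total_taken / total_eligible))
```

Summing the office counts reproduces the Area C total exactly, which confirms that all six offices belong to the stepped-wedge Area.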
9
Conclusions on implementation
– Area A was a different case: implemented in a previous period and over a longer time
– Area B vs. Area C: phased deployment produced better results at less cost
  – Area B needed to re-launch ('big bang' #2)
  – Progress in Area C could be monitored and managed
– Differential take-up can be adjusted for in the analyses
– A lesson in the importance of discussing evaluation at the time of implementation
10
Methodological questions
– Overall / regional evaluation: 3 stages, with different methodologies and Area profiles; the Citizenship programme is the constant
– Positive results from the first-stage evaluation
– Results from the (deficient) retrospective designs will be illuminated by the RCT
– How will the effects associated with the different methodologies compare?
11
Thank you for listening
We welcome your comments and thought-provoking questions!