Evaluation of SEED in Romania and England
Angela Sorsby and Joanna Shapland, University of Sheffield
Funded by the National Offender Management Service (England) and the European Commission

SEED aims to improve engagement between practitioners and convicted persons. It focuses on what practitioners do with convicted persons in supervision sessions and aims to enhance the effectiveness of one-to-one work.

In the evaluation we were interested in:
- practitioners' views about the training they had received, particularly whether it had improved their engagement skills
- managers' views
- convicted persons' views of their supervision, both those supervised by SEED-trained staff and those supervised by staff who were not SEED trained
- whether there is any impact on compliance.

Evaluation of impact requires trained and comparison groups. The research design and methods were to some extent determined by resources and practicalities.

Random assignment was not possible:
- Random assignment of probation teams would have required an excessively large number of participating offices.
- Random allocation of practitioners to training does not fit the SEED model of training teams (or most of a team) together.
- Random assignment of convicted persons to practitioners was not practical for a number of reasons, e.g. geographical location.

In England the whole team was trained together. For each SEED-trained team we had a comparison team that was not SEED trained. The disadvantage was that, almost inevitably, the profile of convicted persons was not identical for the trained and comparison teams. Multivariate statistics were used to take account of these differences.
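The kind of adjustment described here can be illustrated with a minimal sketch. This uses direct standardisation (reweighting both groups to a common offender profile) rather than the multivariate models the evaluation actually employed, and all figures below (risk bands, rates, weights) are invented for illustration:

```python
# Sketch only: direct standardisation as a simple stand-in for the
# multivariate adjustment used in the evaluation. All numbers are invented.

def standardised_rate(rates_by_band, band_weights):
    """Compliance rate reweighted to a common risk-band distribution."""
    return sum(rates_by_band[b] * band_weights[b] for b in band_weights)

# Hypothetical compliance rates per risk band for each group
trained = {"low": 0.90, "medium": 0.75, "high": 0.55}
comparison = {"low": 0.85, "medium": 0.70, "high": 0.50}

# Pooled risk-band distribution used as the common standard
weights = {"low": 0.5, "medium": 0.3, "high": 0.2}

adj_trained = standardised_rate(trained, weights)
adj_comparison = standardised_rate(comparison, weights)
print(round(adj_trained - adj_comparison, 3))  # adjusted difference
```

Because both rates are weighted by the same profile, the difference is no longer driven by one team simply supervising lower-risk people than the other.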

In Romania the study was originally conceived as a feasibility study, but it was decided to carry out a scaled-down replication of the English evaluation. Most, but not all, of the practitioners in each team took part in SEED training; those who did not formed the comparison group. The convicted person profile is likely to be more similar across the two groups than in England. The disadvantage is possible contamination between trained and untrained staff.

SEED evaluation methods: capturing staff views
- Observation of training and informal talks with staff.
- Questionnaires to staff undergoing training at every training session.
- Interviews with managers, practitioners and trainers.

SEED evaluation methods: capturing convicted persons' views
- Questionnaires to convicted persons in the SEED and comparison groups asking them about the supervision they are receiving. Is there any difference between the groups, and between the countries?
- In-depth interviews with convicted persons near the beginning and end of their order.

SEED evaluation methods: assessing whether there is any impact on outcomes
- Comparison of available compliance data for the SEED and comparison groups.

Practical observations in relation to some of the methods: observation of training
- Extremely useful: it gave us a direct overall impression of practitioners' reactions and their views on potential issues, especially as the training includes SWOT analyses, focus groups and other forms of feedback.
- Resource intensive: we observed almost all initial and follow-up training in England, but in Romania could only observe the initial training in Bucharest.

Practical observations in relation to some of the methods: questionnaires to practitioners participating in training
- Fairly straightforward and easy to administer; gave us more detailed, quantifiable data about staff reactions.
- Indicated that slightly different things were appreciated in each of the two countries.

Practical observations in relation to some of the methods: questionnaires for convicted persons, issues for staff
- Substantial administrative burden for probation staff in getting questionnaires to the right people at the right time: they were only to be given to people who commenced orders after staff completed their initial training, and were to be administered at a certain point in the order.
- Issues with the local availability of data about who commenced orders after training and who was at the right stage in their order.
- In England, local offices had to obtain lists from centralised IT departments; there were problems with delays, and the lists included people outside the relevant office or the parameters of the study, so they required sifting.
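The sifting of centrally produced lists amounts to a filter on office and order-start date. The office code, training-completion date and records below are invented for illustration:

```python
from datetime import date

# Sketch of the sifting step: keep only people who started an order at the
# study office after the team finished its initial SEED training.
# STUDY_OFFICE and TRAINING_COMPLETE are hypothetical values.

STUDY_OFFICE = "SHEF01"
TRAINING_COMPLETE = date(2012, 6, 1)

def in_study(person):
    """True if the person falls within the study parameters."""
    return (person["office"] == STUDY_OFFICE
            and person["order_start"] > TRAINING_COMPLETE)

people = [
    {"name": "A", "office": "SHEF01", "order_start": date(2012, 7, 3)},
    {"name": "B", "office": "LEEDS2", "order_start": date(2012, 7, 3)},  # wrong office
    {"name": "C", "office": "SHEF01", "order_start": date(2012, 5, 1)},  # too early
]

eligible = [p["name"] for p in people if in_study(p)]
print(eligible)  # ['A']
```

In practice the delivered lists lacked clean fields like these, which is why the sifting took real effort rather than a one-line query.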

Practical observations in relation to some of the methods: questionnaires for convicted persons, issues in relation to convicted persons
- Potential literacy issues in some jurisdictions. The completion rate and written comments indicated this may not have been too much of an issue in England and Romania.
- The study in Romania was intended as a feasibility study of whether the evaluation could be applied in another EU jurisdiction.
- Likert-type scales (Never o------o------o------o------o Almost every session) did not seem to be as clearly understood in Romania.

Practical observations in relation to some of the methods: compliance data
- Many practical issues around what data are collected, how they are stored and how they can be provided for research purposes.
- Case management systems tend not to be designed for research purposes.

Practical observations in relation to some of the methods: compliance data in England
- Data fields that can be used for queries were overwritten when there were changes. For example, if someone was recalled to prison, the release date was removed from the release date field, so a query extracting people released between certain dates using that field would miss them. Regular downloads were therefore required to avoid losing people.
- Queries also tended to capture people outside the parameters of the project, so determining who should and should not be in the data set involved considerable work for the evaluators.
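The regular-downloads workaround can be sketched as a merge that never discards a previously seen value: each periodic extract is folded into an archive keyed by case ID, so a release date that the live system later blanks out survives in the research data set. The field names and records below are hypothetical, not the actual case-management schema:

```python
# Sketch only: merging periodic extracts so overwritten fields are not lost.
# Field names ("release_date", "recalled") and case IDs are invented.

def merge_snapshot(archive, snapshot):
    """Fold a periodic extract into the archive, keeping old non-blank values."""
    for case_id, record in snapshot.items():
        kept = archive.setdefault(case_id, {})
        for field, value in record.items():
            if value is not None:        # never overwrite with a blank
                kept[field] = value
            else:
                kept.setdefault(field, None)
    return archive

archive = {}
merge_snapshot(archive, {"A1": {"release_date": "2013-05-01", "recalled": False}})
# Later extract: the live system wiped the release date on recall
merge_snapshot(archive, {"A1": {"release_date": None, "recalled": True}})
print(archive["A1"])  # release date preserved alongside the recall flag
```

The design choice is that the archive is append-only at field level: the live system is treated as authoritative for new values but never for deletions.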

Practical observations in relation to some of the methods: compliance data in Romania
- Systems were set up to run off cases by office but not by individual probation counsellor. As SEED-trained and comparison counsellors were in the same offices, we needed cases organised by counsellor, or by whether the counsellor was trained or not. This was done by probation staff and required quite a bit of work from them.
- Probation orders in Romania are long, an average of five years, so within the timescale of the project we could only look at compliance over a small proportion of the order.
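Reorganising office-level case lists by training status, as the Romanian probation staff had to do, amounts to a simple grouping step. The counsellor names and the trained-staff list below are invented:

```python
# Sketch only: splitting an office's case list into SEED-trained and
# comparison groups by supervising counsellor. All names are invented.

SEED_TRAINED = {"Ionescu", "Popa"}  # hypothetical trained counsellors

cases = [
    {"id": 1, "counsellor": "Ionescu"},
    {"id": 2, "counsellor": "Radu"},
    {"id": 3, "counsellor": "Popa"},
]

groups = {"trained": [], "comparison": []}
for case in cases:
    key = "trained" if case["counsellor"] in SEED_TRAINED else "comparison"
    groups[key].append(case["id"])

print(groups)  # {'trained': [1, 3], 'comparison': [2]}
```

The work in practice lay not in the grouping itself but in the system only exporting by office, so staff had to attach the counsellor to each case by hand first.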

In conclusion: we have obtained a lot of worthwhile data, but this has involved considerable effort from SEED trainers, probation counsellors, administrators, volunteers, IT departments, the National Offender Management Service in England and the Ministry of Justice in Romania, to all of whom I am grateful.

Questions and discussion