Evaluating outreach initiatives




1 Evaluating outreach initiatives

2 Overview of project
Aim is to produce guidance for institutions on how best to evaluate outreach initiatives.
Focus is on programmes assisting potential young entrants from low SES backgrounds, i.e. not mature students or retention/progression.
Funded by the Office for Fair Access and the Sutton Trust.

3 Overview of project: Approach
Scope common outreach initiatives to ensure the guidance is appropriate for most institutions/programmes.
Scope existing evaluation approaches and the skills/experience of potential evaluators, to understand where to pitch the guidance and how to build on existing work.
Review the scoping material and use it to design the guidance.
Test and update the guidance in practice.
Disseminate the guidance.

4 Overview of project: Scoping phase
Design and circulate an online questionnaire to all institutions to obtain a basic overview across the sector.
Undertake detailed follow-up interviews with a small number of institutions to flesh out the details.

5 The Questionnaire

6 Questionnaire
78 responses (59% response rate).
All of the institutions said that they do carry out outreach activities with disadvantaged young people.
305 programmes/activities were reported in total.

7 98% of the activities are delivered face-to-face

8 Year Groups Targeted
No one offered interventions to children younger than 5 (Year 1).
Most targeted Year 9 to Year 13 (ages 13–18).

9 Resources
Resource-intensive activities:
Summer school or other HE residential programme (77% rated this as 'High').
Long-term multi-activity intervention (41% rated this as 'High').
Low-resource activities:
Information, advice and guidance (75% rated this as 'Low').
Campus visit (54% rated this as 'Low').

10 Outcomes Hoped For
For each activity that respondents told us about, they were asked: 'What outcomes do you hope young people from disadvantaged backgrounds will achieve/improve through this programme/activity?'
Attitudes to or aspirations for university (89.8%)
Acquisition of knowledge (84.6%)
Skills relevant to HE progression (76%)
Applications to any institution (58.4%)
Attainment at school/college (56.7%)
Type of course applied for or attended (54.1%)
Applications to own institution (51.1%)
Entry to any institution (47.2%)
Entry to own institution (45.6%)
Type of institution applied for or attended (27.3%)

11 Evaluation

12 Confidence
The questionnaire asked: 'How confident do you/your team feel about being asked to take a more critical approach to evaluating your institution's outreach work in future?'
Percentage of institutions:
Very confident: 24.1%
Quite confident: 55.1%
Not very confident: 19.2%
Not at all confident: 1.3%
No real relationship with institution type.

13 Skills and Experience

Skills:
        Expert   Adept   Limited
Mode    1
Mean    1.69     1.93    4.69
Range   0 - 14   0 - 12
SD      2.33     2.48    13.45

Experience:
        Very experienced   Experienced   Limited experience
Mode    1
Mean    1.33               2.93          4.32
Range   0 - 6              0 - 19
SD      1.42               3.83          13.32

14 Evaluation Work
Impact evaluation: 91% of the institutions said that they do undertake impact evaluation.
Formative evaluation: 95% of the institutions said that they do undertake formative evaluation.
Only one institution does neither type of evaluation.

15 Evaluation Approaches: Definitions
'After' data only: we collect data on the outcomes of participants once the programme has finished (but not before).
'Before-after' approach: we look at the change in outcomes that participants have experienced between the start and end of our programme, and/or track the development of their outcomes throughout.
'Control group' approach: we compare the outcomes of participants on our programmes with the outcomes of some individuals who didn't receive the intervention (the 'control group' or 'comparator group'), or use some other benchmark against which to assess the progress of our participants (e.g. a 'counterfactual analysis').
Outside expertise: we work with evaluation experts (e.g. academics or specialist charities or companies) to evaluate our programmes/activities on our behalf.
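The differences between the first three approaches can be sketched numerically. This is a minimal illustration with entirely made-up outcome scores (not data from the survey): the 'after only' approach yields a single figure with no baseline, the 'before-after' approach measures participants' own change, and the 'control group' approach nets out change that a comparator group experienced anyway.

```python
# Hypothetical outcome scores on a 0-10 scale; all numbers are invented
# purely to illustrate the three evaluation approaches defined above.
participants_before = [4, 5, 3, 6, 4]   # participants at programme start
participants_after  = [6, 7, 5, 8, 6]   # the same participants at programme end
control_after       = [5, 5, 4, 6, 5]   # a comparator group with no intervention

def mean(xs):
    return sum(xs) / len(xs)

# 'After' data only: a single post-programme average, no baseline to compare to.
after_only = mean(participants_after)                              # 6.4

# 'Before-after' approach: change experienced by the participants themselves.
before_after_change = mean(participants_after) - mean(participants_before)  # 2.0

# 'Control group' approach: participants' outcomes against the comparator group,
# which accounts for change that might have happened without the programme.
control_group_diff = mean(participants_after) - mean(control_after)         # 1.4
```

The gap between the before-after change (2.0) and the control-group difference (1.4) in this toy example shows why the control-group approach is usually considered more robust: part of the observed improvement occurred in the comparator group too.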

16 Proportion of approaches used
The most common approach is the 'before-after' approach.
The least common approach is the 'control group' approach.

17 What happens next
Currently reviewing the material and developing the guidance.
The aim is that the guidance will be practical.
Once developed, we will seek feedback and test it.
The intention is then to disseminate it across the sector.

18 What happens next: Outputs
A matrix of potential evaluation methods (matching different interventions with appropriate qualitative/quantitative evaluation methods). This will help institutions to understand what might be considered 'best practice' for the different types of programme.
Practical and pragmatic guidance for those undertaking the evaluation of outreach initiatives, providing clearer information on the next steps in adopting best-practice evaluation methods, wherever institutions are currently located in terms of evaluation skills and experience.
Case studies that test the evaluation guidance, with their experiences written up to help inform other practitioners.

