Outputs, Outcomes and Impact – Let’s Call the Whole Thing Off?

Outputs, Outcomes and Impact – Let’s Call the Whole Thing Off?
Ruth Thomas – Derwen College
Claire Dorer – NASS

Outline
- To explore the concept of outcomes and how they relate to ISCs politically and in practice
- To consider work by NASS on devising an outcomes framework for special schools
- To consider the current Natspec QSR project and outcomes for learners with complex needs

How do you know your school or college is good?
- Ofsted / CSCI inspection
- Internal QA
- Preferred provider status with local authorities (for schools)
- We see the progress children and young people make with us

But…
- There is no national framework for outcomes
- What providers value as outcomes is not always what LAs prioritise or what the LSC asks us to measure
- There is confusion and disagreement over what we should be measuring and how

Why measure outcomes?
- To show the effect a placement at your school/college has on a child or young person
- To show potential placers that you provide a high-quality service – an important new dimension for colleges from next year
- But: can you meet both functions with the same set of data?

What do you want to know?
- Output – a tangible result of a given action, e.g. a safeguarding policy or a new training programme for staff
- Outcome – the result of that output, e.g. a decrease in adult protection referrals or 95% of all staff trained
- Impact – the difference the service makes to the young people receiving it, e.g. young people report that they now feel more confident sharing issues with staff and feel happier at college

Who wants to know it?
- Outputs are relatively easy to measure. For monitoring, it is feasible for LAs / Ofsted to ask to see policies, minutes of meetings and so on, which is useful for setting minimum requirements for providers. Most schools and colleges currently receive this level of monitoring from the LAs that purchase places from them and from Ofsted.
- Outcomes are more difficult and tend to require a degree of evaluation. Providers may or may not value the information requested, and LAs / Ofsted currently appear to lack the capacity to monitor outcomes and to make comparisons between providers.
- Impact is the most difficult to measure – it does not lend itself simply to quantitative evaluation, but it provides the richest and most useful information about what an ongoing placement means for an individual young person.

NASS and Outcomes
- Work within a suite of national contracts – the contracts for children’s homes and fostering both have an outcomes framework, and there is pressure for the schools contract to have one too
- Recognise the need LAs have to make evidence-based decisions about placements
- Want to develop a framework that moves beyond numbers and percentages to demonstrate the impact of placements on children and young people

What we concluded
- KPIs need to be small in number and relevant to all settings
- We need to work on cultural change with commissioners so that qualitative, person-focused outcomes are valued as highly as quantitative data
- We want a focus on the consistency and rigour of the process, whilst leaving flexibility around the content

NASS’s Outcomes Principles for Schools
- Setting of life goals should be the starting point, before discussions about placement begin
- These goals should inform the placement
- The school/college has responsibility for translating life goals into targets, stages and outcomes within a framework of ECM outcomes
- Existing reviews and monitoring activities should focus on progress against outcomes
- Schools and colleges can measure impact: have goals been achieved? What might this young person’s life be like if goals are partially achieved or not achieved?

Tools
- Schools are about to get access to a SEN progress guide that helps measure the progress of young people with severe and complex LDD
- NASS is producing a self-assessment toolkit to support schools with gathering and monitoring data
- NASS hopes to link to external QA projects such as BILD’s Quality Network reviews

Natspec and QSR
A national ISC PRD group (National Star, RNCB, Treloars, Henshaws and Derwen) last year began looking at measuring the success rates of learners with complex needs. This was to be based on rigorous measurement of the progress made towards long-term goals in any area, from PSD and functional skills to vocational and employability skills and ILS. The data would be used to benchmark confidently across the sector, including GFE LLDD provision, and might in addition be included in the ILR and FfE. Latterly the group was joined by representatives from AoC, NASS and Ofsted.

The Project
- The challenge: to produce quantitative data on personal/individual success that allows comparison across providers.
- The proposal: to produce annual data on the achievement of predicted Every Citizen Matters (ECM) outcomes for individual learners, identifying the numbers and percentages of learners who are ‘ahead/over’, ‘in line with/on’ or ‘behind/under’ the progress needed to meet their goals, amalgamated for the provider as a whole and against each ECM theme. A minimal sketch of this aggregation follows.
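The proposed return is essentially a grouped count-and-percentage calculation. The sketch below, in Python, shows one way it could work; the record layout, theme names and status labels are illustrative assumptions, not part of the proposal itself.

```python
from collections import Counter

# Each record: (learner_id, ecm_theme, status), where status is one of
# "ahead", "on", "behind". Layout and example values are illustrative only.
records = [
    ("L001", "Be Healthy", "on"),
    ("L001", "Enjoy and Achieve", "ahead"),
    ("L002", "Be Healthy", "behind"),
    ("L002", "Enjoy and Achieve", "on"),
    ("L003", "Be Healthy", "on"),
]

STATUSES = ("ahead", "on", "behind")

def numbers_and_percentages(counts: Counter) -> dict:
    """Return (count, percentage) for each status from raw counts."""
    total = sum(counts.values())
    return {s: (counts[s], round(100 * counts[s] / total, 1)) for s in STATUSES}

# Amalgamated for the provider as a whole...
overall = Counter(status for _, _, status in records)
print("Provider overall:", numbers_and_percentages(overall))

# ...and against each ECM theme.
for theme in sorted({theme for _, theme, _ in records}):
    by_theme = Counter(s for _, t, s in records if t == theme)
    print(theme + ":", numbers_and_percentages(by_theme))
```

A real return would also need the parameters for small numbers of learners discussed under ‘Next steps’, since percentages over a handful of learners are easily misread.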

The Benefits
- Allows personalisation within a nationally recognised framework for consistency (RARPA plus ECM)
- Measures success in outcomes which are valuable to learners and which are controlled by the provider
- Supports self-assessment and evidences ‘distance travelled’/value added
- Does not prescribe or constrain curriculum offer, programme or provider type; enables links to FLT
- Links to local authority outcomes and Ofsted inspection
- Measures success in outcomes which are valued by stakeholders and commissioners

Next steps
- Pilots to test the process and establish guidance and criteria for levels of performance
- Guidance on the process, including what might be included under each ECM outcome, including PIs
- Guidance on how best to contextualise the data, including use of evaluative criteria based upon the CIF
- Parameters for small numbers of learners
- Clarify the definitions of complex needs and the learners for whom this approach is appropriate
- Validation and quality assurance (requires robust RARPA processes and self-assessment, with validation through peer review and external tests through Ofsted)
- Establish links to the ILR

Other recommendations
- Destinations against predictions could be a useful indicator but should not be used as a measure of success, as too many of the relevant factors are outside the control of the provider.
- Students who die, or whose health deteriorates such that continued attendance is impossible, should be removed from success rate and retention data (a sketch of this adjustment follows).
- Your thoughts, ideas and suggestions would be welcomed.
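As a small illustration of the second recommendation (field names here are invented for the example, not taken from any ILR or FfE specification), such learners would be filtered out before any retention or success rate is computed:

```python
# Illustrative learner records; the field names are assumptions.
learners = [
    {"id": "L001", "completed": True,  "unavoidable_withdrawal": False},
    {"id": "L002", "completed": False, "unavoidable_withdrawal": True},
    {"id": "L003", "completed": False, "unavoidable_withdrawal": False},
]

# Exclude learners whose withdrawal was unavoidable (death or serious
# deterioration in health) before calculating retention, per the
# recommendation above.
in_scope = [l for l in learners if not l["unavoidable_withdrawal"]]
retention = 100 * sum(l["completed"] for l in in_scope) / len(in_scope)
print(f"Retention: {retention:.1f}% of {len(in_scope)} in-scope learners")
```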

Points to consider
- Learners’ individual learning goals and ECM outcomes?
- Are your RARPA processes robust? How do you achieve this?
- The concept of ‘ahead/over target’, ‘in line with/on target’ and ‘behind/under target’
- Consider the use of percentages in data collection
- What would you change about or add to the pilot?
- Any further suggestions, recommendations or comments?

Thank you