Outputs, Outcomes and Impact
Andrew Harris – Derwen College
Fiona Voysey – National Star College

Natspec and QSR
A national ISC PRD group (National Star, RNCB, Treloars, Henshaws & Derwen). The data would be used to benchmark across the sector, including GFE LLDD provision; in addition, the data may be included in the ILR and FfE. For a short period the group was joined by representatives from AoC, NASS & Ofsted.

Outline
- To explore the concept of outcomes and how they relate to ISCs politically and in practice
- To consider the current Natspec QSR project and outcomes for learners with complex needs

How do you know your college is good?
- Ofsted / CSCI inspection
- Internal QA
- The progress young people make with us

But…
- No national framework for outcomes
- What providers value as outcomes are not always the priorities of others
- Confusion and some disagreement over what we should be measuring and how

Why measure outcomes?
- To show the effect a placement at your college has on a young person
- To show potential placers that you provide a high-quality service
But: can you meet both functions with the same set of data?

The Project
The challenge: to produce quantitative data on personal/individual success that allows comparison across providers.
The proposal: to produce annual data on the achievement of predicted Every Citizen Matters (ECM) outcomes for individual learners, identifying the numbers and percentages of learners who are 'ahead/over', 'in line with/on' or 'behind/under' the learning needed to meet their goal, amalgamated for the provider as a whole and against each ECM theme.
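A minimal sketch of the kind of aggregation the proposal describes: counting and calculating the percentage of learners judged 'ahead', 'in line' or 'behind' their predicted outcomes, both provider-wide and per ECM theme. This is not part of the original presentation; the record structure, field names and theme labels are assumptions made purely for illustration.

```python
from collections import Counter, defaultdict

# Hypothetical learner records: one progress judgement per learner per ECM theme.
# All field names and values are illustrative assumptions, not the project's data model.
records = [
    {"learner": "A", "theme": "Be healthy",        "progress": "ahead"},
    {"learner": "A", "theme": "Enjoy and achieve", "progress": "in line"},
    {"learner": "B", "theme": "Be healthy",        "progress": "behind"},
    {"learner": "B", "theme": "Enjoy and achieve", "progress": "in line"},
]

def summarise(judgements):
    """Counts and percentages of judgements that are ahead / in line / behind."""
    counts = Counter(judgements)
    total = sum(counts.values())
    return {
        band: {"count": counts.get(band, 0),
               "percent": round(100 * counts.get(band, 0) / total, 1)}
        for band in ("ahead", "in line", "behind")
    }

# Amalgamated for the provider as a whole.
print(summarise(r["progress"] for r in records))

# Against each ECM theme.
by_theme = defaultdict(list)
for r in records:
    by_theme[r["theme"]].append(r["progress"])
for theme, judgements in by_theme.items():
    print(theme, summarise(judgements))
```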

The Benefits
- Allows personalisation within a nationally recognised framework for consistency (RARPA plus ECM)
- Measures success in outcomes which are valuable to learners and which are controlled by the provider
- Supports self-assessment and evidences 'distance travelled'/value added
- Does not prescribe or constrain curriculum offer, programme or provider type, and enables links to FL
- Links to local authority outcomes and Ofsted inspection
- Measures success in outcomes which are valued by stakeholders and commissioners

Steps…
- Pilots to test the process and establish guidance and criteria for levels of performance
- Guidance on the process, including what might be included under each ECM outcome, including PIs
- Guidance on how best to contextualise the data, including use of evaluative criteria based upon the CIF
- Parameters for small numbers of learners
- Clarify definitions of complex needs and of the learners for whom this approach is appropriate
- Validation and quality assurance (requires robust RARPA processes and self-assessment, with validation through peer review and external tests through Ofsted)
- Establish links to the ILR

Other recommendations
- The use of destinations against predictions could be a useful indicator, but should not be used as a measure of success, as there are too many issues outside the control of the provider.
- Students who die, or whose health deteriorates such that continued attendance is impossible, should be removed from success rates and retention data.
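A brief sketch, again not from the original presentation, of how such exclusions might be applied before retention is calculated; the field names and withdrawal-reason values are assumptions for illustration only.

```python
# Hypothetical cohort records; field names and withdrawal reasons are assumptions.
cohort = [
    {"learner": "A", "completed": True,  "withdrawal_reason": None},
    {"learner": "B", "completed": False, "withdrawal_reason": "health deterioration"},
    {"learner": "C", "completed": False, "withdrawal_reason": "transferred"},
]

# Learners who die or whose health makes continued attendance impossible
# are taken out of scope before rates are calculated.
EXCLUDED_REASONS = {"died", "health deterioration"}

in_scope = [r for r in cohort if r["withdrawal_reason"] not in EXCLUDED_REASONS]
retention = sum(r["completed"] for r in in_scope) / len(in_scope)
print(f"Retention (after exclusions): {retention:.0%}")
```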

Conclusions – June 2010
- Learners' individual learning goals (ILGs) and ECM outcomes?
- Are your RARPA processes robust? How do you achieve this?
- The concept of 'ahead/over target', 'in line with/on target' and 'behind/under target'
- Consider the use of percentages in data collection.