On the ethics of data: how are approaches to data collection driving institutional practices?
Professor Jacqueline Stevenson, Sheffield Hallam University
Twitter: ProfJStevenson

Overview of talk
- A bit about me
- (Un)ethical data collection approaches
- Analytical maturity
- Developing an analytical ecosystem
- Involving staff and students
- Key principles

"Torture the data, and it will confess to anything."
Ronald Coase, winner of the Nobel Prize in Economics

(Un)ethical data collection
- Lack of transparency
- Ignoring data that doesn't fit with what the HEI wants to do
- Students involved only to 'rubber stamp' decisions
- Lack of a data communication plan
- Data being 'made' to perform purely political work (reputation, competition)

(Un)ethical data collection
- Power/dominance over the questions asked
- Collating 'pointless' data
- Questionable validity of data
- Valorising only certain forms of data
- Under-use of data (data silos)
- Confirmation-driven rather than curiosity-driven approaches
- Lack of predictive modelling

Analytical maturity

Challenged
- Level 1. Individual level: individuals own and control data and use it to tackle day-to-day functional issues. Firefighting mode, project to project. Little or no support or technology for a culture of evidence.
- Level 2. Departmental level: departments take control of their information and start to produce performance reports and metrics for their function; systems are isolated into information silos and not well aligned at the institution level.

Foundational
- Level 3. Enterprise level: the institution integrates information from across functional areas into an institution-wide information environment with clear support from leadership; reporting and analysis are effective and accurate, and used to make decisions. Clear internal information chain.

Progressive
- Level 4. Optimisation level: quality data and advanced analytical capabilities are used to optimise outcomes across the institution, leading to tangible improvements in key functions and metrics.
- Level 5. Innovation level: data supports new ways to achieve priorities and enhance success.

Source: SAS Organization Maturity Model, https://www.sas.com/content/dam/SAS/en_us/doc/whitepaper1/increasing-student-success-with-big-data-in-education-108483.pdf
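One way to make the model concrete: the sketch below encodes the five levels as a simple Python structure, with a 'weakest link' rule for an overall rating. This is purely illustrative; the level names follow the slide, but the function, the rule, and the example scores are assumptions, not part of the SAS model.

# Illustrative sketch only: the five maturity levels as a data structure.
from enum import IntEnum

class AnalyticalMaturity(IntEnum):
    INDIVIDUAL = 1    # Challenged: individuals own data, firefighting mode
    DEPARTMENTAL = 2  # Challenged: departmental reports, information silos
    ENTERPRISE = 3    # Foundational: institution-wide environment
    OPTIMISATION = 4  # Progressive: advanced analytics optimise outcomes
    INNOVATION = 5    # Progressive: data drives new priorities

def overall_maturity(assessments):
    """Assumed rule: an institution is only as mature as its least
    mature function, since silos undercut enterprise-wide claims."""
    return min(assessments.values())

scores = {  # hypothetical self-assessment by functional area
    "admissions": AnalyticalMaturity.OPTIMISATION,
    "teaching": AnalyticalMaturity.DEPARTMENTAL,
    "estates": AnalyticalMaturity.ENTERPRISE,
}
print(overall_maturity(scores).name)  # -> DEPARTMENTAL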

Developing an analytical ecosystem
- Is the system collaboratively planned by all stakeholders?
- How does it feed into annual planning rounds?
- What level of granularity is supported?
- Do we understand the 'why'?
- How is qualitative data being fed in?
- How is data being used to inform change? (And how are these decisions being made, and by whom?)
- How is data being used to assess 'what works'? (And how is 'what works' understood in relation to sub-groups? See the sketch below.)
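That last question has a concrete analytic counterpart: disaggregating any outcome measure by sub-group before declaring that something 'works'. A minimal pandas sketch, in which the data and the column names (intervention, group, completed) are hypothetical:

# Minimal sketch: check 'what works' per sub-group, not just in aggregate.
import pandas as pd

df = pd.DataFrame({
    "intervention": ["mentoring"] * 6 + ["none"] * 6,
    "group":        ["commuter", "resident"] * 6,
    "completed":    [1, 1, 0, 1, 1, 1, 1, 0, 0, 1, 0, 0],
})

# The aggregate effect can mask divergent effects for sub-groups.
print(df.groupby("intervention")["completed"].mean())
print(df.groupby(["intervention", "group"])["completed"].mean())

An intervention that lifts completion overall may do nothing, or even harm, for one sub-group; only the disaggregated view shows this.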

Involving staff and students
- What access do staff and students have to data? Are they able to fully interrogate and understand it?
- What is the culture within which staff and students engage in discussions of data?
- How transparent is the data? What is 'hidden', and why?
- Are staff and/or students empowered to act on findings?
- How are recommendations for action disseminated up and down?
- How are decisions about acting/not acting on findings made, and by whom? How can this be challenged?
- How is the communication loop closed?

Key principles: institutions
- Accept, interrogate and act on data.
- Devolve responsibility for reviewing data and implementing change.
- Combine institutional data with formal research and informal dialogue with staff and students.
- Monitor student behaviour and performance, not characteristics used to label students 'at risk' (see the sketch below).
- Have action plans ready.
- Evaluate using qualitative and survey methods as well as institutional data.
- Share effective approaches with colleagues, especially in cognate disciplines.
- Treat students as partners.

Source: Thomas and Jones, https://www.birmingham.ac.uk/Documents/college-eps/college/stem/using-data-he-strem-transition.pdf
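The 'behaviour, not characteristics' principle translates directly into feature selection. In the sketch below, everything (the column names, the thresholds, the two-of-three rule) is a hypothetical illustration; the point is simply that the flag is built from what students do, never from who they are.

# Sketch: an 'at risk' flag built only from behavioural signals;
# demographic characteristics are deliberately never used as features.
# All column names and thresholds here are hypothetical.
import pandas as pd

def flag_at_risk(students):
    low_attendance = students["attendance_rate"] < 0.6
    low_engagement = students["vle_logins_per_week"] < 2
    late_work = students["on_time_submissions"] < 0.5
    # Flag when at least two behavioural signals fire, so a single
    # noisy measure does not label a student on its own.
    return (low_attendance.astype(int) + low_engagement + late_work) >= 2

students = pd.DataFrame({
    "attendance_rate":     [0.9, 0.5, 0.4],
    "vle_logins_per_week": [5, 1, 3],
    "on_time_submissions": [1.0, 0.4, 0.3],
})
print(flag_at_risk(students))  # False, True, True

Pairing such a flag with the ready-made action plans above matters: a flag that triggers no support is just a label.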

Key principles: students

Individual approaches (adapted from https://www.kqed.org/mindshift/53426/four-research-based-strategies-to-ignite-intrinsic-motivation-in-students):
- Build students' capacity to access, analyse, and use data.
- Enable them to use data to identify their strengths, weaknesses, and patterns in order to improve their work.
- Support them to analyse their own progress (see the sketch below).
- Encourage them to use data to set goals and reflect on their progress over time.
See https://www.srhe.ac.uk/downloads/reports-2016/LizBennet-scoping2016.pdf

Develop Participatory Action Research (PAR) approaches to effect institutional change. PAR is driven by participants and based on their own concerns; it is therefore a form of action research built on research and action with people rather than simply for people. Durham University offers a very helpful guide on PAR and how to develop a PAR approach: https://www.dur.ac.uk/resources/beacon/PARtoolkit.pdf
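To make 'analyse their own progress' concrete, here is a purely illustrative sketch in which a student's reference point is their own earlier work rather than their peers; the function name, window size, and scores are all assumptions.

# Illustrative sketch: a student-facing summary comparing recent work
# to the student's own earlier work, not to other students.
from statistics import mean

def progress_summary(scores, window=3):
    if len(scores) <= window:
        return "Not enough data yet to see a trend."
    earlier, recent = mean(scores[:-window]), mean(scores[-window:])
    direction = "up" if recent > earlier else "down" if recent < earlier else "steady"
    return f"Recent average {recent:.1f} vs earlier {earlier:.1f} ({direction})."

print(progress_summary([52, 55, 58, 61, 60, 66]))  # trending up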

"The whole enterprise of teaching managers is steeped in the ethic of data-driven analytical support. The problem is, the data is only available about the past. So the way we've taught managers to make decisions and consultants to analyze problems condemns them to taking action when it's too late."
Clayton M. Christensen, management professor at Harvard

Concluding thoughts
- What is our real purpose in gathering data, engaging with students, and thinking about success?
- Does that purpose really have students' best interests at its heart?
- And if it does not, what should we be doing differently?