User Views on Quality Reporting
Sarah Green, Jacqui Jones & John-Mark Frost
UK Office for National Statistics
Aim of Presentation
To provide an overview of:
- The standard quality reporting system in ONS
- Initial results of a user consultation on the use and usefulness of Summary Quality Reports
Overview of Presentation
- Quality Reporting in ONS
- Initial User Views
- User Consultation
  - Driver
  - Methodology
  - Results
- Proposals for the Way Forward
Quality Reporting in ONS
What are quality reports?
- Based on the European standards
- Structured according to ESS dimensions of quality
Content?
- Describe the methods used to compile the output
- Defined Key Quality Measures (KQMs): a subset of output quality measures as defined by ONS
Audience?
- Primarily external users
Purpose?
- Assess strengths and limitations of outputs
- Promote users' knowledge of, and trust in, the outputs
ONS Standard Quality Reports
Basic Quality Information (BQI)
- Focused, release-specific quality information
- Mostly quantitative
- Integrated into the actual release
- Includes revisions analysis to support accuracy
Summary Quality Report (SQR)
- Static, stand-alone reports relating to the whole series of releases
- Updated only where necessary
- Highlights strengths and limitations of the data
Speaker notes: Provide examples of a BQI and an SQR, and hand out here for reference over the next slides. Explain how the two fit together. Both are kept to a minimum so that they are not overly burdensome for output providers. SQRs are about four pages long on average, and contain numerous links to other sources of information for more interested users.
Initial User Views
- Pilot of Summary Quality Reports in 2005 provided user feedback from the Bank of England and HM Treasury
- Positive: reduced the time users needed to spend analysing data and making data quality assessments
- ONS staff also highlighted benefits: a reduction in the number of data queries over the phone from users
Speaker notes: Both organisations are key users of ONS National Accounts outputs, which have associated Standard Quality Reports, and the feedback from both was very positive. Both felt that the system of reports, across the range of the UK's statistical outputs, aided them greatly as users, reducing the time they needed to spend analysing the ONS releases and making data quality assessments themselves. There was also a reduction in the number of data queries over the phone from users, which they feel is probably a direct result of having the information contained in the reports readily available and accessible.
Driver for Further User Consultation
- Low number of web visits recorded. Possible explanations:
  - Problems with access?
  - Lack of interest in Summary Quality Reports?
  - Summary Quality Reports not meeting user needs?
- Resource intensive to produce
- No user review conducted since the pilot, which covered expert users only
- Assess: the needs of all users, and how far Summary Quality Reports meet these
Methodology for User Consultation
Considered how to assess:
- Identification of, and contact with, users
- Data collection mode and design
Respondents:
- Known users
- Professional government groups
- Those who wished to participate having been invited to by the ONS Contact Centre
- Those who wished to participate having seen the questionnaire on the ONS website
A user satisfaction survey was designed and dispatched electronically, making it financially viable to distribute widely. The survey achieved a good response: 138 responses.
Results from User Consultation (1)
- Only 30% of respondents had previously seen an SQR
- Consider: access and usage
User Views on Access
[Chart: how respondents access Summary Quality Reports; visible label: 'web']
User Views on Usage
[Chart: how respondents use Summary Quality Reports]
'Other' responses included:
- "We are considering whether to include the material (or links to it) in our Knowledge Management system for pensions analysts"
- To find a contact name for further information
- UKSA Assessments
Results from User Consultation (2)
Having looked at a Summary Quality Report, most respondents felt it was easy to:
- Follow the layout
- Understand the language and content
There was evidence that they had a little more trouble:
- Clearly determining the limitations of the statistics
- Interpreting the statistics in more detail
- Determining whether the statistics were fit for their purposes
User Specified Changes to SQRs
What, if anything, would you like to change about Summary Quality Reports?
- 21% of respondents would change the access
- 23% would change the level of detail
- 26% would change nothing at all
- 33% commented on specific changes
Key themes from comments:
- Split opinion on the level of detail and technicality
- Accessibility specifically raised as a problem
Proposals for the Way Forward
- Continue to produce Summary Quality Reports: web visits are low, but when the reports are accessed they are thought to be useful
- Better accessibility: ensure the reports are highly visible on output product pages
- Opinion is split on the level of detail and technicality, so key points need to be highlighted more readily within reports
- Potential for an 'Executive Summary' section in reports
Thank you
sarah.green@ons.gov.uk