How to Use Mistakes to Improve Credibility
By Leon Østergaard, Statistics Denmark
UNECE Work Session on Statistical Dissemination and Communication
12-14 September 2006, Washington D.C., USA

The problem with errors
Knowledge is not gathered systematically
– We lack the information necessary to learn from our mistakes
– We are not able to quality-label our statistics
Hiding mistakes puts credibility at risk
– Not admitting errors is contrary to user experience
– Disguising errors is even worse!

Making mistakes visible
Gathering information on errors
– Being able to learn from mistakes
– Making mistakes visible internally
Publishing errors loud and clear
– A chance to show that errors are not hidden
– The error in itself will not add to credibility, but a limited number of errors, corrected loud and clear, will increase credibility!

Classification of errors
Blemishes
– Users not affected
– Never in statistical figures
Minor errors
– Unlikely that users are misled
– Only minor groups at risk
Serious errors
– Real possibility that more than minor groups are misled
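
As a minimal sketch, this three-level classification can be represented as a simple enumeration; the Python form and the name ErrorSeverity are illustrative, not part of the Statistics Denmark tool:

```python
from enum import Enum

class ErrorSeverity(Enum):
    """Three-level classification of errors in published statistics."""
    BLEMISH = 1  # users not affected, never in statistical figures
    MINOR = 2    # unlikely that users are misled, only minor groups at risk
    SERIOUS = 3  # real possibility that more than minor groups are misled
```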

Blemishes
Users not affected
Never in statistical figures
Examples:
– Faulty links in PDF documents
– Reference period not updated in a figure (but updated everywhere else)
– Spelling and punctuation errors

Dealing with blemishes
Corrected online
Print not corrected
No mention of the correction

Minor errors
Unlikely that users are misled
Only minor groups at risk
Examples:
– Percentage changes in regional unemployment were wrong. Comment and actual unemployment figures were correct
– Two nationalities were mixed up in statistics on overnight stays. Comment and all other figures were correct
– In a table on males and females, ’total’ and ’females’ were mixed up. Comment and all other figures were correct

Dealing with minor errors
Corrected online
Note inserted in the corrected statistics
Note inserted in the list of statistics
Print not corrected (but books include a link to the updated version)
No action taken to actively forward information

Serious errors
Real possibility that more than minor groups are misled
Examples:
– Error in both total and regional figures on overnight stays
– Unemployment figures by sex and age groups were wrong. Comment and other figures were correct
– Sale of beer in 2005 dropped by 0.8 per cent, not by 26 per cent(!)

Dealing with serious errors
Corrected online – or removed
Note inserted on the website front page
Paragraph on the error inserted in the corrected statistics
Note inserted in the list of statistics
All known users receive the corrected statistics
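
Taken together, the three "Dealing with …" slides amount to a dispatch on error severity. A minimal sketch of that mapping, with the action wording taken from the slides and the data structure itself only illustrative:

```python
# Corrective actions per severity level, as listed on the preceding slides.
CORRECTIVE_ACTIONS = {
    "blemish": [
        "correct online",
        "do not correct print",
        "do not mention the correction",
    ],
    "minor": [
        "correct online",
        "insert note in the corrected statistics",
        "insert note in the list of statistics",
        "do not correct print (books link to the updated version)",
        "take no action to actively forward information",
    ],
    "serious": [
        "correct online, or remove",
        "insert note on the website front page",
        "insert paragraph on the error in the corrected statistics",
        "insert note in the list of statistics",
        "send corrected statistics to all known users",
    ],
}

def actions_for(severity: str) -> list[str]:
    """Return the corrective actions prescribed for a severity level."""
    return CORRECTIVE_ACTIONS[severity]
```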

Collecting information on errors
User interface on intranet asking:
– Type of publication?
– Description of error?
– Discovered when and by whom?
– Classification of error?
– Causes and responsibility?
– Actions taken to correct and when?
– Actions to prevent future errors?

Collecting information on errors
Reporting done by news release editor
Information stored automatically
E-mails generated to statisticians, communication and management
Transparency of the process
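
A minimal sketch of the record such an intranet form might produce, mirroring the questions on the previous slide; the field names and Python representation are assumptions, not the actual Statistics Denmark implementation:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ErrorReport:
    """One registered error, mirroring the questions in the intranet form."""
    publication_type: str           # type of publication?
    description: str                # description of error?
    discovered_on: date             # discovered when ...
    discovered_by: str              # ... and by whom?
    classification: str             # blemish, minor or serious
    causes_and_responsibility: str  # causes and responsibility?
    corrective_actions: str         # actions taken to correct and when?
    preventive_actions: str         # actions to prevent future errors?

# On submission the record is stored automatically and notification e-mails
# are generated to statisticians, communication and management.
```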

Setting a standard for errors
Blemishes do not count
In 2010: minor or serious errors in less than 1 per cent of statistics
In 2006: minor or serious errors in less than 1.5 per cent of statistics
Not more than 15 errors in 2006
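
If the 15-error ceiling and the 1.5 per cent threshold are read as referring to the same set of releases, the 2006 standard implies on the order of 1,000 statistical releases per year; that figure is an inference, not something the presentation states. A small sketch of the check:

```python
# 2006 standard: minor or serious errors in less than 1.5 per cent of
# statistics, and not more than 15 errors in total.
max_errors_2006 = 15
max_error_rate_2006 = 0.015

# Implied number of releases if both limits describe the same set.
implied_releases = max_errors_2006 / max_error_rate_2006
print(implied_releases)  # 1000.0
```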

Preliminary statistics on errors
First seven months of 2006:
– 19 blemishes
– 5 minor errors
– 7 serious errors
Forecast for 2006:
– Blemishes in 3 per cent of publications
– Errors in 2 per cent of publications
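
The forecast is roughly reproducible from the seven-month counts by simple linear extrapolation; the presentation does not state its method, and the figure of about 1,000 publications per year is an assumption carried over from the standard on the previous slide:

```python
months_observed = 7
blemishes, minor, serious = 19, 5, 7
publications_per_year = 1000  # assumed, implied by the 2006 standard

# Annualize the seven-month counts by simple linear extrapolation.
blemishes_per_year = blemishes * 12 / months_observed       # about 32.6
errors_per_year = (minor + serious) * 12 / months_observed  # about 20.6

print(round(100 * blemishes_per_year / publications_per_year, 1))  # 3.3 per cent
print(round(100 * errors_per_year / publications_per_year, 1))     # 2.1 per cent
```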

What’s next?
Using experience to reduce the occurrence of errors
Develop similar tool for collecting information on errors in databanks
Setting a standard for errors in databanks
Continued monitoring of credibility
More in-depth monitoring