A realistic look at open government data
Sharon Dawes

Open data philosophy
If government
– publishes its data in structured, machine readable form
– provides easy one-stop public access to all data from all departments
– without fees or other restrictions on access or use
Then
– social, economic, and democratic benefits will flow to society
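As a concrete illustration of what "structured, machine readable" publication with "one-stop public access" looks like from the consumer side, here is a minimal sketch (my addition, not part of the original slides). It assumes a CKAN-style catalog API of the kind behind many national portals; the portal URL is a placeholder, and the exact response fields can vary by portal.

```python
# Minimal sketch of consuming a CKAN-style open data portal (hypothetical URL).
import requests

PORTAL = "https://data.example.gov"  # placeholder portal address

# Search the catalog for data sets matching a keyword.
resp = requests.get(f"{PORTAL}/api/3/action/package_search",
                    params={"q": "housing", "rows": 5}, timeout=30)
resp.raise_for_status()

for dataset in resp.json()["result"]["results"]:
    print(dataset["title"])
    for resource in dataset.get("resources", []):
        # Each resource is a downloadable file in a stated format (CSV, JSON, ...).
        print("  ", resource.get("format"), resource.get("url"))
```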

A built-in problem
Supply-side perspective + Myths + Ambiguity = Disappointing results

Supply-side perspective
Focus is mainly on what government does or should do:
– Adopt open data policies
– Devise and implement open data practices
– Publish the data
– Step aside and great things will happen
Problem 1: a government-centric view
Problem 2: ignores the limits of resources and capabilities
Problem 3: promises value will be created by someone else

Myths (Janssen, Charalabidis, & Zuiderwijk, 2012)
– Data publication = data use = public benefits
– All government needs to do is publish data
– All constituents can use published data
– All data should be published without restriction
– Open data will produce open government
Problem 1: simplistic
Problem 2: naïve
Problem 3: magical thinking

Ambiguity
About the purpose of open data (Yu & Robinson, 2012):
– Transparency and accountability (to see what the government is doing and how it is doing it)
– Economic and social development (to create new products and services for society)
About who is able to use open data:
– Rhetoric is about "citizens"
– Actual users are expert analysts and application developers
Problem 1: sends mixed messages to the public
Problem 2: sends mixed messages to administrators

Result: disappointment
– In uptake by citizens and businesses
– In the type of applications produced
– In the sustainability and economic value of applications in the market
– In the effect on openness and democracy

In a nutshell... (cartoon by Sidney Harris, 2012)

An OGD miracle depends on whether...
– The policies and purposes are clear
– The practices are effective
– The data are desirable and of good quality
– The users (analysts and developers) are capable
– The public wants and can use what the users create

A different approach to the data dimension of OGD
– The value of open data lies in data use.
– This value depends on the perspective and capabilities of data users and consumers outside the government.
– The value generated depends on the quality of the data for a given use by a given user or consumer.
– Consequently, there can be no one standard for data quality; rather, data need to be "fit for use" (Wang & Strong, 1996; Ballou & Pazer, 1995).

Data quality challenges
– Conventional wisdom
– Provenance
– Practices
Consequences
– Underuse
– Misuse
– Non-use
– Shifting costs and responsibilities

Conventional wisdom, aka "untested assumptions"
– Quantitative data is "better" than qualitative data
– Digital data is "better" than other formats
– The data you need
  – is available and sufficient
  – objectively neutral
  – understandable
  – relevant for your purpose
– Government organizations record and organize their data in the same, predictable way

Provenance, or "where do open data come from?"
– Administrative systems
– Embedded in program or service operations
– Governed by specific policies and laws
– Gathered in particular contexts for certain internal purposes
– By people with different kinds and levels of knowledge and expertise

Practices & processes that produce data
– Data definition
– Data collection
– Data management & maintenance
– Documentation
– Audit and quality control (a minimal check is sketched below)
– Change management
– Security
– Priorities & capabilities for all the above
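To make the "audit and quality control" item concrete, here is a minimal sketch (my addition, not from the slides) of a publication-time check of records against an agreed data definition. The field names, types, and the 0–100 score rule are hypothetical examples.

```python
# Minimal sketch: audit published records against a data definition (schema).
from datetime import date

# "Data definition": the fields a published record is supposed to contain.
SCHEMA = {
    "facility_id": str,
    "inspection_date": date,
    "score": int,
}

def audit_record(record: dict) -> list:
    """Return a list of quality problems found in one record."""
    problems = []
    for field, expected_type in SCHEMA.items():
        if field not in record or record[field] is None:
            problems.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            problems.append(f"wrong type for {field}: {type(record[field]).__name__}")
    # Example domain rule: scores are published on a 0-100 scale.
    if isinstance(record.get("score"), int) and not 0 <= record["score"] <= 100:
        problems.append("score out of range")
    return problems

if __name__ == "__main__":
    sample = {"facility_id": "A-17", "inspection_date": date(2013, 5, 2), "score": 104}
    print(audit_record(sample))  # ['score out of range']
```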

Three examples

Example 1: Give me shelter

Example 2: Cadastral records

Example 3: Where does the money go?

Data quality = fitness for use
– Matters most from the user's point of view
– Depends on the user's purpose
– Usually involves trade-offs, e.g., timeliness vs. completeness
(Wang & Strong, 1996; Ballou & Pazer, 1995)

Dimensions of data quality (Pipino et al., 2002)
Accessibility: extent to which data is available, or easily and quickly retrievable
Appropriate Amount of Data: extent to which the volume of data is appropriate for the task at hand
Believability: extent to which data is regarded as true and credible
Completeness: extent to which data is not missing and is of sufficient breadth and depth for the task at hand
Concise Representation: extent to which data is compactly represented
Consistent Representation: extent to which data is presented in the same format
Ease of Manipulation: extent to which data is easy to manipulate and apply to different tasks
Free-of-Error: extent to which data is correct and reliable
Interpretability: extent to which data is in appropriate languages, symbols, and units, and the definitions are clear
Objectivity: extent to which data is unbiased, unprejudiced, and impartial
Relevancy: extent to which data is applicable and helpful for the task at hand
Reputation: extent to which data is highly regarded in terms of its source or content
Security: extent to which access to data is restricted appropriately to maintain its security
Timeliness: extent to which the data is sufficiently up-to-date for the task at hand
Understandability: extent to which data is easily comprehended
Value-Added: extent to which data is beneficial and provides advantages from its use
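Several of these dimensions can be scored as simple ratios (desirable outcomes over total outcomes), in the spirit of Pipino et al.'s assessment approach. The sketch below is my own illustration, not part of the slides; the column names, freshness threshold, and validity rule are hypothetical.

```python
# Minimal sketch: simple-ratio scores for completeness, timeliness, and free-of-error.
from datetime import date, timedelta

ROWS = [
    {"permit_id": "P-001", "issued": date(2013, 1, 4), "value": 12000},
    {"permit_id": "P-002", "issued": date(2011, 6, 30), "value": None},   # missing value
    {"permit_id": "P-003", "issued": date(2013, 2, 11), "value": -500},   # implausible value
]

def completeness(rows, fields):
    """Share of cells that are populated."""
    cells = [row.get(f) for row in rows for f in fields]
    return sum(c is not None for c in cells) / len(cells)

def timeliness(rows, field, max_age_days, today):
    """Share of rows recent enough for the task at hand."""
    fresh = sum(today - row[field] <= timedelta(days=max_age_days) for row in rows)
    return fresh / len(rows)

def free_of_error(rows, field, is_valid):
    """Share of populated values that pass a task-specific validity rule."""
    values = [row[field] for row in rows if row[field] is not None]
    return sum(is_valid(v) for v in values) / len(values)

today = date(2013, 6, 1)
print("completeness: ", completeness(ROWS, ["permit_id", "issued", "value"]))
print("timeliness:   ", timeliness(ROWS, "issued", max_age_days=365, today=today))
print("free-of-error:", free_of_error(ROWS, "value", lambda v: v >= 0))
```

The same data set would score differently for a different user with different thresholds and rules, which is exactly the "fitness for use" point above.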

Metadata (a 637-page example)

Data quality "tools"
For open data providers:
– Appreciate data as an asset, a source of value
– Adopt information policies to preserve and enhance usability
– Create and maintain metadata to support unknown users (a minimal example follows below)
– Adopt stewardship practices
For open data users:
– Be skeptical, ask questions
– Demand good quality metadata
– Understand the nature and context of the data
– Use data sets with caution
– Combine data sets with great caution
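As a minimal example of metadata that supports unknown future users, the sketch below (my addition, not from the slides) publishes a small machine-readable record alongside a data set and checks that the elements a stranger would need are present. The field names are loosely modeled on common catalog vocabularies such as DCAT; all values are hypothetical.

```python
# Minimal sketch: a machine-readable metadata record plus a required-elements check.
import json

metadata = {
    "title": "Restaurant inspection results",
    "description": "Routine and complaint-driven inspections of food service permits.",
    "publisher": "City Department of Health",          # who collected it, and why
    "collection_context": "Recorded by inspectors during site visits; scores entered "
                          "into the permitting system within 5 business days.",
    "temporal_coverage": "2010-01-01/2013-05-31",
    "update_frequency": "monthly",
    "fields": {
        "facility_id": "Permit identifier; stable across years",
        "score": "0-100; methodology changed in 2012, so scores are not comparable across that break",
    },
    "known_limitations": ["Closed facilities are dropped, not flagged"],
    "license": "Open, no restrictions on reuse",
    "contact": "opendata@example.org",
}

REQUIRED = ["title", "publisher", "collection_context", "update_frequency",
            "fields", "known_limitations", "license", "contact"]

missing = [k for k in REQUIRED if not metadata.get(k)]
print("missing metadata elements:", missing or "none")
print(json.dumps(metadata, indent=2))
```

The required-elements check is the provider-side counterpart of the advice to users above to demand good quality metadata.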

What else can governments do?
– Engage with the civic technology community about data needs and data problems
– Set publication priorities around known data needs of potential users and beneficiaries
– Support topical data communities
– Direct contests or challenges toward solving specific public problems (e.g., affordable housing, transportation congestion, neighborhood safety)

(Closing diagram: data in relation to management, technology, policy, and context)

Thank you