MIS5101: Business Intelligence Outcomes Measurement and Data Quality



Data Quality
The degree to which the data reflects the actual environment.
- Do we have the right data?
- Is the collection process reliable?
- Is the data accurate?
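The three questions above can be partly automated. A minimal sketch, assuming records arrive as plain Python dictionaries; the `store` and `revenue` fields and the valid range are hypothetical examples, not from the slides:

```python
def check_quality(records, required_fields, valid_range):
    """Flag records that are incomplete or outside the expected range."""
    issues = []
    for i, rec in enumerate(records):
        # Completeness: do we have the right data?
        missing = [f for f in required_fields if rec.get(f) is None]
        if missing:
            issues.append((i, "missing " + ", ".join(missing)))
            continue
        # Accuracy: is the value plausible for the actual environment?
        lo, hi = valid_range
        if not lo <= rec["revenue"] <= hi:
            issues.append((i, "revenue out of range"))
    return issues

records = [
    {"store": "A", "revenue": 1200.0},
    {"store": "B", "revenue": None},   # incomplete record
    {"store": "C", "revenue": -50.0},  # implausible value
]
print(check_quality(records, ["store", "revenue"], (0, 1_000_000)))
```

Checks like this catch obvious problems automatically; the reliability of the collection process itself still has to be assessed separately.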

Finding the right data
- Key Performance Indicator (KPI)
- Consistent with the goals of the analysis
- Measures what it claims to measure
- Include analysts during data selection
Adapted from http://www2.ed.gov/about/offices/list/os/technology/plan/2004/site/docs_and_pdf/Data_Quality_Audits_from_ESP_Solutions_Group.pdf

KPI Criteria: SMART
- Specific purpose for the business
- Measurable
- Achievable by the organization
- Relevant to success
- Time-phased
What do these have in common?
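One way to keep the SMART criteria from becoming an afterthought is to make each criterion an explicit field of the KPI definition, so a KPI cannot be created without them. A hypothetical sketch; the field names and the example KPI are illustrative, not a standard:

```python
from dataclasses import dataclass

@dataclass
class KPI:
    name: str
    specific_purpose: str  # Specific: tied to a business purpose
    unit: str              # Measurable: a defined unit of measure
    target: float          # Achievable: a target the org can reach
    linked_goal: str       # Relevant: the strategic goal it serves
    review_period: str     # Time-phased: when it is measured

on_time = KPI(
    name="On-time delivery rate",
    specific_purpose="Reduce late shipments to key accounts",
    unit="percent of orders",
    target=95.0,
    linked_goal="Customer retention",
    review_period="monthly",
)
```

Because every field is required, a KPI missing any SMART attribute fails at construction time rather than surfacing later in the analysis.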

Ensuring accuracy
- Know where the data comes from
- Manual verification through sampling
- Use of knowledge experts
- Verify calculations for derived measures
How do these impact the balance with access?
Adapted from http://www2.ed.gov/about/offices/list/os/technology/plan/2004/site/docs_and_pdf/Data_Quality_Audits_from_ESP_Solutions_Group.pdf
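The sampling and derived-measure checks above can be sketched in a few lines. The `total` and `orders` fields and the average-order-value measure are hypothetical examples, not part of the slides:

```python
import random

def spot_check(rows, k, seed=0):
    """Draw a reproducible sample for manual review by a knowledge expert."""
    rng = random.Random(seed)  # fixed seed so the sample can be re-audited
    return rng.sample(rows, min(k, len(rows)))

def verify_avg_order_value(rows, stored_avg, tolerance=0.01):
    """Recompute a derived measure from raw data and compare to the stored value."""
    recomputed = sum(r["total"] for r in rows) / sum(r["orders"] for r in rows)
    return abs(recomputed - stored_avg) <= tolerance * abs(stored_avg)

rows = [{"total": 500.0, "orders": 5}, {"total": 300.0, "orders": 3}]
print(verify_avg_order_value(rows, stored_avg=100.0))  # 800 / 8 = 100.0
```

Each verification step adds friction, which is exactly the tension with access: the more gates the data passes through, the longer it takes to reach decision-makers.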

Reliability of the collection process
- Build fault tolerance into the process
- Check logs (if you can)
- Periodically run reports and verify results
- Keep up with (and communicate) changes
What can you do when you find an error after collection has begun?
Adapted from http://www2.ed.gov/about/offices/list/os/technology/plan/2004/site/docs_and_pdf/Data_Quality_Audits_from_ESP_Solutions_Group.pdf
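Fault tolerance and logging in a collection process might look like the following sketch; `fetch` is a hypothetical stand-in for whatever actually pulls the data from the source system:

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("collector")

def collect(fetch, retries=3):
    """Retry transient failures and log every attempt, so errors found
    after collection has begun can be traced back through the logs."""
    for attempt in range(1, retries + 1):
        try:
            data = fetch()
            log.info("collected %d records on attempt %d", len(data), attempt)
            return data
        except OSError as exc:  # transient source failure
            log.warning("attempt %d failed: %s", attempt, exc)
    log.error("collection failed after %d attempts", retries)
    return None
```

Logging each attempt matters as much as the retry itself: when an error is discovered after collection has begun, the log is often the only way to establish which records were affected.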

Evaluating System Usability
- What are some good KPIs? Why are they good?
- Why doesn't everyone rigorously measure usability?
- What are the benefits of compound metrics?
- Are goal-based comparisons better than expert-based comparisons?