Xxxxxxxxxxxxx Name: xxxxxxxxxxxx ID: 2005HZxxxxxxx Organization: xxxxxxxxxxx Company Location: Bangalore.

Introduction

The LaQuinta Enterprise Reporting Architecture is a strategic and tactical BI platform for the analysis and reporting of LaQuinta enterprise data. The following business groups are the major users:
- Operations
- Marketing
- Sales
- Finance
- Revenue Management
- Call Center
- Information Technology

Architecture

The Enterprise Architecture is based on a 4-tier structure with "LaQuinta Data Warehouse 2.0" as the focal point. The four tiers are:
- Legacy and Transaction Source Systems
- Data Staging
- LaQuinta Data Warehouse
- Cognos Presentation

Fact Tables

LAQ DW 2.0 adds five new fact tables and enriches the existing fact table with additional dimension data. The following fact tables are used in the LaQuinta DW 2.0 project:
- Reservation Fact (exists in DW 1.7)
- Group and Tour Fact
- Central Reservation Office Fact
- Denials Fact
- Call Center Fact
- Budget Fact

Key Features of Version 2.0

- Subject orientation: multiple subjects such as Folio, Call Center, Budgets, Sales, HR, etc.
- Conformed dimensions: dimensions that can be shared across subject areas.
- Multiple aggregations: need-based aggregation for specific summary-level reporting.
- Standard naming conventions across the board.
- Optimized data structures using standard data warehousing principles.
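The value of a conformed dimension is that two fact tables in different subject areas can be reported side by side because they share the same dimension keys. A minimal, purely illustrative sketch (table and column names are invented, not taken from the LaQuinta schema):

```python
import sqlite3

# Illustrative only: a conformed Date dimension shared by two fact tables,
# so "date" means exactly the same thing in every subject area.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, calendar_date TEXT);
CREATE TABLE fact_reservation (date_key INTEGER REFERENCES dim_date, room_nights INTEGER);
CREATE TABLE fact_call_center (date_key INTEGER REFERENCES dim_date, calls_handled INTEGER);
INSERT INTO dim_date VALUES (20060101, '2006-01-01');
INSERT INTO fact_reservation VALUES (20060101, 42);
INSERT INTO fact_call_center VALUES (20060101, 317);
""")

# Because both facts share dim_date, they can be "drilled across" in one query.
row = con.execute("""
SELECT d.calendar_date,
       (SELECT SUM(room_nights)   FROM fact_reservation r WHERE r.date_key = d.date_key),
       (SELECT SUM(calls_handled) FROM fact_call_center c WHERE c.date_key = d.date_key)
FROM dim_date d
""").fetchone()
print(row)  # ('2006-01-01', 42, 317)
```

The same pattern applies to the shared property, company, and room-type dimensions mentioned elsewhere in this deck.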

Architecture Goals

Improve visibility of Enterprise BI:
- Subject orientation: add new areas for analysis and reporting
- Enable user teams to create strategic and tactical dashboards
- Make available ONE enterprise metadata dictionary (business/functional/operational)
- Consistency in handling time, booking, stay, etc. across the board
- Fix existing problems (data gaps, Cognos issues)
Remove inefficiencies in data structures, ETL, reports, and cubes:
- Standardize DW practices (features, design, naming, reference data, ...)
- Lighter, simpler Cognos presentation layer
- Scalable, flexible structure and processes
- Ease of configuration and maintenance
Build upon the existing foundation.

Why Version 2.0?

LaQuinta DW was the existing data warehouse application, built by another vendor. As the business evolved, many new features were needed; the existing system could not scale to accommodate the changes and also contained a number of issues and bugs. To address this, we planned and developed a new release, LaQuinta DW 2.0. LaQuinta uses Mercury Quality Center (MQC) to log issues, bugs, and suggestions for all its projects.

Project Schedule from MS Project Plan

Month      Week     Task
June       1        Submit the outline
June       2,3      Functional and technical requirement gathering
June       4        Design review
July       1,2      UI design review and assisting with the architectural design
July       3        Designing the database
July       4        Integrate with company standards and codes
August     1,2,3,4  Developing the ETL operations
September  1,2      Unit, peer, and integration testing
September  3        Production deployment
September  4        Production support and bug fixes (regular activity)
October    1        Upload and submit the final report

ETL Tool Used in the Warehouse

The ETL tool used in this application is SQL Server 2000 DTS. The following slides describe the DTS packages used in the LaQuinta Data Warehouse 2.0 project with respect to the Reservation Fact.

Reservation Fact

The transactional data from Night Vision Corporate (NVCorp) is extracted by the master DTS script on the data warehouse server at 7:00 AM EST every day and loaded into the Reservation Fact. Data flows into NVCorp from the Property Management System (PMS) at night; this process is called the Night Audit. The night audit captures every transaction that happens in a PMS and consolidates the data in NVCorp. The following slides detail the sub-packages executed by the master DTS script. The DTS package used for loading the data into the Reservation Fact is LOAD DATA WAREHOUSE.

Use Case Diagram for Reservation Fact

Reservation Fact

The following packages are used with respect to the Reservation Fact:
- Daily Data Warehouse Dimension Load
- Daily Data Warehouse Staging Load
- Transactional Staging Feed
- Transactional Fact Feed
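The master package runs these sub-packages in a fixed order (dimensions before staging, staging before the fact feed). A hypothetical sketch of that control flow, with each DTS sub-package modelled as a plain function (the DTS engine itself is not being reproduced here):

```python
# Hedged sketch: the master package's orchestration, not the real DTS runtime.
log = []

def dimension_load():
    log.append("Daily Data Warehouse Dimension Load")

def staging_load():
    log.append("Daily Data Warehouse Staging Load")

def staging_feed():
    log.append("Transactional Staging Feed")

def fact_feed():
    log.append("Transactional Fact Feed")

# Order matters: dimensions must exist before staging rows reference them,
# and staging must be loaded before the fact feed consumes it.
SUB_PACKAGES = [dimension_load, staging_load, staging_feed, fact_feed]

def load_data_warehouse():
    """Run each sub-package in sequence, mimicking DTS precedence constraints."""
    for step in SUB_PACKAGES:
        step()

load_data_warehouse()
print(log)
```

In the real system, this sequencing is enforced by precedence constraints inside the LOAD DATA WAREHOUSE master DTS package.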

Reservation Fact

Daily Data Warehouse Dimension Load
This is a sub-package of the master DTS package that runs on a daily basis to load the LAQ daily data warehouse fact table. This process loads the following dimensions:
- ROOMTYPE
- COMPANY

Reservation Fact

Daily Data Warehouse Staging Load
Each day the master data warehouse script calls this sub-package, which loads data into the staging environment. The name of this DTS package is Load Static Data to STAGING and it is located on the DW server. Generally the following items are moved into the staging tables:
- Room Type
- CRS Numbers
- Company Data
- Aptech Properties

Reservation Fact

Daily Data Warehouse Staging Feeds
This process loads the following items into the staging tables:
- Daily Folio Charges
- Cash Sale
- Foliolink
- Reservation
- Rate Zero Fact

Reservation Fact

Daily Transactional Fact Feeds
This process loads all the reservation details from the staging tables into the Reservation Fact table. The following items are loaded into the fact table:
- Daily Folio Charges
- Daily Consumed Room Nights
- Daily Consumed Revenue
- Daily Cancels
- Booking Fact
- Booking Horizon

Reservation Fact

Load Data Warehouse Master Script
This is the MASTER DTS package that is executed once a day to load the Reservation Fact. It also loads the Booking Horizon and Booking Pace fact tables on a daily basis. It is a scheduled job on the DW server, set to run each day starting at 7:00 AM Eastern Standard Time. It will only start processing if the Night Audit posting is at least 99% complete for that day. The job it polls for completion is a DTS package called "Load Booking Data"; if that job has not completed by 7:00 AM, the master script does not execute.
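The 7:00 AM gate described above is essentially a threshold check before the load runs. A minimal sketch, assuming the night-audit completion percentage can be read as a number (function names here are invented for illustration):

```python
# Hypothetical sketch of the master script's start-up gate: the load runs
# only if the "Load Booking Data" night-audit posting has reached 99%.
def night_audit_complete(pct_posted: float, threshold: float = 99.0) -> bool:
    """True once the night-audit posting meets or exceeds the threshold."""
    return pct_posted >= threshold

def try_master_load(pct_posted: float) -> str:
    if not night_audit_complete(pct_posted):
        # In the real system the job simply does not execute that day.
        return "skipped: night audit incomplete"
    return "LOAD DATA WAREHOUSE executed"

print(try_master_load(99.4))  # gate passes, load runs
print(try_master_load(97.2))  # gate fails, load is skipped for the day
```

The real gate is implemented by the scheduled job polling the "Load Booking Data" DTS package rather than a numeric parameter, but the decision logic is the same.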

Reservation Fact

Data Audit Tool
This is a daily process that collects occupancy and revenue numbers from the NVCorp, Aptech, and data warehouse tables and compares them by property and stay date. The occupancy and revenue differences between NVCorp and Aptech, and between NVCorp and the data warehouse, are calculated by stay date. The comparison is stored in a table called DATA_AUDIT_TOOL in the staging database on the DW box.
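The audit is a per-(property, stay date) reconciliation between systems. An illustrative sketch in the spirit of the Data Audit Tool, with only the NVCorp-vs-warehouse comparison shown and all figures invented:

```python
# Illustrative only: reconcile revenue per (property, stay_date) between
# NVCorp and the warehouse and record the difference, as the
# DATA_AUDIT_TOOL table does. Property IDs and amounts are made up.
nvcorp = {("LQ1001", "2006-03-01"): 5400.0, ("LQ1002", "2006-03-01"): 3200.0}
dw     = {("LQ1001", "2006-03-01"): 5400.0, ("LQ1002", "2006-03-01"): 3150.0}

audit_rows = [
    {"property": p, "stay_date": d,
     "nvcorp_revenue": nvcorp[(p, d)],
     "dw_revenue": dw.get((p, d), 0.0),
     "diff": round(nvcorp[(p, d)] - dw.get((p, d), 0.0), 2)}
    for (p, d) in sorted(nvcorp)
]
for row in audit_rows:
    print(row)  # a non-zero diff flags a property/stay-date needing investigation
```

The same structure extends to occupancy counts and to the NVCorp-vs-Aptech comparison.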

Group and Tour Fact

The Group and Tour Booking fact is another fact table, used to store only the group bookings made through the call center by the VAS agents.

Use Case Diagram for GNT Fact

Group and Tour Fact

Extract Group Link Data
In NV there are three files of importance: Group Master, Group Header, and Group Link. Every time a property creates a group, it uses the Group Master first. Group information is available only on a daily basis, not by the hour. The hierarchy is Group Master --> Group Header --> Group Link. A group link is similar to a folio link and contains information such as:
- GROUPLINK NUMBER
- SALES PERSON
To reduce the authorized count on a group booking, the associated folio link records have to be cancelled first. The folio number of the group reservation is designated as the group master. To identify the group booking details, we query the foliolink table in the NVCORP database, using the master number and group link number to identify the group.
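The lookup described above walks from a folio back to its group master through the group link. A hypothetical sketch with in-memory stand-ins for the foliolink and group link tables (all record values are invented):

```python
# Hedged sketch of the Group Master -> Group Header -> Group Link hierarchy:
# given a folio, resolve its group master number via the group link record.
group_links = [
    {"grouplink_number": "GL-1", "master_number": "GM-7", "sales_person": "A. Rao"},
    {"grouplink_number": "GL-2", "master_number": "GM-7", "sales_person": "A. Rao"},
]
folio_links = [
    {"folio": "F-100", "grouplink_number": "GL-1"},
    {"folio": "F-101", "grouplink_number": "GL-2"},
    {"folio": "F-200", "grouplink_number": None},  # transient (non-group) stay
]

def group_master_for(folio_no: str):
    """Return the group master number for a folio, or None if it is not a group booking."""
    link = next(f for f in folio_links if f["folio"] == folio_no)
    if link["grouplink_number"] is None:
        return None
    gl = next(g for g in group_links
              if g["grouplink_number"] == link["grouplink_number"])
    return gl["master_number"]

print(group_master_for("F-100"))  # GM-7
print(group_master_for("F-200"))  # None
```

In the real extract this is a join between the foliolink table and the group link data in the NVCORP database rather than a Python lookup.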

CRO Booking and Denials Fact

CRO Booking
The Pegasus BDE file is used to pull daily gross, net, and cancellation figures from the daily extract. The extract is filtered on Reservation Center ID. There are currently five reservation center IDs used to filter call center reservation data: 'RETRC', 'NBCRC', 'LQCRC', 'VAOPS', 'NETRZ'.

Denials
Each day, as part of the Pegasus feeds, we receive the yield records, better known as denials data. LaQuinta receives both system-generated and user-generated denials; the following are typical denial reasons:
- Closed To Arrival
- Location Does Not Meet Needs
- Minimum Length Of Stay Restriction
- No Availability - Hotel Is Sold Out
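The CRO extract step above is a simple membership filter on the five reservation center IDs. A minimal sketch (the sample records are invented; the real input is the Pegasus BDE file):

```python
# Sketch of the CRO extract filter: keep only records whose Reservation
# Center ID is one of the five call-center IDs named on this slide.
CRO_CENTER_IDS = {"RETRC", "NBCRC", "LQCRC", "VAOPS", "NETRZ"}

bde_records = [
    {"res_center_id": "LQCRC", "gross": 120, "net": 110, "cancels": 10},
    {"res_center_id": "WEBXX", "gross": 300, "net": 290, "cancels": 10},  # not a CRO channel
    {"res_center_id": "VAOPS", "gross": 45,  "net": 40,  "cancels": 5},
]

cro_records = [r for r in bde_records if r["res_center_id"] in CRO_CENTER_IDS]
print(len(cro_records))  # 2
```

Everything not matching one of the five IDs (web, GDS, and other channels) falls outside the CRO Booking fact.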

Use Case Diagram for CRO Booking Fact

Call Center Fact

The Call Center Fact is loaded with call center details such as the agent operator who made the reservation, the agent's skill sets, denials information, and the call center location. The daily VAS extract files include all CRO feeds, such as:
- DNIS Reference
- DNIS Activity
- Skillset Reference
- Skillset Activity
- Agent Operator - VAS Agent Mapping

Budget Fact

This ETL process is normally meant to be run on a yearly basis. However, it has been designed to run on demand at any time, and it is capable of detecting and applying changes at any point during the year. The data pertains to the budget values (room revenue, rooms-occupied percentage, and rooms available) for all properties for all 365 days of the year. The process extracts budget data from the Aptech datamart database and loads it into the LAQUINTA_DW data warehouse. The budget data is available only at the property level, in the TransBudget table of the Aptech datamart.
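"Run any time, detect and update changes" is an upsert pattern: compare incoming budget rows against what is already loaded, update rows that changed, and insert new ones. A hypothetical sketch with dictionaries standing in for the TransBudget source and the warehouse target (all values invented):

```python
# Hedged sketch of the budget load's change detection: an upsert keyed on
# (property, budget_date). The real process works against database tables.
target = {("LQ1001", "2006-01-01"): {"room_revenue": 5000.0}}
source = {
    ("LQ1001", "2006-01-01"): {"room_revenue": 5250.0},  # revised mid-year budget
    ("LQ1001", "2006-01-02"): {"room_revenue": 4800.0},  # new budget day
}

updated, inserted = 0, 0
for key, row in source.items():
    if key in target:
        if target[key] != row:   # changed since the last run -> update
            target[key] = row
            updated += 1
    else:                        # not yet loaded -> insert
        target[key] = row
        inserted += 1

print(updated, inserted)  # 1 1
```

Because the check is by key and value, re-running the process with unchanged data is a no-op, which is what makes the on-demand, any-time-of-year design safe.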

Use Case Diagram for Budget Fact

UAT Environment

How was data initialization done to start the project? The following steps were used to initialize all the dimension tables used in the LaQuinta DW 2.0 project:
- Truncate all new tables.
- Populate dimension data for each new dimension table from the existing dimension tables, after applying the required transformation logic.
- Run the fact load process to load the new fact tables using data extracted from the source.

UAT Environment

Method used for data validation. The following methods were used for data validation on both the ETL and Cognos sides.
Cube/report validation:
- Run queries against the data warehouse.
- Values from cubes/reports should match the query results.
ETL validation:
- Run queries against the source data.
- Run queries against the data warehouse.
- These two query results must match.
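The ETL validation rule above reduces to: aggregates computed from the source and from the warehouse must agree exactly. A minimal sketch of that check (metric names and totals are invented):

```python
# Sketch of the ETL validation rule: source-side and warehouse-side query
# totals must match; any disagreeing metric fails validation.
def validate(source_totals: dict, dw_totals: dict) -> list:
    """Return the sorted list of metrics whose source and DW totals disagree."""
    keys = set(source_totals) | set(dw_totals)
    return sorted(k for k in keys if source_totals.get(k) != dw_totals.get(k))

source_totals = {"room_nights": 1042, "revenue": 98310.50}
dw_totals     = {"room_nights": 1042, "revenue": 98310.50}

print(validate(source_totals, dw_totals))  # [] -> validation passed
```

An empty result means the load is valid; any listed metric points at a transformation or load step to investigate, which is how UAT defects were traced back and logged.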

UAT Deployment Diagram

[Deployment diagram] Two ETL processes run from NVCorp (source feeds: NVCorp, BDE, VAS), producing two sets of cubes and reports for next-day UAT comparison at 8 AM CST: the current DW 1.7 (Staging_DW / LaQuinta_DW: 47 reports, 10 cubes) and DW 2.0 (Staging_DWH / LaQuinta_DWH: 72 reports, of which 47 DW and 25 Aptech, and 6 cubes). Each side runs an overnight staging and dimension delta load, a DW load, and a cube spin. Production cubes are copied to the TEST portal and renamed.

ETL Test Results from MQC

LaQuinta uses Mercury Quality Center (MQC) to log issues, bugs, and suggestions for all its projects. During UAT we ran the test cases and logged the results in this issue-tracking tool; the graph below summarizes the test results.

Target Deployment Diagram

Summary

With the implementation of DW 2.0, we have been able to provide LaQuinta business users with in-depth data analysis and projections, while bringing additional business components into the data warehouse model. We were also able to streamline the existing version to meet data warehousing standards. Along with the above, we delivered performance enhancements across the complete data warehouse model as a value-add, and fixed a few known issues and bugs.

Future Work

Once DW 2.0 is released into production, we will move into the production support phase, in which the entire data warehouse environment will be monitored closely until the system stabilizes and any issues that crop up during the initial stages will be fixed. Once the data warehouse system is stable, we will begin the next level of enhancements as part of DW 2.1.

Thank You