

Baseline Data (Measure) - Kaizen Facilitation

Objectives
- Define data types and purpose
- Explain concepts of efficiency and effectiveness
- Provide tips on establishing baseline metrics
- Understand importance of data collection
- Review elements for data planning and usage

Types of Data
Definition: a metric obtained by observing a population, product, process, or service.
- Qualitative: subjective descriptions, e.g. "the job is expensive and takes too long"
- Quantitative:
  - Attribute data: categories, Yes/No, Pass/Fail, Machine 1 vs. Machine 2, etc.
  - Continuous data: time, temperature, weight, pressure, etc.
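The same observations can be recorded either way. Below is a minimal sketch, with assumed cycle-time values and an assumed 30-minute limit (neither comes from the slides), showing how continuous data keeps information that attribute data discards.

```python
CYCLE_TIME_LIMIT_MIN = 30  # assumed customer requirement, for illustration only

# Continuous data: the measured cycle time of each job, in minutes.
cycle_times_min = [22.5, 31.0, 28.4, 35.2, 27.1]

# Attribute data: the same jobs reduced to a pass/fail category.
pass_fail = ["pass" if t <= CYCLE_TIME_LIMIT_MIN else "fail" for t in cycle_times_min]

print(pass_fail)                                    # ['pass', 'fail', 'pass', 'fail', 'pass']
print(sum(cycle_times_min) / len(cycle_times_min))  # 28.84 - the average the attribute view loses
```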

Purpose of Data Collection
The "generic" purpose of all data (metrics) is to help make better decisions.
A metric is an input to a decision system which has the following elements:
- Who looks at it?
- When, where, and how do they look at it?
- What do they compare it to?
- What are their choices?
- What actions are taken?
Better data = better decisions.

System Feedback
How do you know if your system is working right? All processes exist to fulfill needs; your system is working 'right' if it fulfills those needs.
Who defines these needs?
- The customer (recipient of good or service): typically associated with the output of the process
- The business (provider of good or service): typically associated with the process itself

Establishing Metrics
For each product or process requirement, there should be one or more objective measures.
These measures (metrics) allow us to verify that requirements have been met through the general relationship:
Measured Performance - Required Performance = Delta (Variance)
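A minimal sketch of that relationship, using assumed numbers (a 5-day lead-time requirement and a 6.5-day measured baseline, not figures from the slides):

```python
required_lead_time_days = 5.0   # requirement (the VOC/VOB target), assumed for illustration
measured_lead_time_days = 6.5   # observed baseline performance, assumed for illustration

delta = measured_lead_time_days - required_lead_time_days
print(f"Variance from requirement: {delta:+.1f} days")  # +1.5 days: requirement not met
```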

Sources of Process Feedback
A typical organization has two main sources:
- Metrics: quantitative performance indicators are compared to defined requirements. If these indicators do not meet or exceed the requirements, then a "problem" is said to exist.
- Squeaky wheel: anecdotal and experiential feedback about the process indicates the potential for a "problem". Squeaky wheels can be a trigger for the creation of a metric.

Flow versus Performance
Process mapping helps us understand process flow. Now we must collect data to evaluate and measure performance more objectively.
To be successful, we must meet both Customer and Business requirements:
- Business (VOB): efficiency measures
- Customer (VOC): effectiveness measures
Measurement systems incorporate both.

Business Feedback (VOB): Efficiency
To remain profitable, a business must deliver goods and services at a cost lower than their price.
Business systems should operate at:
- High speed
- Low cost
- Minimum resources (task sequence, strain, waste)
Voice of the Business (VOB) measures are "efficiency" measures.

What to Measure? (VOB): Efficiency
Start with outputs, at the highest level. Look for:
- Quality: defects, yield, rework (overall and by process step)
- Cycle time: lead time (overall and by process step)
- Cost / bottlenecks (overall and by process step)

VOB Metrics (How Capable Is the Process?)
- Quality: does the product or service meet the customer requirements (specifications)?
- Cycle time: how much time do various steps in the process take? Are there delays in some steps?
- Bottlenecks: what types of bottlenecks are you seeing? How frequently? How long is the delay?
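As a rough illustration, the sketch below computes lead time and yield, overall and by process step, from hypothetical step data; the step names, times, and counts are assumptions, not part of the training material.

```python
steps = [
    # (step name, average cycle time in minutes, units in, defect-free units out)
    ("Receive order",    10, 200, 198),
    ("Assemble",         45, 198, 180),
    ("Inspect and ship", 15, 180, 177),
]

total_lead_time = sum(minutes for _, minutes, _, _ in steps)
rolled_yield = 1.0
for name, minutes, units_in, units_good in steps:
    step_yield = units_good / units_in
    rolled_yield *= step_yield
    print(f"{name:16s} cycle time {minutes:3d} min, first-pass yield {step_yield:.1%}")

print(f"Overall: lead time {total_lead_time} min, rolled throughput yield {rolled_yield:.1%}")
```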

Customer Feedback (VOC): Effectiveness
The goods and services that your customer receives are outputs of your process.
Customers want goods and services:
- On time
- With the highest quality
- At an excellent value (competitive price, etc.)
Voice of the Customer (VOC) measures are "effectiveness" measures.

What to Measure? (VOC): Effectiveness
- Unacceptable product or service
- Customer complaints
- High warranty costs
- Decreased market share
- Backlog
- Redoing completed work (cost)
- Late output
- Incomplete output (yield)

VOC Metrics (How Satisfied Is the Customer?)
- On-time: how many of your products or services, out of the total made or delivered, meet the customer's delivery requirements?
- Quality: does the product or service meet the customer requirements?
- Cost: how much does it cost to produce the product or service, and how does that cost compare to your competitors' costs?
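A minimal sketch of the on-time measure, counting delivered orders against their promised dates; the dates and the 95% target are assumptions for illustration.

```python
from datetime import date

orders = [  # (promised date, actual delivery date) - assumed example records
    (date(2024, 3, 1), date(2024, 3, 1)),
    (date(2024, 3, 2), date(2024, 3, 4)),
    (date(2024, 3, 5), date(2024, 3, 5)),
    (date(2024, 3, 6), date(2024, 3, 6)),
]

on_time = sum(1 for promised, actual in orders if actual <= promised)
on_time_rate = on_time / len(orders)
print(f"On-time delivery: {on_time_rate:.0%} (assumed target: 95%)")  # 75% here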

Elements of Data Collection
Data collection is the process of gathering the information you need to be able to make a better decision.
Its elements are Plan, Collection, and Usage, covered in the slides that follow.

Planning: Clarifying Goals
- Decide why you are collecting the data
- Determine factors that could cause the measurement of an item to vary
- Find ways to reduce the impact of those factors
- Decide how the data will help you
- Decide what you will do with the data once you have it

Planning: Consistency and Stability
- Know your process
- Decide what data you need to collect: fresh (current) data or historical (past) data
- Develop an "operational definition" of the metric
- Test your data collection forms
- Make it easy to collect data
- Communicate the what and the why to the data collectors and process participants

Usage: Operational Procedures
- Decide what you are trying to evaluate
- Decide how you will attach a value to what you are trying to measure
- Decide if you need to collect new data; if so, decide how you will collect it
- Decide how you will record the data
- Determine the period of time you will study
- Estimate how many observations you will need (see the sizing sketch after this list)
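One common way to rough-size the observation count for a continuous metric is the standard margin-of-error formula n = (z * sigma / E)^2; the sketch below uses an assumed 95% confidence level, an assumed standard deviation, and an assumed margin of error.

```python
import math

z = 1.96                # z-score for roughly 95% confidence (assumed confidence level)
sigma_guess = 4.0       # guessed standard deviation of the metric, e.g. minutes (assumption)
margin_of_error = 1.0   # how tightly we want to pin down the average (assumption)

n = math.ceil((z * sigma_guess / margin_of_error) ** 2)
print(f"Collect at least {n} observations")  # 62 with these assumptions
```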

Usage: Oversight Required
- Train everyone who will be collecting data
- Make data collection procedures error-proof
- Be there in the beginning to oversee data collection
- Confirm understanding of operational definitions
- Check to make sure data measurements are stable and that the data looks 'reasonable'

Collection: Sources
- Check sheets: easy to use; aggregate data (see the sketch after this list)
- Logs: every item vs. selected ones; link data to a specific record
- Travelers: sheets that move with the product or file
- Database: data entered as part of the workflow; mined with queries and reports
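A check sheet can be as simple as a tally of categories. A minimal sketch, with hypothetical defect categories and counts:

```python
from collections import Counter

# Hypothetical observations recorded on a check sheet during one day.
observations = ["late", "rework", "late", "missing info", "late", "rework"]
check_sheet = Counter(observations)

for category, count in check_sheet.most_common():
    print(f"{category:12s} {'|' * count}  ({count})")
```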

Collection: How Much?
Too little data:
- Bad decision making
- Extends project timeline
- Extra work to collect additional data
Too much data:
- Wasted time collecting data
- Frustrated team members
- Wasted time spent managing data
- Important information hidden in piles of data

Types (Pros and Cons)
Attribute data:
- Few tools for analysis
- Need more data
Continuous data:
- Gain more insight into the behavior of the process
- Can use many tools for analysis
- Need less data
Whenever possible, it's generally better to use continuous data.

Types (Safety Example)
Data divides into continuous and attribute types:
- Continuous (measured): time of incident
- Attribute:
  - Binary: gender
  - Categorical: type of incident
  - Ordinal: severity rating
  - Count: number of LTIs (can be sub-divided)

Sampling
Collecting data on a small subset of a population to make inferences about the characteristics of the whole population.
Benefits:
- Lower cost
- Saves time
Drawbacks:
- Possibility of errors
Sampling uses measurements of the subset to predict the characteristics of the whole population (a sketch of this follows).
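A minimal sketch of that idea: estimate a population average from a random subset. The population here is simulated and the sizes are arbitrary assumptions.

```python
import random
import statistics

random.seed(1)
population = [random.gauss(25.0, 4.0) for _ in range(5000)]  # e.g. cycle times in minutes (simulated)

sample = random.sample(population, 50)  # measure only 50 of the 5000 items

print(f"Sample mean:     {statistics.mean(sample):.2f}")
print(f"Population mean: {statistics.mean(population):.2f}")  # close to the sample estimate, at far lower cost
```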

Integrity (Some Things to Think About)
- Is the recorded data what we meant to record? Does it contain the information that was intended?
- Does the measure discriminate between items that are different?
- Does it reliably predict future performance?
- Does it agree with other measures designed to get at the same thing?
- Is the measure stable over time?
- Is the data correct in the system?
Evaluate what's right and wrong with your process.

Tips
The way baseline data is collected can have an adverse effect on the outcome of the project.
A data collection plan is an organized, written strategy for gathering information for your project. A good plan ensures that:
- Data are reliable every time
- Only relevant data are collected
- All necessary data (VOB, VOC) are collected
- Resources are used effectively
The data collection form is a very important check sheet for your project (a sketch of a simple plan follows).
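One way to make the plan concrete is a simple table with one row per metric. The field names and example values below are assumptions assembled from the planning and usage steps above, not a prescribed template.

```python
data_collection_plan = [
    {
        "metric": "Order lead time",
        "operational_definition": "Calendar days from order receipt to shipment",
        "data_type": "continuous",
        "source": "order database",           # see Collection: Sources
        "collector": "assigned team member",
        "when": "daily, for four weeks",
        "planned_observations": 62,            # from the sizing sketch earlier
    },
]

for row in data_collection_plan:
    print(", ".join(f"{field}={value}" for field, value in row.items()))
```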

Review
- Don't assume people know how to collect data; take a careful, methodical approach
- Develop a systematic method to collect, review, and analyze the data
- Chart the data and analyze it for trends (see the sketch after this list)
- Ask: do the results pass the common-sense test?
- Monitor the process using your metrics and see how trends project into the future
- Share the results with the appropriate personnel
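A minimal sketch of charting the data and eyeballing the trend against a target; the weekly averages and the 30-minute target are assumed values.

```python
weekly_avg_cycle_time = [34.1, 33.5, 35.0, 32.8, 31.9, 31.2]  # minutes, by week (assumed)
target = 30.0                                                 # assumed requirement

for week, value in enumerate(weekly_avg_cycle_time, start=1):
    bar = "#" * int(value)
    flag = "" if value <= target else "  <- above target"
    print(f"Week {week}: {value:5.1f} {bar}{flag}")
```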