Benchmarking in the Information Technology Age

Presentation transcript:

Benchmarking in the Information Technology Age Dave Hile Cherne Contracting Corporation CII Benchmarking & Metrics Committee 2000 CII Annual Conference Nashville, Tennessee

BM&M Committee

“Benchmarking is the practice of being humble enough to admit that someone else is better at something and wise enough to try and learn how to match and even surpass them at it.” Everyone has their own concept of what benchmarking is… when it comes down to it, it is a recognition that others may be better at certain tasks than we are, and a willingness to learn from them. This quote from APQC says it well. The American Productivity & Quality Center (APQC) is a world-renowned resource for process and performance improvement for organizations of all sizes across all industries. APQC boasts a distinguished list of achievements that includes providing private-sector input for the first White House Conference on Productivity, spearheading the creation and design of the Malcolm Baldrige National Quality Award in 1987, and jointly administering the award for its first three years. http://www.apqc.org/ APQC, 1998

CII BM&M Program 1999 — We listened, we focused, and we created the Vision. 2000 — We are developing, we are improving, we are delivering against the Vision. 2001 — We will have the most efficient, cost-effective, credible, and open benchmarking system available. Just a brief history of where we have been in the recent past: In 1999 we collected your inputs, we focused our efforts, and we laid the foundation for significant improvements in CII’s BM&M Program. With this input the improvements began, and I am here today to report that the program is more customer-focused, more responsive, and is using evolving information technologies to do its part in “Leading the Knowledge Revolution” in the engineering and construction industries. In 2000 we are already seeing the benefits of these efforts. Data is collected online using a web-based questionnaire so that you can input data when it’s available, from virtually any location. Accuracy is improved, as is cycle time from input to receipt of reports. In the products display room we are demonstrating a prototype web-based Key Report that permits users to score performance and practice use as the data is entered. When we field this report in the coming months, you will no longer have to submit your data and wait months to receive your feedback. Neither will you have to wait until the project is complete to get an assessment. You will be able to enter data during project execution and receive an indication of likely project outcomes while you are online. Many other automation and content improvements are being implemented as well. I will share some of these with you later in my presentation. We are realizing many of the improvements we projected in our work plan ahead of schedule. By 2001 the content and technology-enabled improvements will offer you the most efficient, cost-effective, credible, and open benchmarking system available: a system you can use for benchmarking essentially all of your projects, no matter what the size. In fact, we are now able to open the program to almost all of your projects… we announced at our July training session that we will accept projects as small as $250K. So where are we now?

CII BM&M Database More than 900 projects Worth approximately $50B International 226 Domestic 675 We have more than 900 projects in the database. These projects are worth approximately $50 billion.

BM&M Program Improvements Development & fielding of Web questionnaire I thought I would take a few minutes and highlight some of the more significant program improvements for you. Last fall we released our web questionnaire for data collection. This questionnaire has been very well received. Already this year we have provided a significant upgrade to the web questionnaire by adding new practices and incorporating the full PDRI. The web questionnaire offers many benefits over the paper version (next slide)

Web Questionnaire Enter project data: online during project execution with less effort with greater accuracy Data can now be entered online, during project execution, with less effort, and with greater accuracy. Questionnaires can be completed as an ongoing effort during project execution when the data is readily available. When the project is complete, only closeout data need be entered and the questionnaire can be submitted as part of your closeout procedures. Entering data as the project unfolds makes this a less onerous task than trying to assemble all the data from project files as the team is disbanding and moving on to new assignments.

Web Questionnaire Content Improvements: Online scoring of PDRI New practices Materials Management Planning for Startup Disputes Resolution Design Effectiveness The PDRIs for industrial and building projects, two of CII’s most popular tools, have been added to the questionnaire to assist in assessing the level of front end planning. Automated scoring of the PDRIs makes the tools more user-friendly and permits the establishment of norms for the scores. In addition, we have expanded the number of best practices surveyed. Materials Management and Planning for Startup best practices have been added, bringing the number of practices surveyed to eight. The Benchmarking Committee is working with the Knowledge Team to validate all best practices, providing feedback on the value of practice use. Two other best practices, Disputes Resolution and Improving Early Estimates, are being programmed at present and should be added to the questionnaire this fall.
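To illustrate the automated scoring mentioned above, here is a minimal sketch of how a PDRI total can be computed by summing the weight for each element at its selected definition level. The element names and weights are made-up placeholders, not the actual CII PDRI values, and the real tool covers many more elements.

```python
# Minimal sketch of automated PDRI scoring, assuming a weight table keyed by
# (element, definition level). Element names and weights are illustrative only.

# Hypothetical weight table: WEIGHTS[element][definition_level]
# Definition level 1 = fully defined ... 5 = incomplete definition.
WEIGHTS = {
    "Products":             {1: 1, 2: 14, 3: 27, 4: 40, 5: 56},
    "Capacities":           {1: 2, 2: 14, 3: 27, 4: 42, 5: 55},
    "Site Characteristics": {1: 1, 2: 7,  3: 14, 4: 21, 5: 29},
}

def pdri_score(ratings: dict) -> int:
    """Sum the weight for each element at its selected definition level.

    Elements rated 0 (not applicable) contribute nothing. A lower total
    indicates better front end definition.
    """
    return sum(
        WEIGHTS[element][level]
        for element, level in ratings.items()
        if level != 0
    )

# Example: a partially defined project scope
print(pdri_score({"Products": 2, "Capacities": 3, "Site Characteristics": 5}))  # 70
```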

Custom Key Reports New custom Key Reports with project results plotted on database charts There have been many automation improvements besides the questionnaire. Key Reports, the confidential reports returned to companies inputting projects, now contain graphical summaries of results, fulfilling a request of many users. These charts are provided in addition to the tabular summaries of previous years. Project and company data are plotted on charts produced from data of similar projects in the database, providing ready comparisons.
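As an illustration of the kind of chart a custom Key Report provides, the sketch below overlays a single project's result on the distribution of similar projects. The metric, data values, and plotting approach are assumptions made for the example, not the actual report implementation.

```python
# Illustrative sketch: overlay one project's cost growth on a histogram of
# similar projects from the database. Field names and data are hypothetical.
import matplotlib.pyplot as plt

similar_projects_cost_growth = [0.02, -0.01, 0.08, 0.05, 0.11, -0.03, 0.04, 0.06]
this_project_cost_growth = 0.03

plt.hist(similar_projects_cost_growth, bins=8, color="lightgray", edgecolor="black")
plt.axvline(this_project_cost_growth, color="red", linestyle="--", label="Your project")
plt.xlabel("Project cost growth")
plt.ylabel("Number of similar projects")
plt.title("Cost growth vs. similar projects in the database")
plt.legend()
plt.show()
```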

Web Key Reports Web-based Key Reports for real-time scoring of performance and practice use metrics Key Reports are also being moved to the web to provide near real-time feedback when data is keyed in. No longer will participants have to submit data and wait months to obtain results. Since data on practice use can be entered at any time during project execution, the online Key Report can be used as an indicator of expected outcomes based on planned practice use. Used in this manner, the web Key Report can serve as a tool to guide practice use and improve project performance during project execution, rather than merely report on performance months after completion.

Electronic Data Report User-friendly interactive electronic data report that produces charts and data tables on demand One of the more ambitious of our automation tasks has been the programming of the Data Report. The Data Report is a reference book providing performance and practice use norms for the entire database. Automation of analysis and chart generation has enabled us to publish many more findings than in previous years. The Data Report grew from approximately 275 pages last year to nearly 2,000 pages this year. Rather than attempting to publish an ever-increasing volume of findings, we have created a program that reads data from statistical summaries of the database and generates charts and tables, on demand, for any slice of data you choose. The user picks the desired data slice using a tree-like menu and the desired chart and data table are generated. This will enable you to obtain more apples-to-apples comparisons for your projects. It also provides the data in electronic format so you can use it in company analyses and presentations.
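The on-demand slicing described above can be pictured with a short sketch: filter the project records to the slice the user selects, then summarize the chosen metric for the chart or table. The column names and sample records are hypothetical, not the actual database schema.

```python
# Minimal sketch of on-demand slicing and summary generation over a flat table
# of project records. Column names and values are illustrative stand-ins.
import pandas as pd

projects = pd.DataFrame({
    "industry_group": ["Heavy Industrial", "Buildings", "Heavy Industrial", "Infrastructure"],
    "respondent":     ["Owner", "Contractor", "Contractor", "Owner"],
    "cost_growth":    [0.04, -0.02, 0.07, 0.01],
})

def data_slice_summary(df, **filters):
    """Filter to the requested slice and return summary statistics
    suitable for a chart or data table."""
    for column, value in filters.items():
        df = df[df[column] == value]
    return df["cost_growth"].describe()

# Example: the slice a user might pick from the tree-like menu
print(data_slice_summary(projects, industry_group="Heavy Industrial"))
```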

Electronic Data Report Built-in graphics engine produces charts and tables from the data file. Shown here is a sample schedule growth chart and table. The program will also let you plot your individual project performance versus the appropriate data slice you select, as shown here. Reports can be printed or captured for use in other presentations.

Trend Reporting Trend analysis charts for performance and practice use Another feature we added this year is trend analyses for performance and practice use. At present the trends are for CII aggregated metrics, but they will be expanded to trend metrics at the company level. The value of these reports at the company level will, of course, depend on the consistency and level of participation by individual companies.
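Conceptually, a trend chart of this kind is just a metric aggregated by year; a brief sketch under assumed field names follows.

```python
# Illustrative trend calculation: average a performance metric by completion
# year across the database. Column names and values are assumptions.
import pandas as pd

records = pd.DataFrame({
    "completion_year": [1997, 1998, 1998, 1999, 1999, 2000],
    "schedule_growth": [0.10, 0.08, 0.06, 0.05, 0.07, 0.04],
})

trend = records.groupby("completion_year")["schedule_growth"].mean()
print(trend)  # one aggregated value per year, ready to plot as a trend chart
```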

Value of Best Practices Improved value of best practice analysis Pre-Project Planning Constructability Change Mgmt. Design/Info Tech Team Building Safety (Zero Accidents) Project Complexity Project Cost Growth = 0.2334 - 0.0083 * Pre-Project Planning - 0.0040 * Constructability - 0.0020 * Change Mgmt. - 0.0081 * Design/Info Tech - 0.0070 * Team Building Practice - 0.0131 * Safety Practice + 0.0115 * Project Complexity Not all of the improvements have been in data collection and reporting. Techniques of analysis have also improved. The evaluation of the Value of Best Practices has been made much more rigorous by moving to multiple regression methods of analysis. Here, rather than evaluating each practice in isolation, all of the practices are placed into a single mathematical model, permitting assessment of the relative benefit of each practice. This helps us control for the overlap of benefits and better assess the actual impact of each practice.
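Reading the regression as a prediction formula, the sketch below applies the coefficients from the slide to a set of practice-use scores. The coefficients are as shown above; the example scores are invented and would need to be on whatever scale CII uses to quantify practice use, so this only demonstrates the arithmetic.

```python
# Sketch of evaluating the cost growth regression shown on the slide. The
# coefficients come from the slide; the example practice-use scores are made up.

COEFFICIENTS = {
    "intercept":             0.2334,
    "pre_project_planning": -0.0083,
    "constructability":     -0.0040,
    "change_management":    -0.0020,
    "design_info_tech":     -0.0081,
    "team_building":        -0.0070,
    "safety":               -0.0131,
    "project_complexity":    0.0115,
}

def predicted_cost_growth(scores: dict) -> float:
    """Intercept plus the sum of each coefficient times its input score."""
    return COEFFICIENTS["intercept"] + sum(
        COEFFICIENTS[name] * value for name, value in scores.items()
    )

example_scores = {
    "pre_project_planning": 8.0, "constructability": 7.0, "change_management": 6.0,
    "design_info_tech": 5.0, "team_building": 6.0, "safety": 9.0,
    "project_complexity": 3.0,
}
print(f"{predicted_cost_growth(example_scores):.3f}")  # negative = cost underrun
```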

Value of Best Practice Reports Relative and bottom-line benefits quantified for each industry group Pre-Project Planning $331K Constructability $336K Project Change Management $441K Design/Information Technology $366K Safety $531K Team Building $306K These analyses have provided the basis for improved reporting as well. Efforts have been made to make reports more meaningful by calculating expected savings for both owners and contractors for each industry group. Shown here are the savings that you could expect from greater use of these practices on the typical $50MM heavy industrial project. Your actual savings would depend on your degree of implementation of these practices and your specific project size. Respondent: Contractors; Industry: Heavy Industrial, $50MM Project
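As a rough back-of-envelope, a coefficient from the regression can be translated into dollars by multiplying an assumed improvement in practice use by the project cost. The published figures above come from CII's actual analysis of typical implementation levels; the sketch below only shows the shape of the calculation with an invented improvement.

```python
# Back-of-envelope sketch of translating a cost growth coefficient into dollar
# savings on a $50MM project. The one-point improvement in practice use is a
# hypothetical input, not the committee's actual calculation method.

project_cost = 50_000_000          # typical heavy industrial project ($50MM)
safety_coefficient = -0.0131       # cost growth change per unit of practice use
practice_use_improvement = 1.0     # hypothetical one-point increase in the index

expected_savings = -safety_coefficient * practice_use_improvement * project_cost
print(f"${expected_savings:,.0f}")  # roughly $655,000 for this hypothetical case
```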

Other Program Improvements Built the CII Benchmarking Web site: http://www.cii-benchmarking.org/ Established a path forward for new performance/productivity metrics. Conducted the first CII Benchmarking User’s Forum. Established an annual CII Benchmarking User Award. There are many other notable milestones that were achieved in the program this past year. Shown here are a few: The BM&M web site provides an excellent means of communicating Committee activities. It has been operational for over a year now and is used to keep benchmarking participants informed of upcoming activities. After much deliberation we have established a path forward on performance and productivity metrics: Following a presentation to the Board of Advisors in April, a focus group met in June and a workshop was held in July. Metric categories have been established and … In May of this year the BM&M Committee conducted its first annual Benchmarking Forum in Atlanta. Member companies joined together to discuss applications of benchmarking and to share success stories. This forum led to the fourth bullet shown here. The Committee established a Benchmarking User Award to be presented annually by the Benchmarking Committee to an owner and a contractor company.

Benchmarking User Award Criteria for award: Best application of benchmarking for project improvement Contributions to the BM&M program through active participation Criteria for the award include: Best application of Benchmarking for project improvement. This can be CII Benchmarking, internal benchmarking, or use of a 3rd party program. And contributions to the CII BM program through attendance at training & feedback sessions and the submission of projects for the database.

Benchmarking User Award Champion International Jacobs Engineering Recipients of this year’s award are shown here: Champion International (now International Paper) accepted the Owner’s Award in a ceremony in ???, in June of this year. The Contractor’s Award was presented to Jacobs Engineering in a ceremony in Austin, Texas, also in June. We have asked representatives of both companies to join us in our Implementation session this morning to make a short presentation on their use of benchmarking and to answer your questions.

BM&M Implementation Session Panel Dave Hile Cherne Contracting Moderator Chad Zollar International Paper (formerly Champion International) Bob Herrington Jacobs Engineering Steve Thomas CII The panel for our Implementation session is shown on this slide: I will serve as moderator for the session. Chad Zollar will make the Champion presentation for Ron Newberry, who recently retired. Many of you will remember the excellent presentation Ron made for Champion last year. Bob Herrington, who received the award for Jacobs, will present his company’s application of benchmarking. Steve Thomas, CII Associate Director for Benchmarking, will also participate on the panel to assist in answering your questions on CII benchmarking products and activities. NCC 204

“If you are not keeping score, you’re only practicing.” Why Get Involved? “If you are not keeping score, you’re only practicing.” Arthur M. Schneiderman Journal of Strategic Performance Measurement January 1999 I thought I would leave you with one last thought on the subject of benchmarking: “If you are not keeping score, you’re only practicing.” How many of you can afford to be practicing in today’s competitive climate? Please join us for our Implementation Session to see how you can learn from the best and adapt their practices for your company. Thank you and good morning.