1 TCOM 541 Session 6

2 Network Design Process
–Define problem
–Collect data
–Refine data
–Choose/build/run tool
–Examine results
Each stage influences both the next and the previous stages
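The feedback loop above can be sketched as a simple driver. The stage functions here are hypothetical stand-ins (not course material); only the loop structure, with results feeding back into the problem definition, reflects the process on the slide.

```python
# Hypothetical stand-ins for the real stages; only the loop structure
# reflects the design process described above.

def collect_data(problem):
    return dict(problem)                      # gather raw inputs

def refine_data(data):
    return {k: v for k, v in data.items() if v is not None}  # drop bad records

def run_tool(model):
    # a pretend design tool: cost falls as the problem is refined
    return {"cost": 100.0 / (1 + model.get("refinements", 0))}

def examine_results(design, target_cost):
    return design["cost"] <= target_cost      # sanity check

def design_network(problem, target_cost, max_iterations=5):
    design = None
    for _ in range(max_iterations):
        model = refine_data(collect_data(problem))
        design = run_tool(model)
        if examine_results(design, target_cost):
            break
        # results feed back into the problem definition
        problem["refinements"] = problem.get("refinements", 0) + 1
    return design
```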

3 Define Problem
What are we trying to do?
–Build a network from scratch?
–Replace an existing network?
–Expand an existing network?
–Merge two networks?
–Solve a reliability/performance problem?
–Implement a new technology?
–Reduce costs?
–Etc.

4 Collect Data
What types of data are available? How complete is the data? How reliable is the data?
Very often, discoveries made at this stage will lead to a redefinition of the problem

5 Refine Data
Check to identify and eliminate bad data
–E.g., circuits that go nowhere, circuits that are suspiciously cheap or expensive, ...
Fill in gaps
–Estimation, modeling, or guesswork ...
Put into the form required by the model
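The checks above can be illustrated with a small filter: flag circuits that go nowhere (a missing endpoint) and circuits whose cost is suspiciously far from the sample median. The record fields and the 3x-median threshold are illustrative assumptions, not part of the slide.

```python
from statistics import median

def refine_circuits(circuits, factor=3.0):
    """Split circuit records into (clean, suspect) lists.

    A record is suspect if an endpoint is missing ("goes nowhere") or
    its cost is more than `factor` times the median, or less than the
    median divided by `factor`.
    """
    costs = [c["cost"] for c in circuits if c.get("cost")]
    med = median(costs)
    clean, suspect = [], []
    for c in circuits:
        goes_nowhere = not c.get("a_end") or not c.get("z_end")
        odd_cost = (not c.get("cost")
                    or c["cost"] > factor * med
                    or c["cost"] < med / factor)
        (suspect if goes_nowhere or odd_cost else clean).append(c)
    return clean, suspect
```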

6 Choose/Build/Run Tool
The choice of design tool is heavily influenced by the type of problem and the availability of data
Only the simplest problems are solvable without automated tools
The user is, to a large extent, at the mercy of the algorithms built into the tool

7 Examine Results
ALWAYS examine the design outputs carefully
–“Sanity checks”
–Global statistics (hops, latency, cost distribution, ...)
–Look for anomalies
–Vary design parameters
Repeat the process until satisfied ... or until time/budget runs out
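As a sketch of the "global statistics" sanity check, the routine below summarizes hop counts and costs over a design output so anomalies stand out. The route record fields are assumptions for illustration.

```python
# Illustrative sanity checks on a design output: global statistics over
# the routes a design tool produced. Record fields are assumed, not real.

def sanity_stats(routes):
    """Summarize hop counts and costs to help spot anomalies."""
    hops = [len(r["path"]) - 1 for r in routes]   # edges per route
    costs = [r["cost"] for r in routes]
    return {
        "max_hops": max(hops),
        "mean_hops": sum(hops) / len(hops),
        "total_cost": sum(costs),
        "max_cost": max(costs),
    }
```

A route with an outlying `max_hops` or `max_cost` is a candidate for closer inspection before accepting the design.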

8 Acquisition vs. Design
At present, users are often more concerned with acquiring services (e.g., VPN) than with designing a network as such
This presents a completely different set of problems and opportunities
–Essentially, the vendors do the design, and the user picks the best

9 Acquisition vs. Design (2)
But data is still essential
–Accurate information to vendors
–Accurate projection of costs to support evaluation
Managing the competitive process is critical
–Private users have much more freedom than the Government – Federal Acquisition Regulations are restrictive

10 Acquisition Case Study
Back in the 1970s, the General Services Administration ran what was essentially a private phone company, called the Federal Telephone Service (FTS)
–Long-distance voice only
–On-net only
–Leased lines, switches, ...
–Not very efficient
Cost $0.27 to $0.40/minute, depending on how the accounting was done – about 30% to 100% more than commercial rates

11 FTS2000
FTS was replaced in 1988 by FTS2000
–Services-oriented 10-year contract with significant volume banding (i.e., lower prices at higher volumes)
–Mandatory use by agencies
–Voice, packet-switched data, dedicated circuits
–Two vendors, mandated 60/40 split
–Two internal recompetitions at years 4 and 7
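Volume banding, as parenthetically defined above, can be sketched as a step function: the per-minute rate drops once monthly volume crosses a band boundary. The band thresholds and rates below are invented numbers, not FTS2000 prices.

```python
# A hedged sketch of "volume banding": lower unit prices at higher
# monthly volumes. Band boundaries and rates are invented examples.

BANDS = [              # (minimum minutes/month, price per minute in cents)
    (0,           25.0),
    (100_000_000, 20.0),
    (250_000_000, 16.0),
]

def banded_price(minutes):
    """Return the per-minute price (cents) for a given monthly volume."""
    price = BANDS[0][1]
    for threshold, rate in BANDS:
        if minutes >= threshold:
            price = rate    # bands are ascending, so the last match wins
    return price
```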

12 FTS2000 – Scope
[Diagram: user service delivery path – Customer Premises Equipment (GSA and Agency), Local Exchange Carriers, Inter-Exchange Carrier (FTS2000 Network A / Network B), and User Service Delivery Points]
–375 million minutes/month
–$44.0 million/month
–1.7 million users
–4,200 locations

13 FTS2000 Price History
[Chart: switched voice price (cents per minute) by year, 1989–1995, showing negotiated price decreases from FTS through the FTS2000 cutover (begin and complete) and price redetermination]
Current FTS2000 price is 1.5 cents/minute less than the best commercial price

14 FTS2000 Replacement
Strategy development started in 1994 for a planned award in 1998 – called FTS2001
The situation had changed:
–No mandatory use
–Technology advances
–Deregulation/local competition

15 FTS Program
Objectives
–Provide high-quality telecommunications services that meet users’ needs
–Leverage the large volumes of Government traffic to obtain the best prices
Characteristics
–Flexibility as a means to deal with uncertainty (technology, market, regulatory, requirements)
–Maximize competition
–Agency choice
–Market-oriented
–Rely on the private sector
–Use commercial best practices

16 Strategic Alternatives
[Decision tree: Government-wide approach? No – individual agency acquisitions (Alternative 8). Yes – agencies agree on a coordinated approach that is comprehensive or partitioned by service and/or span]
–Continue current comprehensive contracts (Alternative 1)
–Integration contractor (Alternative 2)
–Span-specific contracts (Alternative 3)
–Regional comprehensive contracts (Alternative 4)
–Integrated business process solutions (Alternative 5)
–Service-specific contracts (Alternative 6)
–Service/span-specific contracts (Alternative 7)
–Individual agency acquisitions (Alternative 8)

17 Strategy Choice
–High-level agency (customer) working group
–Decision support tools (Analytic Hierarchy Process)
–Inputs from oversight bodies
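The Analytic Hierarchy Process mentioned above derives criterion weights from a matrix of pairwise importance judgments, via the matrix's principal eigenvector. The sketch below uses power iteration; the example judgments are invented for illustration and are not GSA's actual inputs.

```python
# Minimal AHP weight calculation: principal eigenvector of a pairwise
# comparison matrix, found by power iteration. Judgments are invented.

def ahp_weights(matrix, iterations=100):
    """Return criterion weights (summing to 1) for a positive
    reciprocal pairwise comparison matrix."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iterations):
        w_new = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(w_new)
        w = [x / total for x in w_new]   # renormalize each step
    return w

# Example: price judged 3x as important as flexibility and 5x as
# important as past performance; reciprocals fill the lower triangle.
pairwise = [
    [1,     3,   5],
    [1 / 3, 1,   2],
    [1 / 5, 1 / 2, 1],
]
```

Running `ahp_weights(pairwise)` yields weights that sum to 1, ordered by the stated importance judgments.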

18 Strategy Choice (2)
Comprehensive contracts
–1 or 2 contracts, 8 years
–Expanded technology suite
–Internal recompetition
–Provision for local access competition
–Need for significant commitment: $750M minimum revenue guarantees, agency commitment to support

19 Pricing Structure – Objectives
The pricing structure must support the objectives:
–Award two or three contracts with almost equal prices over the likely range of traffic (e.g., 20% to 60%)
–Don’t leave money on the table – want total FTS2001 cost close to the lowest possible
–Facilitate internal competition

20 Pricing Structure – Problems
Whatever range of traffic volumes is used for evaluation, offerors will likely bid at least these three price break points:
–Just below the lower limit
–Just below the upper limit
–Just below 100%
They will probably bid more breaks
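The break-point behavior above can be sketched as a stepped price schedule evaluated at different traffic shares. The breaks here sit just below hypothetical 20%/60% evaluation limits and just below 100%; all rates are invented.

```python
# Sketch of an offeror's stepped bid: price breaks just below the
# evaluation limits (20%, 60%) and just below 100%. Rates are invented.

def price_at_share(schedule, share):
    """Return the cents/minute rate for a given traffic share.

    schedule: list of (max_share, rate) steps in ascending order.
    """
    for max_share, rate in schedule:
        if share <= max_share:
            return rate
    return schedule[-1][1]

offeror = [
    (0.199, 12.0),   # break just below the 20% lower limit
    (0.599,  9.0),   # break just below the 60% upper limit
    (0.999,  7.5),   # break just below 100%
    (1.0,    6.0),   # winner-take-all price
]
```

Evaluating only inside the 20%–60% range never sees the lowest rate, which is why the 100% point matters in the slides that follow.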

21 Pricing Structure – Problems (2)
Two or three awards will probably cost more than a single award in the short term
–Second and third vendors for any service will likely raise prices
–Declining price-volume curves
–FTS2000 experience: the initial 2-vendor award cost ~15% more over the first 4 years; break-even was not achieved until Year 7

22 Pricing Structure – Problems (3)
The effect is worse when 100% prices are significantly lower than 20%–60% prices
A possible exception is if vendors have significantly different prices for high-volume services
–Effective candidates are limited to SVS, 800/900, and DTS

23 Pricing Structure – Example of Possible Bids
[Chart: price vs. volume curves for Offeror A and Offeror B, with the evaluation range marked around 50% and the 100% point beyond it]

24 Pricing Structure – Example of Resulting Problems
[Chart: the same price-volume curves, annotated to show that two awards within the evaluation range cost more than a single award at 100% – a premium paid for two contractors]
A is better within the evaluation range, but B is better at 100%

25 Pricing Structure – The 100% Problem
If 100% is not evaluated, GSA leaves an unknown amount of money on the table
–And may inadvertently award 100% to a higher bidder
–Someone will work out how much and publicize it
If 100% is evaluated, offerors will most likely structure their bids to drive to a winner-take-all outcome
–Excess costs cannot be recouped by later internal competition
–May actually result in significant savings to the Government over the first 4 years

26 Pricing Structure – Summary
As initially constrained with price-volume banding, GSA could not win

27 Pricing Structure – Partial Solution
Change the constraints: flatten prices by eliminating or severely restricting volume discounts
–Reduces gaming by offerors
–Eliminates government volume-band chasing after award
–Conforms more closely to commercial practice
–Simplifies pricing and billing

28 Still a Problem
Eliminating volume bands removes the 100% problem, but it does not move closer to ensuring two providers with low and nearly equal prices

29 Complete Solution
Move to a two-stage award process:
–Initial award for a nominal (but not guaranteed) 50% of the network
–Publish prices at the aggregate level
–Allow offerors to bid new prices for the remaining nominal 50%
–Structure this as a contract modification for the winner of round one – these prices apply to its initial 50% whether or not it wins the additional 50%

30 Award Process
[Flowchart: receive proposals → initial evaluation, establish competitive range → negotiations and resubmission → evaluate resubmission, redetermine competitive range → request and evaluate BAFO (repeated as needed) → make award(s), publish prices; if only one award, request and evaluate APO (repeated as needed) → make award]

31 Results
Sprint won Round One with very low prices
–Chose not to bid for the second 50%
WorldCom won the second 50% with prices a few percent lower than Sprint’s
Objectives achieved
