
Introducing Software Metrics via Benchmarking – Paul Goodman


1 Paul Goodman (Paul.Goodman@metagroup.com)
Introducing Software Metrics via Benchmarking

2 Topics for Discussion
- Metrics programme risks and reasons for failure
- Benchmarking as a means of mitigating these risks
- Moving beyond benchmarking

3 Introducing Metrics – “A Series of Unfortunate Events” (Lemony Snicket)
Introducing a metrics programme to an organisation is a risky business.
“80% of software measurement initiatives fail” (Howard Rubin – META Group)
Failure is defined as (within 2 years):
- The initiative is formally cancelled!
- Resources are quietly reassigned
- The work continues in some way but the results are ignored within the organisation

4 Key Reasons for Metrics Programme Failure
- Expectations are not managed!
- Data becomes confused with information – of which there is a serious lack!
- We build an ivory tower – we don’t involve the senior or project managers (the customers of the measurement initiative)!
- The devil is in the detail – or how many Function Points can we get on the head of a pin!
- Senior management commitment is neither gained nor maintained!

5 Some Good News
- Over the last four years I have seen more successful measurement initiatives than in any previous period (and I have been in this game for a long time)
- These have included measurement initiatives in:
  - Financial organisations
  - Local and central government
  - Telecoms
  - Manufacturing

6 Why the Change?
- My observation is that two main factors contribute to this greater success:
  - A narrower focus – don’t try to do too much, too quickly
  - Client-side “pull” for benchmarking in its various forms, primarily:
    - Traditional comparative benchmarking
    - Value-for-money and viability assessments of proposals
    - Outsource contract management
As an industry we could be approaching a 50/50 measurement initiative success rate (so it is still risky!)

7 Measurement via Benchmarking
- What follows is not just a sales pitch for META Group benchmarking – honest!
- Benchmarking brings its own issues and pains, but also some significant benefits
First point to note:
- Benchmarking is based on the provision of metrics data
- I want to consider the move to the point where that data collection becomes inherently beneficial to the organisations providing it

8 Can Benchmarking Address the Causes of Failure? (taking them in reverse order)

9 Management Commitment
Facts of life:
- Benchmarking costs money! (ISBSG is a much cheaper alternative, but limited in scope)
- Spending money involves management sign-off – often at a very senior level
- Involving a third party (the “Benchmarker”) has political implications (non-disclosure agreements at the very least)
- First delivery of results is always mandated to be within months, not years – i.e. within the 2-year “management window”

10 Management Commitment
All of this generates:
- Visibility of the benchmarking initiative within the organisation
- (Senior) management involvement during benchmark procurement
- Focus on results rather than process (not good for the long term, but we start from where we are)
Which in turn leads to management commitment to the benchmark

11 Devils and Detail – Short-Circuiting the “Metrics Debates”
- Benchmarkers rely on comparability of data across organisations
- Benchmarkers use standards wherever possible (IFPUG V4.2, Mark II, CMMI, etc.; see the sketch below)
- Base metric definitions are provided to the client as part of the deal – these are not debatable, as comparability would be lost (although support for mapping client definitions to the benchmark definitions should be provided)
- Benchmarkers should be able to support (argue for) their definitions – i.e. they should be workable!
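
To make the role of fixed base-metric definitions concrete, a minimal sketch in Python follows, showing an IFPUG-style Unadjusted Function Point count using the standard complexity weights. The example component counts are invented for illustration; a real count follows the full IFPUG V4.2 counting rules rather than this simplification.

# Illustrative sketch: IFPUG-style Unadjusted Function Point (UFP) count.
# Standard IFPUG complexity weights; the example counts are invented.
WEIGHTS = {
    "EI":  {"low": 3, "average": 4, "high": 6},    # External Inputs
    "EO":  {"low": 4, "average": 5, "high": 7},    # External Outputs
    "EQ":  {"low": 3, "average": 4, "high": 6},    # External Inquiries
    "ILF": {"low": 7, "average": 10, "high": 15},  # Internal Logical Files
    "EIF": {"low": 5, "average": 7, "high": 10},   # External Interface Files
}

def unadjusted_fp(counts):
    """counts maps (function_type, complexity) -> number of components."""
    return sum(WEIGHTS[ftype][cplx] * n for (ftype, cplx), n in counts.items())

# Hypothetical project: 12 simple inputs, 8 average outputs, etc.
example = {
    ("EI", "low"): 12,
    ("EO", "average"): 8,
    ("EQ", "low"): 5,
    ("ILF", "average"): 3,
    ("EIF", "low"): 2,
}
print(unadjusted_fp(example))  # 12*3 + 8*5 + 5*3 + 3*10 + 2*5 = 131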

12 Devils and Detail – Short-Circuiting the “Metrics Debates”: Downsides
- Benchmarkers cannot be too innovative within their benchmarks – e.g. there is currently little benchmark data based on COSMIC
- Some of the hiding places are suddenly not there (you mean you don’t know how many defects were reported during warranty?)
- This can lead to overt or covert resistance to the benchmark, which needs to be managed

13 Avoiding the Ivory Tower
The benchmark depends on a benchmarking model and two data sources.
The model:
- It may be defined within the benchmark offering as non-negotiable – management will have been involved in accepting this during the sales negotiation
- Alternatively, definition of the model may be part of the benchmarking process itself (as in the META ADM Benchmark), involving client representatives

14 Avoiding the Ivory Tower
The data:
- The benchmark database from the benchmark supplier
- Project and application support data from the client
- Typically the latter is less readily available than the client may have believed
- This often leads to a recognition that basic management information is lacking
- It usually needs the involvement of the client’s project managers to get the data into the benchmark

15 Avoiding the Ivory Tower
- It is common practice to establish a Benchmarking Steering Group, which:
  - Meets regularly under defined Terms of Reference
  - Acts as a measurement co-ordination group for the benchmark
  - Will become a driving force for measurement within the client organisation
- The Benchmarker’s process should be well proven with other client reference sites

16 Avoiding the Ivory Tower
- The benchmarking process should incorporate the collection of contextual project information
Avoidance of ivory towers is achieved through:
- Involvement of the client in the process
- The use of a tried and tested process
- The Benchmarker’s experience of applied measurement in the real world

17 From Data to Information
- The primary focus of the benchmarking initiative is the resulting report to management
- This report communicates not simply the raw data but meaningful management information (see the sketch below), e.g.:
  - Comparative costs
  - Productivity
  - Time to market
  - Delivered quality
  - Process maturity
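
As an illustration of the step from raw data to management information, the sketch below derives some commonly reported indicators from one project's base data. The field names and figures are assumptions for illustration, not META Group's benchmarking model.

# Illustrative derivation of management indicators from raw project data.
# Field names and figures are hypothetical, not a real benchmark model.
def derive_indicators(project):
    fp = project["function_points"]
    return {
        "cost_per_fp": project["total_cost"] / fp,           # comparative cost
        "fp_per_person_month": fp / project["effort_pm"],    # productivity
        "defects_per_fp": project["warranty_defects"] / fp,  # delivered quality
        "elapsed_months": project["elapsed_months"],         # time to market
    }

raw = {  # hypothetical raw data for one project
    "function_points": 450,
    "total_cost": 540_000,   # currency units
    "effort_pm": 60,         # person-months of effort
    "warranty_defects": 27,  # defects reported during warranty
    "elapsed_months": 9,     # start to delivery
}
print(derive_indicators(raw))
# {'cost_per_fp': 1200.0, 'fp_per_person_month': 7.5,
#  'defects_per_fp': 0.06, 'elapsed_months': 9}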

18 Managed Expectations
- It is a fundamental rule that the Benchmarker should manage the expectations of the client with respect to:
  - The scope and limitations of the benchmarking model
  - The use of a transparent process
  - Involvement of the relevant stakeholders

19 All Is Not Perfect
- During the first benchmarking cycle, issues will arise. These are likely to include:
  - Peer group identification making the client think about what they do (see the sketch below): How is work categorised? How is this work managed?
  - The client realising that some basic data is not as readily available as initially assumed
  - The growing recognition on the part of the client that much of the required data would add value if available on a more regular basis
The latter point can lead to a more general measurement programme requirement!
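
To make peer group identification concrete, the sketch below groups a portfolio by a few categorisation attributes (domain, platform, size band) so that each project is compared against genuinely similar work. The attributes and size bands are invented for illustration; a real benchmark defines its own peer-group criteria.

# Illustrative peer-group identification: group projects by categorisation
# attributes. The attributes and size bands are invented, not a real
# benchmark's criteria.
from collections import defaultdict

def size_band(function_points):
    if function_points < 100:
        return "small"
    if function_points < 500:
        return "medium"
    return "large"

def peer_groups(projects):
    groups = defaultdict(list)
    for p in projects:
        key = (p["domain"], p["platform"], size_band(p["function_points"]))
        groups[key].append(p["name"])
    return dict(groups)

portfolio = [  # hypothetical client portfolio
    {"name": "Billing v2", "domain": "finance", "platform": "mainframe", "function_points": 620},
    {"name": "Web Portal", "domain": "customer", "platform": "web", "function_points": 180},
    {"name": "GL Rewrite", "domain": "finance", "platform": "mainframe", "function_points": 540},
]
print(peer_groups(portfolio))
# {('finance', 'mainframe', 'large'): ['Billing v2', 'GL Rewrite'],
#  ('customer', 'web', 'medium'): ['Web Portal']}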

20 Programme Set-Up
- The client can mitigate some of these issues by:
  - Working to identify an agreed Application Portfolio (agreed by internal stakeholders, or by the client and supplier for outsourced services)
  - Establishing the set of “live” projects
  - Ensuring that data for recently closed projects (plan data as well as actuals) is retained and readily available (see the sketch below)
  - Establishing the concept of a warranty period for completed projects
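
As a concrete illustration of the record-keeping these set-up steps imply, here is a minimal sketch of a per-project record that retains plan and actual figures plus warranty defects. The field names are assumptions for illustration, not a prescribed schema.

# Minimal sketch of a project record that supports later benchmarking.
# Field names are illustrative assumptions, not a prescribed schema.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class ProjectRecord:
    name: str
    status: str                               # e.g. "live" or "closed"
    planned_effort_pm: float                  # plan data, person-months
    planned_end: date
    actual_effort_pm: Optional[float] = None  # actuals, filled at closure
    actual_end: Optional[date] = None
    function_points: Optional[int] = None     # sized at or after delivery
    warranty_end: Optional[date] = None       # defines the warranty period
    warranty_defects: int = 0                 # defects logged before warranty_end

portfolio = [
    ProjectRecord("Web Portal", "closed",
                  planned_effort_pm=24, planned_end=date(2004, 3, 31),
                  actual_effort_pm=29, actual_end=date(2004, 5, 14),
                  function_points=180,
                  warranty_end=date(2004, 8, 14), warranty_defects=11),
]
live_projects = [p.name for p in portfolio if p.status == "live"]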

21 Moving Beyond Benchmarking
- When the client and the Benchmarker work together to achieve a successful benchmark, this often generates recognition of the added value of metrics data
- The Benchmarker will often provide advice on widening the measurement initiative, even if they are not directly involved
- The Benchmark Steering Group can be a key catalyst in moving things forward
- The client comes to recognise that the measurement function needs resources

22 Moving Beyond Benchmarking
The Goal: that software metrics becomes a natural component of software development, support and enhancement activities within the organisation.
Benchmarking truly succeeds when it evolves to become the client’s measurement programme.

