How to identify best practices? – empirics and system development
SPIKE / Abelia Innovasjon theme conference, Klækken hotell, 26-27 Nov. 2003
Reidar Conradi, Dept. of Computer and Information Science (IDI), NTNU, NO-7491 Trondheim
(abelia-intro-26nov2003.ppt)

Motivation for software improvement (1)
- Large societal importance: software is ubiquitous, and society is vulnerable to its failures. We want software faster, better, cheaper, …
- The ICT sector is the second largest industry in Norway: 190 MNOK in annual revenues, … employees, 3.5% of GNP value creation (SSB, 2001).
- 2/3 of software is developed outside the traditional ICT industries (EU).
- … software developers (3%) in Norway – many with scant formal education in informatics.
- Ex. Norwegian failures: Skattedirektoratet's SCALA system, NSB's ticketing system, NTNU's payroll/personnel system, Rikstrygdeverket's TRESS, … But we only hear about the failures.
- Ex. problems in the USA – the US Standish "Chaos" report from 1995, cited in [PITAC99], on projects for tailored software:
  – 31% stopped before completion: an $81 billion loss per year (1% of GNP!)
  – 53% have serious overruns (189% on average): $59 billion per year

Motivation for software improvement (2)
Some current challenges:
– Web systems: time-to-market (TTM) vs. reliability?
– How do software systems evolve ("rot") over time, cf. Y2K?
– How to use COTS components? [Basili01]
– How to estimate software development?
– …
– What is empirically known about software technologies (techniques, methods, processes)?
– How to advise industry about software technologies, considering their context?
– How can SMEs carry out systematic improvement?
– How can we learn from each other – industry vs. research?
– How to perform valid software engineering research at a university – through student projects, and with industry serving as a laboratory?

Proposed "silver bullets" [Brooks87] (1)
What almost surely works:
- Software reuse/CBSE/COTS: yes!!
- Formal inspections: yes!!
- Systematic testing: yes!!
- Better documentation: yes!
- Versioning/SCM systems: yes!!
- OO/ADTs: yes?! – especially in domains like distributed systems and GUIs.
- High-level languages: yes! – but Fortran, Lisp, Prolog etc. are domain-specific.
- Bright, experienced, motivated, hard-working … developers: yes!!! – brain power.
- More powerful workstations: yes!! – computer power.

Proposed "silver bullets" [Brooks87] (2)
What probably works:
- Better education: hmm?
- UML: often? – but needs a tailored RUP and more powerful tools.
- Powerful, computer-assisted tools: partly?
- Incremental development, e.g. using XP: partly?
- A more "structured" process/project (model): probably? – if suited to the purpose.
- Software process improvement: in certain cases? – assumes stability.
- Structured programming: conflicting evidence wrt. maintenance?
- Formal specification/verification: does not scale up? – only for safety-critical systems.
We need further studies ("eating") of all these "puddings": what works, with what results, in what contexts – many challenges!

Empirical Software Engineering (ESE)
- Lack of formal validation in computer science / software engineering compared to other disciplines: [Tichy98] [Zelkowitz98]. (New) technologies are not properly validated: OO, UML, …
- Empirical / evidence-based software engineering since 1992: writings by Basili, [Rombach93], [Wohlin00], Juristo.
- Int'l Software Engineering Research Network (ISERN) from 1992; the ESERNET EU project (ended 2003).
- Software engineering group at NTNU since 1993, at UiO from 1997 – both with ESE emphasis.
- Software engineering group at Simula Research Laboratory from 2001: attn. Dag Sjøberg, in cooperation with NTNU, SINTEF et al.
- The SPIQ, PROFIT and SPIKE projects on empirical and practical SPI in Norway.

Software engineering characterization: why we need ESE
- SE is learnt by "doing", i.e. realistic projects in SE courses.
- Strong "soft" (human and organizational) factors.
- Problems in being more "scientific":
  – Most industrial SE projects are unique (goals, technology, people, …) – otherwise we would just copy existing software at marginal cost!
  – Fast change-rate between projects: goals, technology, people, process, company, … – i.e. no stability, meager baselines.
  – Also a fast change-rate inside projects: much improvisation, with theory serving only as a backdrop.
  – So there is never enough time to be "scientific" – with hypotheses, metrics, collected data, analysis, generalization, and actions.
- How can we overcome these obstacles and learn and improve systematically? – is ESE the answer?
- Tens of ("context") factors in software projects – how to show effect and causality? Realism vs. rigour?

Possible "context" factors/variables
Understanding a discipline means building models that can later be validated and refined – but there are many context factors:
- People factors: number of people, level of expertise, group organization, problem experience, process experience, …
- Problem factors: application domain, newness to state of the art, susceptibility to change, problem constraints, …
- Process factors: life cycle model, methods, techniques, tools, programming language, other notations, …
- Product factors: deliverables, system size, required qualities such as time-to-market, reliability, portability, …
- Resource factors: target and development machines, calendar time, budget, existing software, …
Example: 29 factors to predict software productivity [Walston77].
(From Basili's CMSC 735 course at Univ. Maryland, fall 1999.)

Ex. Estimation models, e.g. by Barry Boehm
Effort = E1 * Size ** E2
Duration = D1 * Effort ** D2
And many other magic formulae!
Question: can "E1" express 29 underlying factors? And how to calibrate such a model for an organization, and use it with sense? Formal vs. informal (expert) estimation [Jørgensen03]?
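To make the calibration question concrete: in a power-law model like the one above, the constants E1 and E2 are typically fitted to an organization's own completed projects. Below is a minimal sketch of one common approach – a least-squares fit in log-log space – assuming numpy and invented project data; it illustrates the idea, not Boehm's actual COCOMO calibration procedure.

```python
# Minimal sketch: fitting Effort = E1 * Size**E2 to past projects via
# a straight-line fit in log-log space. All data values are invented.
import numpy as np

size_kloc = np.array([12.0, 45.0, 8.0, 110.0, 30.0])           # hypothetical sizes
effort_ph = np.array([900.0, 4100.0, 520.0, 13000.0, 2400.0])  # person-hours

# log(Effort) = log(E1) + E2 * log(Size)  ->  slope = E2, intercept = log(E1)
E2, log_E1 = np.polyfit(np.log(size_kloc), np.log(effort_ph), 1)
E1 = np.exp(log_E1)

def estimate_effort(size_in_kloc: float) -> float:
    """Predicted effort in person-hours for a new project."""
    return E1 * size_in_kloc ** E2

print(f"Calibrated: E1 = {E1:.1f}, E2 = {E2:.2f}")
print(f"Estimate for 60 KLOC: {estimate_effort(60):.0f} person-hours")
```

Even then, two fitted constants can hardly absorb 29 underlying context factors – which is the slide's point.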

Ex. Model of fault rate vs. size
[Basili84]: the fault rate of modules shrank as module size and complexity grew in the NASA-SEL environment; other authors had the inverse observation – who was right?
Explanation: smaller modules are normally better, but involve more interfaces, and are often chosen when "(re-)gaining" control.
The above result has been confirmed by similar studies – but many more factors are involved …
[Figure: fault rate vs. size/complexity, showing "believed", "actual" and "hypothesised" curves]

Four basic parameters in a study (GQM method)
- Object: a process, a product, any form of model.
- Purpose: characterize, evaluate, predict, control, improve, …
- Focus (relevant object aspect): time-to-market, productivity, reliability, defect detection, accuracy of an estimation model, …
- Point of view (stakeholder): researcher, manager, customer, …
All of this involves many factors/variables.
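Such a study goal can be written down as a simple record with exactly these four slots; below is a minimal sketch (the field names follow the slide, while the concrete example goal is invented, not one from the talk):

```python
# Minimal sketch: a GQM-style study goal as a record holding the four
# parameters from the slide. The example values are invented.
from dataclasses import dataclass

@dataclass
class GQMGoal:
    object: str          # the process, product, or model under study
    purpose: str         # characterize, evaluate, predict, control, improve, ...
    focus: str           # the relevant object aspect (quality focus)
    point_of_view: str   # the stakeholder

goal = GQMGoal(
    object="code inspection process",
    purpose="evaluate",
    focus="defect-detection cost-efficiency",
    point_of_view="project manager",
)
print(goal)  # the goal then drives which questions and metrics to define
```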

ESE: common kinds of empirical studies
- Formal experiments, "in vitro", often among students: can control the artifacts, process and outer context.
- Quasi-experiments, "in vivo", in industry: costly and hard logistics. Use Simula's SESE web tool [Sjøberg02]?
- Case studies: try a new technology in a real project.
- Post-mortems: collect lessons learned, e.g. by data mining or interviews [Birk02].
- Surveys: often by questionnaires.
- Structured interviews: more personal than surveys.
- Observation: being a "fly on the wall".
- General theory: generalize from available data.
- Action research: researcher and developer roles overlap.

ESE: different data categories
- Quantitative ("hard") data: numbers according to a defined metric, both direct and indirect. Need suitable analysis methods, depending on the metric's scale – nominal, ordinal, interval, or ratio. Often objective.
- Qualitative ("soft") data: prose text, pictures, … Often from observation and interviews. Need much human interpretation. Often subjective.
- Specific data for a given study (e.g. reuse rate) vs. common data (cost, size, #faults, …) – "nice to have"?
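The scale distinction has direct consequences for analysis; here is a minimal illustration (all data values invented) of statistics that are meaningful for ordinal versus ratio data:

```python
# Minimal sketch: the metric's scale constrains which statistics make
# sense. All data values are invented.
import statistics

# Ordinal data, e.g. reviewer-rated defect severity (1=low ... 5=critical):
# medians and rank-based tests are meaningful; arithmetic means are dubious.
severity = [1, 2, 2, 3, 5, 4, 2]
print("median severity:", statistics.median(severity))

# Ratio data, e.g. effort in person-hours: means, sums, and ratios are
# all meaningful, since the scale has a true zero.
effort_hours = [12.5, 40.0, 7.25, 100.0]
print("mean effort:", statistics.mean(effort_hours))
print("total effort:", sum(effort_hours))
```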

ESE: validity problems
- Construct validity: the "right" (relevant, precise, minimal, …) metrics – use Goal-Question-Metric?
- Internal validity: the "right" data values.
- Conclusion validity: the right (appropriate) data analysis.
- External validity: the "right" (representative) context.

ESE: combining different studies/data
- Meta-studies: aggregations over single studies. Cf. medicine with the Cochrane reporting standard. Need shared experience databases?
- A composite study may combine several study kinds and data:
  1. Prestudy, doing a survey or post-mortem
  2. Initial formal experiment, on students
  3. Sum-up, using interviews
  4. Final case study, in industry
  5. Sum-up, using interviews or a post-mortem

Achieving validated knowledge through ESE
- Learn about ESE: [Rombach93] [Conradi03].
- Set goals, e.g. using QIP [Basili95]?
- Need operational methods to perform studies: general [Kitchenham02], on GQM [Basili94]?
- Cooperate with others on repeatable studies/experiments (ISERN, ESERNET, …) [Vokác03].
- Perform meta-analysis across single studies. Need reporting procedures, databases etc.
- Need more industrial studies, not only studies with students.
- Have patience, and allocate enough resources. Industrial studies will run into unexpected problems; SPI initiatives have a 30-50% "abortion" rate [Conradi02] [Dybå03].

Ex. Some NTNU studies (all published)
- CBSE/reuse:
  – Assessing reuse in 15 companies in REBOOT.
  – Modifiability of C++ programs and documentation.
  – Ex3, INCO: COTS usage in Norway, Italy, and Germany (many studies).
  – Assessment of COTS components.
  – Ex2, INCO: CBSE at Ericsson-Grimstad (many studies).
- Inspections:
  – Perspective-based reading, at Univ. Maryland and NTNU.
  – Ex1, NTNU diploma theses: SDL inspections at Ericsson.
  – UML inspections at Univ. Maryland, NTNU, and Ericsson.
- SPI/quality:
  – Role of formal quality systems in 5 companies.
  – Comparing process model languages in 3 companies.
  – Post-mortem analysis in two companies.
  – SPI experiences in SMEs in Scandinavia, and in Italy and Norway.
  – SPI lessons learned in Norway (SPIQ, PROFIT).
- And many more!

Ex1. SDL inspections at Ericsson-Oslo: a data-mining study in 3 MSc theses (Marjara et al.)
General comments:
- AXE telecom switch systems, with functions around the * and # buttons; teams of 50 people.
- SDL and PLEX as design and implementation languages.
- Data-mining study of the internal inspection database. No previous analysis of these data.
Study 1: Project A, 20,000 person-hours. Look for general properties and the relation to software complexity (by Marjara, a former Ericsson employee).
Study 2: Project A plus project releases B-F, 100,000 person-hours. Also look for longitudinal relations across phases and releases, i.e. "fault-prone" modules – it seems so, but not conclusively (by Skåtevik and Hantho).
By the time the results came, Ericsson had changed its process – now using UML and Java, but with no inspections.

Ex1. General results of SDL inspections at Ericsson-Oslo, by Marjara
Study 1 overall results:
- About 1 person-hour per defect found in inspections.
- About 3 person-hours per defect in unit test, 80 person-hours per defect in function test.
- So inspections seem very profitable.
Table 1. Yield, effort, and cost-efficiency of inspection and testing, Study 1.
Columns: Activity | Yield = number of defects [#] | Total effort on defect detection [h] | Cost-efficiency [defects/h] | Total effort on defect correction [h] | Estimated saved effort by early defect removal ("formulae") [h].
Rows: inspection preparation (design); inspection meeting (design); desk check (unit test and code review); function test; total so far; system test; field use (first 6 months).
[The table's cell values were lost in transcription.]
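The "inspections seem very profitable" claim is simple arithmetic over these per-defect costs: every defect caught at ~1 person-hour in inspection avoids ~3 person-hours in unit test or ~80 in function test. A minimal sketch of that calculation, using only the per-defect figures quoted above (the defect count is invented, since Table 1's cell values were lost):

```python
# Minimal sketch of the cost-efficiency argument. The per-defect
# detection costs are those quoted on the slide; the defect count is
# invented, since Table 1's values did not survive transcription.
COST_PER_DEFECT_PH = {      # person-hours to find one defect
    "inspection": 1.0,
    "unit test": 3.0,
    "function test": 80.0,
}

defects_found_in_inspection = 100   # hypothetical yield

for later_phase in ("unit test", "function test"):
    saved = defects_found_in_inspection * (
        COST_PER_DEFECT_PH[later_phase] - COST_PER_DEFECT_PH["inspection"]
    )
    print(f"Finding {defects_found_in_inspection} defects in inspection "
          f"instead of {later_phase}: ~{saved:.0f} person-hours saved")
```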

Ex1. SDL defects vs. size/complexity (#states) at Ericsson-Oslo, by Marjara
Study 1 results: an almost "flat" curve – why?
- The most competent people were put on the hardest tasks!
- Such contextual information is very hard to get or guess.
[Figure: defects found during inspections and in unit test, plotted against the number of SDL states]

Ex1. SDL inspection rates/defects at Ericsson-Oslo, by Marjara
[Figure: actual inspection rate well above the recommended rate]
Study 1: there was no internal analysis of the data, hence no adjustment of the inspection process:
- Inspections were done too fast, so many defects were missed.
- By spending 200(?) analysis hours and ca. … more inspection hours, ca. … test hours could have been saved!

Ex2. INCO studies and methods by PhD student Parastoo Mohagheghi, NTNU/Ericsson-Grimstad
Studying reusable middleware at Ericsson, 600 KLOC, shared between GPRS and UMTS applications:
– Characterization of the quality of reusable components (pre-case study)
– Estimation of use-case models for reuse – with Bente Anda, UiO (case study)
– OO inspection techniques for UML – with HiA, NTNU, and Univ. Maryland (real experiment)
– Attitudes to software reuse – with two other companies (survey)
– Evolution of product families (post-mortem analysis)
– Improved reuse processes (proposal for a case study)
– Reliability and stability of reusable components, based on 13,500 (!) change requests – with NTNU (case study/data mining), next three slides

Ex2. GPRS/UMTS system at Ericsson-Grimstad

Ex2. Research design (data mining)

Ex2. Hypothesis testing (as null hypotheses)
- H01: Reused components have the same fault density as non-reused components. Rejected – reused components are more reliable.
- H02a: There is no relation between #faults and component size for all components. Not rejected – #faults does not increase with size.
- H02b: There is no relation between #faults and component size for reused components. Not rejected – #faults does not increase with size for reused components.
- H02c: There is no relation between #faults and component size for non-reused components. Rejected – #faults increases with size for non-reused components.
- H03a/b/c: There is no relation between fault density and component size for all/reused/non-reused components. Not rejected.
- H04: Reused and non-reused components are equally modified. Rejected – reused components are more stable.
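As an illustration of how one such null hypothesis could be tested, the sketch below compares the fault densities of reused and non-reused components with a Mann-Whitney U rank test. The talk does not state which statistical test the study used; the data values, like the use of scipy, are assumptions for illustration only:

```python
# Minimal sketch (not the study's actual analysis): testing an
# H01-style hypothesis with a Mann-Whitney U rank test. The fault
# densities per component are invented.
from scipy import stats

reused_fd = [0.2, 0.5, 0.1, 0.4, 0.3, 0.2]      # faults/KLOC, invented
non_reused_fd = [0.9, 1.1, 0.6, 1.4, 0.8, 1.0]  # faults/KLOC, invented

u_stat, p_value = stats.mannwhitneyu(reused_fd, non_reused_fd,
                                     alternative="two-sided")
print(f"U = {u_stat}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Reject H01: fault densities differ (reused vs. non-reused)")
else:
    print("Cannot reject H01 at the 5% level")
```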

Ex3. COTS usage contradicts "common wisdom"
In INCO, structured interviews in 7 Norwegian and Italian SMEs:
- Thesis T1: Open-source software is often used as closed source.
- Thesis T2: Integration problems result primarily from lack of compliance with standards, not from architectural mismatches.
- Thesis T3: Custom code is mainly devoted to adding functionality.
- Thesis T4: Formal selection is seldom used; rather familiarity with the product or its generic architecture.
- Thesis T5: Architecture is more important than requirements when selecting components.
- Thesis T6: Tendency to increase the level of control over the vendor whenever possible.
See [Torchiano04]. To be extended with a larger Norwegian survey by NTNU and Simula, later repeated in Germany and Italy.

From 50 software "laws" [Endres03]:
- L1, Glass: Requirement deficiencies are the prime cause of project failures.
- L5, Curtis: Good designs require deep application domain knowledge.
- L12, Corbató: Productivity and reliability depend on the length of a program's text, independent of the language level used.
- L16, Conway: A system reflects the organizational structure that built it.
- L23, Weinberg: A developer is unsuited to test his or her own code.
- L27, Lehman-1: A system that is used will be changed.

More from the 50 software "laws":
- L30, Basili-Möller: Smaller changes have a higher error density than large ones.
- L36, Brooks: Adding manpower to a late project makes it later.
- L45, Moore: The price/performance of processors is halved every 18 months.
- L47, Cooper: Wireless bandwidth doubles every 2.5 years.
- L49, Metcalfe: The value of a network increases with the square of its users.

Some of the 25 hypotheses, also from [Endres03]:
- H2, Booch-2: Object-oriented designs reduce errors and encourage reuse.
- H5, Dahl-Goldberg: Object-oriented programming reduces errors and encourages reuse.
- H9, Mays: Error prevention is better than error removal.
- H16, Wilde: Object-oriented programs are difficult to maintain.
- H25, Basili-Rombach: Measurements require both goals and models.

Conclusion (1)
- Best practices depend on context, so we must know more about that relation!!
- Need feedback from, and cooperation with, industry to be helpful – our "laboratory"! Compensate industry for participation.
- Seek data relevant to the actual goal/hypothesis! But is unused data worse than no data?
- ESE: promising, but hard.
- High ESE/SPI activity in Norway (cf. SPIQ, PROFIT, SPIKE). Much international cooperation.

Conclusion (2)
- Higher R&D spending in Norway? Still 1.7% of GNP, in spite of parliamentary promises from April 2000 to reach the OECD level (2.25%) within 4 years.
- A large and growing ICT sector in Norway, but sparse funds for R&D. Too much at the bottom ("hw/tele") and at the top ("applications") – more is needed in the middle ("software engineering" and the like).
- Ex.: NFR spends 100 MNOK per year on basic software research – as much as the three best Norwegian football players earn per year!
- Ex.: Kreftregisteret for medicine, SSB for general data, the air traffic authority, the water research institute, etc. – what public "bureau" exists for (empirical) software engineering?
- Chinese proverb:
  – invest for one year: plant rice,
  – invest for ten years: plant a tree,
  – invest for 100 years: educate people.

Appendix 1: Some useful web addresses
- Fraunhofer Institute for Experimental Software Engineering (IESE), Kaiserslautern
- International Software Engineering Research Network (ISERN)
- Fraunhofer Center for Experimental Software Engineering, Univ. Maryland (FC-MD)
- EU network on Experimental Software Engineering (ESERNET, end-2003)
- Software engineering group (SU) at IDI, NTNU
- Industrial software engineering group (ISU) at UiO
- SINTEF Telecom and Informatics
- Simula Research Laboratory, at IT-Fornebu from 2001 (see under "research", then "Software Engineering")
- SPIKE project (official web site, plus the NTNU one)
[The URLs were lost in transcription.]

Appendix 2: Literature list (1)
[Basili84] Victor R. Basili and Barry T. Perricone: "Software Errors and Complexity: An Empirical Investigation", Communications of the ACM, 27(1):42-52, 1984 (NASA-SEL study).
[Basili94] Victor R. Basili, Gianluigi Caldiera, and Hans Dieter Rombach: "The Goal Question Metric Paradigm", in John J. Marciniak (Ed.): Encyclopedia of Software Engineering (2-volume set), John Wiley and Sons, 1994.
[Basili95] Victor R. Basili and Gianluigi Caldiera: "Improving Software Quality by Reusing Knowledge and Experience", Sloan Management Review, 37(1):55-64, Fall 1995 (on the Quality Improvement Paradigm, QIP).
[Basili01] Victor R. Basili and Barry Boehm: "COTS-Based Systems Top 10 List", IEEE Computer, 34(5):91-93, May 2001.
[Birk02] Andreas Birk, Torgeir Dingsøyr, and Tor Stålhane: "Postmortem: Never leave a project without it", IEEE Software, 19(3):43-45, May/June 2002.
[Brooks87] Frederick P. Brooks Jr.: "No Silver Bullet – Essence and Accidents of Software Engineering", IEEE Computer, 20(4):10-19, April 1987.
[Conradi02] Reidar Conradi and Alfonso Fuggetta: "Improving Software Process Improvement", IEEE Software, 19(4):92-99, July/Aug. 2002.
[Conradi03] Reidar Conradi and Alf Inge Wang (Eds.): Empirical Methods and Studies in Software Engineering – Experiences from ESERNET, Springer Verlag LNCS 2765, Aug. 2003, 278 pages.
[Dybå03] Tore Dybå: "Factors of SPI Success in Small and Large Organizations: An Empirical Study in the Scandinavian Context", in Paola Inverardi (Ed.): Proceedings of the Joint 9th European Software Engineering Conference (ESEC'03) and 11th SIGSOFT Symposium on the Foundations of Software Engineering (FSE-11), Helsinki, Finland, 1-5 September 2003, ACM Press.
[Endres03] Albert Endres and Hans-Dieter Rombach: A Handbook of Software and Systems Engineering: Empirical Observations, Laws, and Theories, Fraunhofer IESE / Pearson Addison-Wesley, 327 p., 2003.
[Jørgensen03] Magne Jørgensen, Dag Sjøberg, and Ulf Indahl: "Software Effort Estimation by Analogy and Regression Toward the Mean", Journal of Systems and Software, 68(3), Nov. 2003.

Literature list (2)
[Kitchenham02] Barbara A. Kitchenham, Shari Lawrence Pfleeger, L. M. Pickard, P. W. Jones, D. C. Hoaglin, Khaled El Emam, and J. Rosenberg: "Preliminary guidelines for empirical research in software engineering", IEEE Transactions on Software Engineering, 28(8), Aug. 2002.
[PITAC99] President's Information Technology Advisory Committee: "Information Technology Research: Investing in Our Future", 24 Feb. 1999.
[Rombach93] Hans-Dieter Rombach, Victor R. Basili, and Richard W. Selby (Eds.): Experimental Software Engineering Issues: Critical Assessment and Future Directions, Springer Verlag LNCS 706, 1993, 261 p. (from the International Workshop at Dagstuhl Castle, Germany, Sept. 1992).
[Sjøberg02] Dag Sjøberg, Bente Anda, Erik Arisholm, Tore Dybå, Magne Jørgensen, Amela Karahasanovic, Espen Koren, and Marek Vokác: "Conducting Realistic Experiments in Software Engineering", ISESE'02, Nara, Japan, 3-4 October 2002, IEEE CS Press (about the SESE web tool – an Experiment Support Environment for Evaluating Software Engineering Technologies).
[Tichy98] Walter F. Tichy: "Should Computer Scientists Experiment More?", IEEE Computer, 31(5):32-40, May 1998.
[Torchiano04] Marco Torchiano and Maurizio Morisio: "Overlooked Facts on COTS-based Development", forthcoming in IEEE Software, Spring 2004, 12 p.
[Vokác03] Marek Vokác, Walter Tichy, Dag Sjøberg, Erik Arisholm, and Magne Aldrin: "A Controlled Experiment Comparing the Maintainability of Programs Designed with and without Design Patterns – a Replication in a Real Programming Environment", accepted for the Journal of Empirical Software Engineering.
[Walston77] C. E. Walston and C. P. Felix: "A Method of Programming Measurement and Estimation", IBM Systems Journal, 16(1):54-73, 1977.
[Wohlin00] Claes Wohlin, Per Runeson, M. Höst, M. C. Ohlsson, Björn Regnell, and A. Wesslén: Experimentation in Software Engineering: An Introduction, Kluwer Academic Publishers, 2000, 224 pages.
[Zelkowitz98] Marvin V. Zelkowitz and Dolores R. Wallace: "Experimental Models for Validating Technology", IEEE Computer, 31(5):23-31, May 1998.

Appendix 3: The SU group at NTNU
IDI's software engineering (SU) group:
- Five faculty members: Reidar Conradi, Tor Stålhane, Letizia Jaccheri, Monica Divitini, Alf Inge Wang. One lecturer: MSc Per Holager.
- 15 active PhD students, with 6 new in both 2002 and 2003: a common core curriculum in empirical research methods.
- 35 MSc candidates per year.
- Research-based education: students participate in projects, and project results are used in courses.
- A dozen R&D projects, basic and industrial, in all our research fields – industry is our lab.
- Half of our papers are based on empirical research, and 25% are written with international co-authors.

Research fields of the SU group (1)
- Software quality: reliability and safety, software process improvement, process modelling.
- Software architecture: reuse and COTS, patterns, versioning.
- Co-operative work: learning, awareness, mobile technology, project work.
In all of this: empirical methods and studies in industry and among students, and experience bases.
Software engineering education: partly project-based.
Tight cooperation with Simula Research Laboratory/UiO and SINTEF, active companies, Telenor R&D, Abelia/IKT-Norge etc.

Research fields of the SU group (2)
[Diagram: software quality (reliability, safety; SPI, learning organisations), software architecture (patterns, COTS, evolution, SCM) and co-operative work (mobile technology, distributed software engineering), all tied to software engineering education.]

SU research projects, part 1
Supported by NFR:
1. CAGIS-2: distributed learning environments, CO2 lab, Ekaterina Prasolova-Førland (Divitini).
2. MOWAHS: mobile technologies, Carl-Fredrik Sørensen (Conradi); with the DB group.
3. INCO: incremental and component-based development, Parastoo Mohagheghi at Ericsson (Conradi); with Simula/UiO.
4. WebSys: web systems – reliability vs. time-to-market, Sven Ziemer and Jianyun Zhou (Stålhane).
5. BUCS: business-critical software, Jon A. Børretzen, Per T. Myhrer and Torgrim Lauritsen (Stålhane and Conradi).
6. SPIKE: industrial software process improvement, Finn Olav Bjørnson (Conradi); with Simula/UiO, SINTEF, Abelia, and 10 companies – successor of SPIQ and PROFIT. Also INTER-PROFIT.
7. FAMILIER: product families, Magne Syrstad (Conradi), mainly with IKT-Norge but with some IDI support.
8. SEVO: software evolution of component-based systems for software reuse (two PhDs and one postdoc), Reidar Conradi.

SU research projects, part 2
IDI/NTNU-supported:
9. Software process: Mingyang Gu (Jaccheri).
10. Software safety and security: Siv Hilde Houmb (Stålhane).
11. Component-based development: Jingyue Li (Conradi).
12. Creative methods in education (NTNU): novel educational practices, no PhDs, Jaccheri at IDI with other departments.
Supported from other sources:
13. ESE/Empirical software engineering: open-source software, Thomas Østerlie (Jaccheri), saved SU project funds.
14. ESERNET (EU): a network on experimental software engineering, no PhDs, Fraunhofer IESE + 25 partners.
15. Net-based cooperative learning (HINT): learning and awareness, CO2 lab, Glenn Munkvold (Divitini).