NTNU OORT Experiment, March 2000: Some Qualitative Observations. Presented at the ISERN Meeting, Hawaii, 8-10 Oct. 2000.


Slide 1: NTNU OORT Experiment, March 2000: Some qualitative observations
Reidar Conradi, Software Engineering Group, Dept. of Computer and Information Science (IDI), NTNU

Slide 2: Background
- Repeated an OORT experiment taken from the CS735 course at UMD (Fall semester). All artifacts and instructions in English.
- Part of a 4th-year QA/SPI course, taught by local course staff; material adapted by R. Conradi in the USA.
- Students get pass/no-pass on the assignment.
- Main change: the OORT instructions were operationalized as Qij.x questions.

Slide 3: Overall impressions
- Big variation in effort, dedication and results: e.g. some teams did not report effort data, and some even did the wrong OORTs.
- Big variation in UML expertise.
- Students felt frustrated by the extent of the assignment; the indicated effort estimates were too low, so they felt cheated.
- Lengthy and tedious pre-annotation of artifacts before real defect detection could start. Many defects were discovered already during annotation, including defects that remained unreported.
- Are the OORTs too "heavy" for the given (small) artifacts?
- Some confusion about the assignments: what to do and how, on which artifacts, ...?

Slide 4: OORT results
Found many defects not previously reported:
- Loan Arranger (LA): 30 (13+17) seeded defects & 23 more + 26 comments.
- Parking Garage (PG): 32 (21+11) seeded defects & 14 more + 30 comments.
Defects actually reported (4 groups for LA, 5 for PG; average and range):
- LA: 11 (7..14) seeded & 13 (3..27) more + 9 (6..16) comments.
- PG: 7 (4..10) seeded & 4 (0..9) more + 10 (0..21) comments.
Effort spent:
- LA: 5-6 hours.
- PG: hours.
Lacking access to background/questionnaire data (delayed). In general: more data analysis to come.
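The average detection rates implied by these figures can be checked with a back-of-the-envelope computation. The sketch below (not part of the original slides) uses only the totals and per-group averages stated above:

```python
# Illustrative check of the figures on this slide: average seeded-defect
# detection rate per artifact (average reported / total seeded).
seeded_total = {"LA": 30, "PG": 32}   # total seeded defects per artifact
avg_found = {"LA": 11, "PG": 7}       # average seeded defects reported per group

for artifact, total in seeded_total.items():
    rate = avg_found[artifact] / total
    print(f"{artifact}: {rate:.0%} of seeded defects found on average")
# LA: 37% of seeded defects found on average
# PG: 22% of seeded defects found on average
```

So on average each group found roughly a third of the seeded LA defects and about a fifth of the seeded PG defects, consistent with the impression of large variation between groups.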

Slide 5: OORT comments
- Some unclear instructions: Executor/Observer roles, Norwegian file names, file access, some typos. Should the Requirements Description (RD) be read first?
- Some unclear concepts: service, constraint, condition, ...
- UML not familiar to some groups.
- Technical comments on artifacts and OORTs:
  - Add comments/rationale to diagrams: UC and CDia are too brief.
  - CDe is hard to navigate in; add separators.
  - The SqD had method parameters, but the CDia did not; how to check consistency?
  - Several artifacts (also the RD) are needed to understand some OORT questions.
  - Many trivial checks could have been done by an automatic UML tool.
  - Many trivial typos and naming defects in the artifacts; the Parking Garage artifacts need more work: LA vs. Loan Arranger vs. LoanArranger, gate vs. Gate, CardReaders vs. Card_Readers. Fanny May = Loan Arranger? Lot = Parking Garage?
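The slide notes that many of the trivial checks, such as flagging inconsistent naming (gate vs. Gate, CardReaders vs. Card_Readers), could have been automated. As a minimal sketch of such a check, assuming the identifiers have already been extracted from the UML artifacts (the extraction step is not shown), a few lines of Python suffice:

```python
import re
from collections import defaultdict

def normalize(name: str) -> str:
    """Lower-case and drop underscores/spaces so spelling variants collide."""
    return re.sub(r"[_\s]", "", name).lower()

def naming_variants(identifiers):
    """Group identifiers by normalized form; return groups with >1 spelling."""
    groups = defaultdict(set)
    for ident in identifiers:
        groups[normalize(ident)].add(ident)
    return {key: variants for key, variants in groups.items() if len(variants) > 1}

# Hypothetical identifier list, using the variant names quoted on the slide.
ids = ["CardReaders", "Card_Readers", "gate", "Gate", "LoanArranger"]
print(naming_variants(ids))
# {'cardreaders': {'CardReaders', 'Card_Readers'}, 'gate': {'gate', 'Gate'}}
```

Such a tool would not replace the OORTs, but it would remove the trivial naming defects before inspection, letting reviewers spend their limited effort on the semantic questions the reading techniques target.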