
1 University of Southern California Center for Systems and Software Engineering
Using Software Project Courses to Integrate Education and Research
Barry Boehm
November 18, 2009

2 Outline
Nature of real-client project course
–Primarily USC campus and neighborhood e-services
–MS-level; 2-semester; 6-8-person teams
–Well-instrumented for continuous improvement
Research/education integration via project experiments
–Validate new methods and tools via project usage
–Partial basis of 12 PhD dissertations
–Requirements negotiation and formalization (3), COTS integration (2), value-based methods (3), agile methods (1), quality tradeoffs (1), risk analysis (1), cost estimation (1)
Conclusions

3 2009-10 Software Engineering Projects

Project | Client | Organization
Online DB support for CSCI 511 | Jim Alstad | CSCI 511
SHIELDS for Family | Reginald Van Appelen | SHIELDS for Families
Theater Stage Manager Program | Julie Sanchez | Jules T Bear
Growing Great Online | Matt McMahon | GrowingGreat
SPC Website Automation Enhancement | Dean L. Jones | Southland Partnership Corporation
VALE Information Management System | Pamela Clay | Livingadvantage Inc.
LANI D-Base | Ashley Westman | Los Angeles Neighborhood Initiative (LANI)
Freehelplist.org | Stephen Wolfson | Freehelplist
Early Medieval East Asian Timeline | Ken Klien | USC East Asian Library
BHCC Website Development | Cesar Armendariz | Boyle Heights Chamber Of Commerce
Client Case Management Database | Marcy Pullard | Avenue Of Independence
Website Development: Avenue Of Independence | Marcy Pullard | Avenue Of Independence
Healthcare The Rightway | Roderick Foreman | Right Way Direction
AROHE Web Development | Janette Brown | AROHE

4 (Figure slide; no transcribed text.)

5 MBASE Model Integration: LCO Stage
(Diagram: relationships among the Domain Model, WinWin Taxonomy, WinWin Negotiation Model, IKIWISI Model, Prototypes, Properties Models, and Environment Models, and the Life Cycle Objectives (LCO) anchor point package. The LCO package comprises: Basic Concept of Operation; Frequent Risks; Stakeholders and primary win conditions; WinWin Agreements and Shared Vision; Viable Architecture Options; Updated ConOps and Business Case; Life Cycle Plan elements; Outstanding LCO risks; Requirements Description; LCO Rationale. Arrows carry labels such as "determines," "identifies," "situates," "exercise," "focus use of," "guides determination of," "validate inputs for," "initializes," "adopt," "identify," "update," "achieve," "iterate to feasibility, consistency," "determines exit criteria for," and "validates readiness of.")

6 S&C Subdomain (General)

Type of application: Multimedia Archive (project nos. 1, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 20, 31, 32, 35, 36, 37, 39)
Simple block diagram: MM asset info and update flows into a Catalog and MM Archive; queries, update notifications, and query results flow back to users.

Developer simplifiers:
–Use standard query languages
–Use standard or COTS search engine
–Uniform media formats

Developer complicators:
–Natural language processing
–Automated cataloging or indexing
–Digitizing large archives
–Digitizing complex or fragile artifacts
–Automated annotation/description or meanings for digital assets
–Integration of legacy systems
–Rapid access to large archives
–Access to heterogeneous media collections

7 The Results: Projects That Failed LCO Criteria
Post-1997 failures were due to non-S&C causes
–Team cohesion, client outages, poor performance

8 Outline
Nature of real-client project course
–Primarily USC campus and neighborhood e-services
–MS-level; 2-semester; 6-8-person teams
–Well-instrumented for continuous improvement
Research/education integration via project experiments
–Validate new methods and tools via project usage
–Partial basis of 12 PhD dissertations
–Requirements negotiation and formalization (3), COTS integration (2), value-based methods (3), agile methods (1), quality tradeoffs (1), risk analysis (1), cost estimation (1)
Conclusions and references

9 Empirical Software Engineering Research
Empirical software engineering research is generally slow
–Projects take 2-5 years to complete
–Improvements are confounded with other factors
–Data is generally sparse and hard to compare across projects
Team projects are the ESE equivalent of the fruit fly
–20 per year, real clients, using industrial-grade processes
–Teams include 2 off-campus working professionals
–1-2 of the 6 on-campus students have 1-2 years of work experience
–Extensive data, consistently collected
–Opportunities to run (partially) controlled experiments; projects and teams are not identical

10 Project Course Experience Factory
(Diagram: a Projects Organization and an Experience Factory connected through an Experience Base. Steps: 1. Characterize; 2. Set Goals; 3. Tailor Process (producing execution plans); 4. Execute Process (with project support); 5. Analyze products, lessons learned, and models; 6. Technology: Initialize, Apply, Refine, Formalize, Disseminate. Flows between the two sides include environment characteristics; tailorable technology and mentoring; project analysis and process modification; and data and lessons learned.)

11 WikiWinWin: Tool
–Shapers facilitate negotiation in WikiWinWin
–Initial ideas are surfaced at the meeting
–The shaper organizes them into prospective win conditions after the meeting
–Stakeholders engage in further discussion

12 WikiWinWin: Current Progress
Initial results (Fall 2007):
–Correlation between usage and outcome
–Not all feedback was positive
(Charts: LCO package quality shortfall vs. usage by team; LCO package quality shortfall vs. usage by shaper)

13 Growth of COTS-Based USC e-Services Projects
–Requires new processes, architectures, and methods
–It's not "all about programming" anymore
–Similar trends at Motorola and other USC-CSE Affiliates*
*Industry data: 2001 Standish Report

14 Axiom 1: Process Happens Where the Effort Happens
(Charts: a. CBA effort distribution of USC e-service projects; b. CBA effort distribution of industry COCOTS calibration data. Effort categories: COTS assessment, tailoring, glue code, and custom code/integration.)

15 CBA Spiral Framework (figure)

16 COTS Assessment Example
USC Collaborative Services (USCCS):
–Objectives: project management, file management, discussion board, project calendaring, chat room
–COTS candidates: eProject (577a and 577b), Dot Project (577a), eStudio (577a), eRoom (577a), Blackboard (577b), iPlanet (577b)

17 Example: USCCS Evaluation Results (figure)

18 Interoperability Evaluation Framework: Interfaces (figure)

19 iStudio Tool (figure)

20 Experiment 1 Results

Data Set | Group | Mean | Std. Dev. | P-Value
Dependency Accuracy* | Pre framework application | 79.3% | 17.9 | 0.017
Dependency Accuracy* | Post framework application | 100% | 0 |
Interface Accuracy** | Pre framework application | 76.9% | 14.4 | 0.0029
Interface Accuracy** | Post framework application | 100% | 0 |
Actual Assessment Effort | Projects using the framework | 1.53 hrs | 1.71 | 0.053
Actual Assessment Effort | Equivalent projects not using the framework | 5 hrs | 3.46 |
Actual Integration Effort | Projects using the framework | 9.5 hrs | 2.17 | 0.0003
Actual Integration Effort | Equivalent projects not using the framework | 18.2 hrs | 3.37 |

*Accuracy of dependency assessment: 1 – (number of unidentified dependencies / total number of dependencies)
**Accuracy of interface assessment: 1 – (number of interface interaction mismatches identified / total number of interface interactions)
Accuracy: a quantitative measure of the magnitude of error [IEEE 1990]
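The two accuracy metrics in the footnotes follow directly from assessment counts. A minimal sketch of those formulas (function and variable names are illustrative, not from the study):

```python
def dependency_accuracy(unidentified: int, total: int) -> float:
    # 1 - (number of unidentified dependencies / total number of dependencies)
    return 1.0 - unidentified / total

def interface_accuracy(mismatches: int, total_interactions: int) -> float:
    # 1 - (interface interaction mismatches / total interface interactions)
    return 1.0 - mismatches / total_interactions

# Illustrative counts only (not the experiment's data):
print(dependency_accuracy(2, 10))   # 0.8
print(interface_accuracy(0, 5))     # 1.0
```

A post-framework score of 100% on both metrics corresponds to zero unidentified dependencies and zero interaction mismatches.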

21 Model Clashes: Occurrence Probability and Severity (35 projects, RAD sample)

Assumption | S | M | OP | SV
Developer needs extensive, on-demand user/customer interaction | Dev | SS | 0.86 | H
User/customer are available at limited times | User/Cust | SS | |
User/customer are free to add/modify the system requirements during system implementation | User/Cust | PD | 0.46 | H
Changes/additions to system requirements require extra budget and schedule | Dev | PP | |

S: Stakeholder; M: Model; OP: Occurrence Probability; SV: Severity
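Clash data like this is typically turned into a ranking via risk exposure, the product of occurrence probability and a numeric severity rating. A minimal sketch; the H/M/L-to-number mapping is an assumption for illustration, not a scale from the study:

```python
# Hypothetical severity scale (assumption, not from the slide's data)
SEVERITY_SCALE = {"H": 3, "M": 2, "L": 1}

def risk_exposure(occurrence_probability: float, severity: str) -> float:
    # Risk exposure = probability of loss x size of loss
    return occurrence_probability * SEVERITY_SCALE[severity]

# E.g., the Dev/SS interaction clash above: OP 0.86, severity H
print(risk_exposure(0.86, "H"))  # 2.58
```

Clashes can then be sorted by exposure to decide which stakeholder-model mismatches to resolve first.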

22 Clash Types and Their Contribution to Project Risk
(Chart: percentage of clashes vs. percentage of risk for each clash type: Success-Property, Success-Product, Success-Success, Product-Property, Process-Property, Property-Property, Product-Product, Process-Process, Product-Process, Success-Process.)
Key point: the majority of research (on product-product clashes) addresses a minority of the risk.

23 Contribution of Inter- and Intra-Model Clashes to Risk
(Charts: distribution of inter- vs. intra-model clashes (47% vs. 53% of all model clashes) and their contributions to total project risk (55% vs. 43%).)
Inter-model clashes caused the majority of risk.

24 (Figure slide; no transcribed text.)

25 Value-Based Review Process (II)
(Diagram: in a negotiation meeting, developers, customers, users, other stakeholders, and a domain expert establish the priorities of system capabilities and the criticalities of issues. These feed, together with an artifact-oriented checklist and a general value-based checklist, into the review of artifacts. A priority (High/Medium/Low) by criticality (High/Medium/Low) matrix gives the usual review order, numbered 1 through 6; the lowest-priority, lowest-criticality cells are optional.*)
*It may be more cost-effective to review highly coupled, mixed-priority artifacts together.
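The ordering in the matrix amounts to a simple sort: review high-priority, high-criticality artifacts first and leave the low/low cells for last (or skip them as optional). A minimal sketch of that idea; the level encoding and field names are illustrative assumptions:

```python
# Hypothetical 3-level encoding of priority and criticality (assumption)
LEVEL = {"high": 0, "medium": 1, "low": 2}

def review_order(artifacts):
    # Sort by (priority, criticality): high/high is reviewed first,
    # low/low last (and possibly treated as optional).
    return sorted(artifacts,
                  key=lambda a: (LEVEL[a["priority"]], LEVEL[a["criticality"]]))

items = [
    {"name": "GUI details", "priority": "low", "criticality": "low"},
    {"name": "security reqs", "priority": "high", "criticality": "high"},
    {"name": "report layout", "priority": "medium", "criticality": "low"},
]
print([a["name"] for a in review_order(items)])
# ['security reqs', 'report layout', 'GUI details']
```

As the slide's footnote notes, a strict sort is not always best: highly coupled artifacts of mixed priority may be cheaper to review together.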

26 Value-Based Checklist (I)

Completeness
–High-criticality issues: critical missing elements (backup/recovery, external interfaces, success-critical stakeholders); critical exception handling; missing priorities; critical missing processes and tools; planning and preparation for major downstream tasks (development, integration, test, transition); critical missing project assumptions (client responsiveness, COTS adequacy, needed resources)
–Medium-criticality issues: medium-criticality missing elements, processes, and tools (maintenance and diagnostic support; user help); medium-criticality exceptions and off-nominal conditions; smaller tasks (reviews, client demos); missing desired growth capabilities; workload characterization
–Low-criticality issues: easily deferrable, low-impact missing elements (straightforward error messages, help messages, GUI details doable via GUI builder, project task sequence details)

Consistency/Feasibility
–High: critical elements in OCD, SSRD, SSAD, LCP not traceable to each other; critical inter-artifact inconsistencies (priorities, assumptions, input/output, preconditions/postconditions); missing evidence of critical consistency/feasibility assurance in the FRD
–Medium: medium-criticality shortfalls in traceability, inter-artifact inconsistencies, and evidence of consistency/feasibility in the FRD
–Low: easily deferrable, low-impact inconsistencies or inexplicit traceability (GUI details, report details, error messages, help messages, grammatical errors)

Ambiguity
–High: vaguely defined critical dependability capabilities (fault tolerance, graceful degradation, interoperability, safety, security, survivability); critical misleading ambiguities (stakeholder intent, acceptance criteria, critical user decision support, terminology)
–Medium: vaguely defined medium-criticality capabilities and test criteria; medium-criticality misleading ambiguities
–Low: non-misleading, easily deferrable, low-impact ambiguities (GUI details, report details, error messages, help messages, grammatical errors)

Conformance
–High: lack of conformance with critical operational standards and external interfaces
–Medium: lack of conformance with medium-criticality operational standards and external interfaces; misleading lack of conformance with document formatting standards and method and tool conventions
–Low: non-misleading lack of conformance with document formatting standards, method and tool conventions, and optional or low-impact operational standards

Risk
–High: missing FRD evidence of critical capability feasibility (high-priority features, levels of service, budgets and schedules); critical risks in the top-10 risk checklist (personnel, budgets and schedules, requirements, COTS, architecture, technology)
–Medium: missing FRD evidence of mitigation strategies for low-probability/high-impact or high-probability/low-impact risks (unlikely disasters, off-line service delays, missing but easily available information)
–Low: missing FRD evidence of mitigation strategies for low-probability, low-impact risks

27 Value-Based Reading (VBR) Experiment (Keun Lee, ISESE 2005)

Group A: 15 IV&V personnel using VBR procedures and checklists
Group B: 13 IV&V personnel using previous value-neutral checklists; significantly higher numbers of trivial typo and grammar faults

By Number | P-value | % Group A higher
Average number of concerns | 0.202 | 34%
Average number of problems | 0.056 | 51%
Average number of concerns per hour | 0.026 | 55%
Average number of problems per hour | 0.023 | 61%

By Impact | P-value | % Group A higher
Average impact of concerns | 0.049 | 65%
Average impact of problems | 0.012 | 89%
Average cost effectiveness of concerns | 0.004 | 105%
Average cost effectiveness of problems | 0.007 | 108%
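The "per hour" and "cost effectiveness" rows normalize review yield by reviewer effort. A minimal sketch of that normalization; the exact formulas are a plausible reading of the table headings, not the paper's published definitions:

```python
def concerns_per_hour(num_concerns: int, review_hours: float) -> float:
    # Raw yield normalized by effort
    return num_concerns / review_hours

def cost_effectiveness(impacts, review_hours: float) -> float:
    # Total impact of the concerns found, per hour of review effort
    return sum(impacts) / review_hours

# Illustrative numbers only (not the experiment's data):
print(concerns_per_hour(12, 4.0))          # 3.0
print(cost_effectiveness([5, 3, 1], 4.0))  # 2.25
```

This is why Group A can show only modestly more concerns by count yet much higher cost effectiveness: the concerns it finds carry more impact per hour spent.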

28 Pair Development vs. Fagan Inspection
TDC = Total Development Costs (all figures in man-hours)

Experiment | Group | TDC | Production | Appraisal | Rework
E1 (Thailand 05) | PD | 526.73 | 314.02 | 102.07 | 8.03
E1 (Thailand 05) | FI | 695.11 | 309.23 | 234.97 | 43.72
E2 (Thailand 05) | PD | 336.66 | 186.67 | 73.33 | 13.67
E2 (Thailand 05) | FI | 482.5 | 208.5 | 165 | 45
E3 (Thailand 05) | PD | 1392.9 | 654.2 | 325.7 | 233
E3 (Thailand 05) | FI | 1342 | 429 | 436 | 317
E4 (US 05) | PD | 187.54 | 68.16 | 88.83 | 20.05
E4 (US 05) | FI | 237.93 | 62.82 | 122.10 | 42.52
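One way to read the table is to compare the share of total development cost spent on appraisal plus rework under pair development versus Fagan inspection. A quick sketch using the E2 row; the "quality cost share" framing is an illustrative reading, and TDC may include cost categories beyond the three columns shown:

```python
def quality_cost_share(tdc: float, appraisal: float, rework: float) -> float:
    # Fraction of total development cost spent on appraisal + rework
    return (appraisal + rework) / tdc

# E2 (Thailand 05) rows from the table above
pd_share = quality_cost_share(336.66, 73.33, 13.67)
fi_share = quality_cost_share(482.5, 165, 45)
print(round(pd_share, 3), round(fi_share, 3))  # 0.258 0.435
```

In this row, pair development spends a noticeably smaller fraction of its total effort on appraisal and rework than the Fagan inspection group.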

29 Lean MBASE Effort Comparison
(Charts: average number of hours spent on documentation: less effort with Lean MBASE, except SSAD in Fall 2005; average hours per page of documentation: fewer hours per page, except SSRD in Fall 2006.)

30 ICM Electronic Process Guide (figure)

31 Integrating Software Research and Education
Empirical software engineering research is generally slow
–Projects take 2-5 years to complete
–Improvements are confounded with other factors
–Data is generally sparse and hard to compare across projects
MS-student projects are the ESE equivalent of the fruit fly
–20 per year, real clients, using industrial-grade processes
–Extensive data, consistently collected
–Opportunities to run (partially) controlled experiments; projects and teams are not identical
–Results frequently correlate with industry experience
Results strengthen future educational experiences

