R2 LCO Review Outbrief, 2011-09-01


R2 LCO Review Outbrief

Summary
Thanks! Very impressed with:
– Amount of work
– Comprehensive architecture and artifacts
– Knowledge of staff
– Quality of presentations
– Comprehensive demo of R1 (well worth it!)
– Frankness and completeness of discussions
– Logistics were excellent as usual (thanks, Paul!)
– Ability to stay on schedule
– Cell phone discipline

Report-Out Rules of Engagement
– We will add / delete / modify comments in our final report
– We would like to present the entire outbrief and then take questions

Good Stuff
– CI architecture appears to be very well planned, documented, and logical.
– The work that CI has done to test the platform agent on the target hardware was outstanding and dramatically lowered risk.
– Prototypes were well planned, well executed, and of great value.
– Good that CI reached out to MBARI to benefit from their experience and lessons learned.

Are Release 2 LCO Review Entry Criteria Met?
1. Are Release 2 LCO artifacts provided and adequate?

Management
– System Life Cycle Plan: provided; adequate
– Risk Register: provided; not adequate (out of date)
– Elaboration Execution Plan: provided; not adequate (lacks tasks, schedule, resources)

Requirements
– Use cases (mature): provided; adequacy unknown
– User workflows (candidate): not provided (none for UX)
– System and subsystem requirements (candidate): provided; not yet mapped to use cases

Design
– Architecture Specification (candidate): provided
– Technology List (baselined): provided
– Prototype Reports: provided

Implementation: none
Deployment: none

Are Release 2 LCO Review Exit Criteria Met?

Are the use cases understood by the stakeholders, and do they provide a complete description of the release scope?
– To our knowledge, the Marine IOs, EPE, and the project scientists were not asked to review the applicable use cases.

Are the core user workflows understood and agreed to by the stakeholders?
– We do not know what the core workflows are.
– We do not know who the stakeholders are.

Are the candidate requirements understood by the stakeholders, and do they cover the critical use cases?
– Requirements have not yet been mapped to use cases for Rel 2.
– We do not know which use cases are 'critical'.

Are the critical risks identified, and have they been mitigated through exploratory prototypes?
– We saw evidence that prototypes were put in place to mitigate technical risks – good job.
– The risk register was not updated to reflect this mitigation.
– We did not see evidence of mitigation strategies for programmatic risks.

Are the candidate architectures viable as demonstrated through analysis and prototyping?
– Yes.

Is the Elaboration phase execution plan credible?
– The Elaboration Plan appears difficult to use for managing the Elaboration phase.
– The Elaboration Plan is incomplete, lacking schedule and tasking details.

Actions from R1 Reviews – Findings, Recommendations, and Suggestions
Were items satisfactorily addressed from the Release 1 reviews?
– Responses to findings and recommendations were discussed in the homework presentation.
– Due to time constraints, supporting evidence was not presented for all item resolutions.

Are Release 2 LCO Review Exit Criteria Satisfactorily Met?
Is this a viable User Interface strategy, team, and approach? No
– Concerned that one single interface cannot apply to multiple audiences
– Not clear that the correct people (e.g., at the Marine IOs) have been interviewed
Have all of the applicable user interfaces been identified? No
Have these user interfaces been appropriately characterized for this stage? No
Have R1 user interfaces been adequately proven to support R2 start? No data to make the assessment

Findings – Use Cases There was no identified engagement of appropriate program groups (e.g., marine IOs, data working groups) in the Use Case validation process prior to the LCO. CI expected that consensus on the use cases would take place at the LCO. This consensus process should be a very detailed and concerted working group effort spanning weeks/months as opposed to a day or two. Holding this LCO review does not remove the need to engage the stakeholders to achieve consensus on Use Cases. For example, there is no use case for Navy embargoed data streams.

Findings – User Interface
UX does not appear to be sufficiently planned or mature to meet the goal of pixel-perfect GUIs by LCA for Release 2.

Findings – Schedule
The IMS is not well understood by the team, which affects their ability to plan effectively. (For example, some team members did not truly understand the scope of the release.)

Findings – Management
– The spiral development methodology is being used to defer more difficult tasks to later releases.
– Inadequate staffing continues to be a critical risk to the CI schedule.
– OL needs to provide leadership and support for system integration between the IOs.

Findings – Risks
There appear to be significant R2 risks that need active mitigation efforts.
The risk register is not current:
– It has not been updated as a result of recent prototypes
– It does not include the risk posed by a 3-4x increase in the number of use cases in R2 vs. R1
Risks do not appear to be proactively managed (too many realized risks).

Findings – Risks, Cont.
The schedule and budget impacts of pushing 40% of Release 1 into Release 2 have not been fully developed. This deferment should be entered into the risk register, as it may drastically impact the ability to deploy Release 2 in time to support the Marine IO deployments.

Findings – Technical
– Apparent lack of a formal trade study process.
– There is still no decision on how assets will be managed for OOI (SAF vs. CI); OL should take the lead in resolving this issue.

Findings – Review Process
– Overall, the presentations appeared to lack quantitative backup data (e.g., staff loading and schedule detail, risk metrics).
– There is a misunderstanding of the board's role in representing the stakeholders at this and previous reviews; the board cannot function as a working group.
– It appears that CI has treated each of its milestone reviews so far (R1 LCO, LCA, IOC; R2 LCO) as a progress-reporting event rather than as a gate review.

Recommendations – User Community
– Continue to include Rel 1 early adopters as part of the Rel 2 user community (and make sure we don't break what is already there).
– There needs to be a formal mechanism for soliciting feedback from users.
– Continue to reach out to outside organizations like MBARI, and expand the list of organizations consulted.

Recommendations – Use Cases There appears to be a need for end-to-end threads that exercise the entire system as it will be used operationally.

Recommendations – Use Cases and Requirements
– Mapping of use cases to requirements should be accomplished during Inception rather than during Elaboration.
– A requirements verification matrix should be included in the artifacts at all reviews.
– The use case format should include a field for the associated requirements.
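A requirements verification matrix is, at its core, simple traceability data. The sketch below is a hypothetical Python illustration (the IDs such as UC-01 and REQ-100 are invented placeholders, not actual R2 artifacts) of how unmapped requirements and uncovered use cases could be flagged automatically once the mapping exists:

```python
# Hypothetical use-case-to-requirements traceability data.
# All IDs are invented for illustration only.
use_cases = ["UC-01", "UC-02", "UC-03"]
requirements = {
    "REQ-100": ["UC-01"],           # requirement -> use cases it traces to
    "REQ-101": ["UC-01", "UC-02"],
    "REQ-102": [],                  # not yet mapped -> should be flagged
}

def unmapped_requirements():
    """Requirements that trace to no use case."""
    return sorted(r for r, ucs in requirements.items() if not ucs)

def uncovered_use_cases():
    """Use cases that no requirement traces to."""
    covered = {uc for ucs in requirements.values() for uc in ucs}
    return sorted(uc for uc in use_cases if uc not in covered)
```

Running both checks at each review would make gaps like "requirements not yet mapped to use cases for Rel 2" visible as a short list rather than a surprise.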

Recommendations – Schedule
– CI must know, and present to their entire team, a critical path schedule.
– Overall schedule risk should be re-assessed.
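A critical path can be derived mechanically once tasks, durations, and dependencies are recorded. The following Python sketch uses invented task names and durations (not the actual R2 plan) to show the longest-path computation over a small task network:

```python
# Hypothetical task network: task -> (duration in days, prerequisites).
# Names and durations are invented for illustration only.
from functools import lru_cache

tasks = {
    "use_case_review": (10, []),
    "req_mapping":     (5,  ["use_case_review"]),
    "ux_wireframes":   (15, ["use_case_review"]),
    "arch_update":     (8,  ["req_mapping"]),
    "lca_prep":        (4,  ["arch_update", "ux_wireframes"]),
}

@lru_cache(maxsize=None)
def earliest_finish(task):
    """Longest path (in days) from project start through `task`."""
    duration, prereqs = tasks[task]
    return duration + max((earliest_finish(p) for p in prereqs), default=0)

def critical_path():
    """Walk back from the latest-finishing task, always choosing the
    predecessor that determines its start date."""
    end = max(tasks, key=earliest_finish)
    path, task = [end], end
    while tasks[task][1]:
        task = max(tasks[task][1], key=earliest_finish)
        path.append(task)
    return list(reversed(path)), earliest_finish(end)
```

Here `critical_path()` returns the chain of tasks whose slip would slip the whole release, plus the minimum calendar length; that chain is what the whole team needs to see.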

Recommendations – Management
– For issues that cross IO boundaries, OL needs to be proactive and own and manage the issues.
– CI needs to quickly hire more systems engineers and developers.
– OL and CI need to jointly prioritize requirements and deliverables to proactively prepare for budget fluctuations.
– CI should assign a person to each Marine IO as a local representative to support integration.
– Project management support must be given to UX.

Recommendations – Risks
– Risk 2329 should be promoted to the system level.
– Reinstitute the formal risk management process.
– Add a new risk for Release 1 maintenance consuming Release 2 resources.
– A new risk should be added for the ATBD schedule and promoted to the system level.
– Mitigation approaches need to be identified and implemented for risks to R2 LCA.
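Keeping the register current is largely bookkeeping. The Python sketch below is a hypothetical illustration of a register entry plus a staleness check; Risk 2329 appears in the recommendations above, but its description, dates, and all field names here are invented for the example:

```python
# Hypothetical risk register entry and staleness check.
# Descriptions, dates, and thresholds are invented for illustration.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Risk:
    risk_id: str
    description: str
    level: str                          # e.g. "subsystem" or "system"
    mitigations: list = field(default_factory=list)
    last_updated: date = date(2011, 1, 1)

def stale_risks(register, as_of, max_age_days=90):
    """Risks not reviewed within `max_age_days` of `as_of`."""
    return [r.risk_id for r in register
            if (as_of - r.last_updated).days > max_age_days]

register = [
    Risk("2329", "Cross-IO asset management undecided", "subsystem",
         last_updated=date(2011, 3, 15)),
    Risk("R2-UX", "UX staffing insufficient for LCA", "system",
         mitigations=["hire UX lead"], last_updated=date(2011, 8, 20)),
]
```

A periodic `stale_risks(register, date.today())` report is one lightweight way to enforce the "risk register is not current" finding going forward.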

Recommendations – Review Process
The review charge (LCO, LCA, etc.) should be well understood by all parties (board, development team, management team); there was confusion as to what should be accomplished at this review.
– Provide the complete written charge four weeks prior to the review.
The PRR for Release n should precede the LCO for Release n+1.

Recommendations – Review Process, Cont.
– Allocate more time for questions from the board.
– Two weeks before the review, provide evidence to the review chair that all entry criteria are met; if not, the review should not proceed.

Stoplight Chart (Technical Risk Assessment / Management Assessment)
– UX: Technical = Red (very few screens wireframed); Management = Red (no plan)
– COI: Technical = Green/Yellow (complexity); Management = Green
– CEI: Technical = Green/Yellow (scalability risk); Management = Green
– DM: Technical = Green; Management = Yellow (status tracking, staffing)
– S&A: Technical = Green; Management = Yellow (understaffed)
– Marine Integration, Sensor Sets, and Dataset Agents: Green
– Architecture: Technical = Green/Yellow (new technologies, will it scale?); Management = Green
– CyberPoP: Green

Summary Conclusion
– Exit criteria for the LCO were not met.
– Conduct a delta LCO; specifics will be provided in the final board report. It includes:
  – Working review and finalization of use cases
  – Mapping of R2 requirements to use cases
  – UX plan and screens
  – Staffing and a resource-loaded schedule with critical path
  – Updated risk register with mitigation plans
– OOI board members are available to assist.