Architecture Review Board ©USC-CSSE
Outline
USC CS577 ARB: concept, participants, procedures, results
The Incremental Commitment Spiral Model (ICSM)
4 Key Principles:
- Stakeholder value-based system definition and evolution
- Incremental commitment and accountability
- Concurrent system and software definition and development
- Evidence and risk-based decision making
First of all, the ICSM (Incremental Commitment Spiral Model) is a process model. One of its most important ingredients is the assessment of opportunity and risk, which determines the course of action during development.
ICSM for 24-week e-services projects
USC CS577 ARB Participants
- Project team: everybody presents something
- Reviewers: clients, instructors, TAs, and graders
Each presentation takes 80 minutes.
ARB/milestones for one-semester team
FCR/DCR ARB: October 15th, 17th, and 19th
- Based on DC package
- Focus on DCR success criteria
CCD: November 14th
- Core Capability Drive-through
- Client(s) will have hands-on experience with your core capabilities
TRR: November 26th, 28th, and 30th
- Based on AsBuilt package
ARB/milestones for two-semester team
FCR ARB: October 15th, 17th, and 19th
- Based on preliminary FC package
- Focus on FCR success criteria
Hands-on client feedback session: November 14th
DCR ARB: November 26th, 28th, and 30th
- Based on draft DC package
- Focus on DCR success criteria
ARB Review Success Criteria
FCR: For at least one architecture, a system built to the architecture will:
- Support the Operational Concept
- Satisfy the Requirements
- Be faithful to the Prototype(s)
- Be buildable within the budgets and schedules in the Plan
- Show a viable business case
- Most major risks identified and resolved or covered by the risk management plan
- Key stakeholders committed to support the Foundations (formerly Architecting or Elaboration) phase (to DCR)
DCR: For the selected architecture, a system built to the architecture will:
- Be buildable within the budgets and schedules in the Plan
- All major risks resolved or covered by the risk management plan
- Key stakeholders committed to support the full life cycle
Commitment Review Success Criteria
CCD:
- Determine whether the client needs anything further to ensure successful Transition and Operation: changes in priorities for remaining features? changes being made to operational procedures? more materials needed for training? changes in system data or environment we should prepare for? anyone else who should experience the CCD?
- Show value: product works as expected (or not, with noted exceptions); product will help users do their job
- Show quality development (e.g., relevant parts of your IOC documentation, process)
TRR / OCR:
- Show sustainability (e.g., support requirements/plans; transition plan & status)
- Show confidence that the product is/will be ready to be used (e.g., transition plan & status; see also value)
Team Preparation for ARB Reviews
Before ARB week:
- Within-team dry run of presentations and demo
- Further dry runs as necessary
During ARB week:
- ARB presentation and discussion
- Follow-up team discussions, client discussions
After ARB week (Monday):
- FC packages due (two-semester teams)
- DC packages due (one-semester teams)
Grading Criteria (70 points)
- Quality of presentation (10 points)
- Quality of project (40 points)
- Progress (10 points)
- Consistency and project synchronization (5 points)
- Time management (5 points)
FCR ARB Session Outline: Architected Agile Team
(x, y) = (presentation time, total time) in minutes
- (5, 5) Remote team member(s): team's strong and weak points (operational view and technical view); concerns and possible solutions
- (10, 10) OCD: system purpose; shared vision; proposed new system; benefit-chain diagram; system boundary; desired capabilities and goals
- (10, 10) Prototype: most significant capabilities [buying information], especially those with high risk if gotten wrong; must show additional progress beyond the first team prototype presentation
- (5, 10) Requirements: most significant requirements and their priority levels
- (10, 10) Architecture: top-level physical and logical architecture; use case diagram; status of NDI/reuse choices
- (5, 10) Life Cycle Plan: life cycle strategy; focus on Foundations phase; key stakeholder responsibilities; project plan; resource estimation
- (10, 10) Feasibility Evidence: business case (beginnings, including benefits analysis); NDI analysis results (which COTS you are considering/considered); major risks
- (5, 5) QFP: traceability matrix and summary; quality management strategy; defect identification review type summary (what & how) by document section or UML model; current defect injection & removal matrix; technical debt
- (20) Things done right; issues to address (instructional staff)
Do not forget your slide numbers. Each chart MUST have information specific to your project.
DCR ARB Session Outline: NDI/NCS Team (1 semester)
(x, y) = (presentation time, total time) in minutes
- (5, 5) Remote team member(s): team's strong and weak points (operational view and technical view); concerns and possible solutions; S/P engineer observations
- (10, 10) OCD: system purpose; shared vision; proposed new system; benefit-chain diagram; system boundary; core capabilities, constraints, and goals
- (5, 5) WinWin Agreements: agreed win conditions in each category
- (10, 10) Prototype/product demo: most significant capabilities; NDI/NCS integration
- (5, 5) Architecture: top-level physical and logical architecture (if applicable)
- (10, 10) Life Cycle Plan: life cycle strategy; focus on Development phase and transition increment; key stakeholder responsibilities; project plan; resource estimation
- (10, 15) Feasibility Evidence: NDI/NCS alternatives; NDI/NCS evaluation and analysis results; business case (beginnings, including benefits analysis); major risks; capability and LOS feasibility evidence; 3 personas
- (5, 5) QFP: traceability matrix and summary; defect identification review type summary (what & how) by document section or UML model; current defect injection & removal matrix; quality management strategy; technical debt
- (20) Things done right; issues to address (instructional staff)
Do not forget your slide numbers. Each chart MUST have information specific to your project.
Specifics for DEN students: team's strong and weak points
List at least one item for each of the following:
- Your team's strong points (operational view; technical view)
- Your team's weak points
- Specific technical concerns and possible solutions
- Operational risks and possible mitigations
Sources of observations: team activities, package evaluation, WinWin negotiation, etc.
QFP – Defect Identification Review
Traceability matrix: for each document section, UML model, etc., identify the type of review you used (peer review, agile artifact review, etc.) and any other forms of defect identification, e.g., grading, client feedback.
Current defect injection and removal matrix: current, total defect information from your progress report.
Briefly explain the techniques and tools your team is using for quality management and configuration management. Are they useful? Any improvements?
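One way to keep the injection and removal matrix current is a simple tally by phase. This is only an illustrative sketch; the phase names and the `record_defect` helper below are hypothetical, not part of the course templates.

```python
# Sketch of a defect injection & removal matrix as a nested dict:
# rows = phase where the defect was injected,
# columns = phase where it was removed.
# Phase names below are hypothetical placeholders.
phases = ["OCD", "Requirements", "Architecture", "Code"]
matrix = {inj: {rem: 0 for rem in phases} for inj in phases}

def record_defect(injected_in, removed_in):
    """Tally one defect found in review, grading, or client feedback."""
    matrix[injected_in][removed_in] += 1

# Example tallies from a (hypothetical) progress report:
record_defect("Requirements", "Architecture")
record_defect("Requirements", "Code")
record_defect("OCD", "Requirements")

# Totals per injection phase, for the summary row of the matrix:
injected_totals = {p: sum(matrix[p].values()) for p in phases}
```

Keeping the tally per review type as well (peer review, client feedback, grading) makes the "what & how" summary fall out of the same data.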
ARB Session Outline: DCR
Similar format to FCR, different focus:
- Less time for OCD, Prototype
- More time for Architecture, Plans
- Details TBD based on FCR ARB experience
General rule on focus: emphasize your project's high-risk areas.
- At FCR (in most cases) this will involve establishing the operational concept (including system analysis)
- At DCR (in most cases) this will involve the system design and development plan (especially schedule)
ARB Packages
Bring at least 2 copies of your ARB presentation (2 or 4 slides per page). Post your presentation and artifacts on your team website > Valuation phase before your ARB, for off-campus (DEN) students and instructional staff.
Demos in ARB
For teams doing a live demo in the ARB meeting, please include screenshots of your demo in your presentation, so that your IV&Vers can see the demo if the video connection is a problem and reviewers can make notes.
Skype/Google Hangout
Off-campus students who cannot attend the ARB in person will connect through Skype or Google+. If your session is conducted during class time, you may use WebEx. Please finish setting up before the session starts.
ARB timeslots
Days: Monday Oct 15, Wednesday Oct 17, Friday Oct 19
Rooms: SAL 322, OHE 122
Slots: 12:30–1:50pm, 2:00–3:20pm, 3:30–4:50pm, 4:00–5:20pm, 5:00–6:20pm, 5:20–6:40pm
Make a reservation at https://doodle.com/poll/qmmrm6innmppmzer
Clients and all team members, including off-campus students, must be available to attend.
Past FCR Experiences and General Guidelines
Outline Previous FCR ARB Feedback Summary Examples of Good and Bad Practices seen at ARBs ARB Chartsmanship & Presentation
Overall FCR Feedback
Generally done well: presentations, time management, client rapport.
- Reconcile FCR content with the ARB success criteria
- When asked a question, give the answer briefly; this helps your time management and gives the review board the desired information
- Listen carefully; one speaker at a time
- Many teams had poor time management, indicating that presentations had NOT been practiced
- Point at the projected image, not your laptop screen (even better, over WebEx, use the mouse)
- Very occasionally, slides added NO value
OCD Feedback (1)
Generally done well: organizational goals, operational concept, system boundary and organizational environment.
Some benefits chains need rework:
- Add missing stakeholders: users, clients, developers, IIV&Vers, database administrators, maintainers, interoperators, suppliers
- Assumptions are about the environment, not about outcomes
- Watch for involvement/use of the system before the system is built
- Some organization goals are Benefits Chain end outcomes
System boundary diagram:
- If you are using a component in/for your system, remove it from the environment, e.g., PHP, the .NET framework
OCD Feedback (2)
Organization goals:
- Are Benefits Chain end outcomes (or maybe a subset)
- Are NOT project initiative contributions
Identify levels of service properly:
- 100% availability or 100% reliability is not feasible!
- Make sure you can measure LOS goals
Prototypes and the system are NOT the same (usually).
Business workflow:
- Use an activity-type diagram
- Illustrate business activities, not technical/system activity
- May not even "see" the system explicitly
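To see why "100% availability" is not a feasible, measurable LOS goal, it helps to convert an availability percentage into a yearly downtime budget. The function below is an illustrative sketch (the name is my own), not a course artifact.

```python
# Convert an availability fraction into allowed downtime hours per year.
# 100% availability would demand zero downtime forever, which no real
# deployment can demonstrate; a stated percentage gives a testable budget.
def downtime_hours_per_year(availability):
    """availability: fraction in [0, 1], e.g. 0.999 for 99.9%."""
    return (1 - availability) * 365 * 24

# 99.9% availability allows about 8.76 hours of downtime per year:
print(round(downtime_hours_per_year(0.999), 2))
```

Stating the LOS goal as a downtime budget with exceptions (planned maintenance windows, etc.) makes it measurable, which is exactly what the feedback asks for.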
Prototype Feedback
Generally done well: GUI prototypes, good understanding of client's needs.
- Prototype all high-risk elements, not just GUIs: COTS interoperability, performance, scalability
- Use user/client-friendly terms: "John Doe, 22 Elm St.", not generic substitutions like "Name1, Addr1"
- Use the prototype as an opportunity to gather more information and/or examples
- Identify end users and try to get feedback from them
- Focus on important, high-priority requirements (initially)
Requirements Feedback
Generally done well: project and capability requirements, OCD-requirements traceability.
- Prioritize all the requirements
- Propagate LOS goals from the OCD into the SSRD, or drop LOS requirements from the win conditions (and SSAD)
- Distinguish between client-imposed requirements and developer-chosen solutions (SSAD)
- Make sure all requirements are testable: qualify "24/7 availability" with exceptions
- Update new requirements in the WinBook tool
- There is no such thing as an "implicit requirement"
SSAD Feedback
Generally done well: overall views.
- Follow UML conventions (arrows, annotations, etc.), e.g., generalization of actors
- Mistakes seen in use-case diagrams: two actors on one use case (means BOTH must be present); arrow direction for <<extends>> or <<includes>>
- The devil is in the details; simple is best
- Only two teams had an adequate start on the Information & Artifacts Diagram
- Read the exit criteria for the milestone carefully
LCP Feedback - 1
Generally done well: overall strategy, roles and responsibilities.
- Too many 577b TBDs
- Identify required skills for NN new team member(s) (577b; if needed to meet "team size")
- Concentrate on your future plan, not the past
- Provide a full Foundations phase plan; don't plan ONLY for documentation: include modeling, prototyping, coding, and an executable architecture
LCP Feedback - 2
COCOMO drivers:
- Often differ per module (type)
- PMAT rationale was often wrong: CS577 projects' process maturity should be between levels 2 and 3
- Some driver rationales were "ridiculous"
Add DEN student interactions (IIV&V, system/project engineer) to the Gantt chart.
Add the maintainer's responsibilities.
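Since driver rationales were a recurring problem, here is a minimal sketch of the COCOMO II effort equation the drivers feed into, to show why a wrong PMAT rating matters. The constants are the published COCOMO II.2000 calibration values; the function name is my own.

```python
# COCOMO II effort equation (post-architecture model):
#   PM = A * Size^E * product(EM_i),  with  E = B + 0.01 * sum(SF_j)
# A and B are the COCOMO II.2000 calibration constants.
A = 2.94   # multiplicative calibration constant
B = 0.91   # base of the size exponent

def effort_pm(ksloc, scale_factors, effort_multipliers):
    """Estimated effort in person-months.

    ksloc: size in thousands of source lines of code.
    scale_factors: the 5 scale factor ratings (PMAT is one of them);
        they enter the *exponent*, so they compound with size.
    effort_multipliers: cost driver ratings (nominal = 1.0).
    """
    E = B + 0.01 * sum(scale_factors)
    pm = A * ksloc ** E
    for em in effort_multipliers:
        pm *= em
    return pm
```

Because PMAT sits in the exponent, misrating it skews the estimate more as size grows; that is why a "ridiculous" PMAT rationale is worth catching at the ARB.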
FED Feedback
Generally done well: business case framework, risk analysis.
- Specify LOS feasibility plans
- Include training, operations, maintenance, and opportunity costs/effort; few had developer hours as a cost (which is correct)
- Try to quantify benefits; show return on investment
- Change ROI to reflect ongoing costs (and possibly savings)
- Distinguish one-time from annual costs in the business case
- Benefits start in mid-2017 (use 6-month granularity); costs start mid-2016
- Elaborate the process rationale
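As a sketch of how one-time and annual costs enter ROI, here is a minimal calculation with hypothetical placeholder figures; the function and numbers are illustrative only, not from the course materials.

```python
# ROI with one-time (development) vs annual (operations/maintenance)
# costs kept separate, as the business-case feedback asks.
def roi(one_time_cost, annual_cost, annual_benefit, years):
    """Cumulative ROI = (total benefits - total costs) / total costs."""
    total_cost = one_time_cost + annual_cost * years
    total_benefit = annual_benefit * years
    return (total_benefit - total_cost) / total_cost

# Hypothetical example: $40k development, $5k/yr maintenance,
# $25k/yr benefit, over a 3-year horizon:
print(round(roi(40_000, 5_000, 25_000, 3), 2))
```

Folding the annual costs into the denominator is what "change ROI to reflect ongoing costs" means: an estimate that counts only the one-time development cost overstates the return.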
QFP Feedback
Generally done well.
- Some were missing the traceability or injection-removal matrix
- Some seemed to try to "snow us with data" rather than present a quick summary
Things to Improve
Presentation and communication skills:
- One wrong word could lead to a billion-dollar loss
- Practice in front of others; be concise and precise
Consistency across artifacts: teamwork, not merely integrated individual work.
Prepare your client:
- Tell them what an ARB is (use the agenda and success criteria)
- Tell them what to expect from the ARB
Time management:
- Get in and set up ASAP
- Have documents and your client present