T-76.4115 Iteration Demo
Software Trickery
I2 Iteration, 5.3.2008
Agenda for presentation
- Project status (5 min)
  - Achieving the goals of the project
  - Project metrics
- Work results (25 min)
  - Brief overview of the system
  - Demo of TMS
- Quality assurance (10 min)
  - Continuous integration
  - Automated unit and acceptance tests
  - Agile methods and practices used
Introduction to the project: Tournament Management System
- Module for Party Management System v2.0 (PMS)
  - Modular web-based system for managing party events
  - Used e.g. in Assembly events
- Tournament Management System module (TMS)
  - Replaces the existing (non-PMS-module) system
  - A solution for managing game tournaments
  - Main user groups: administrators, tournament players, outside spectators
  - Will first be used at the Winter Assembly 2008 gaming festival
    - Estimated 1500 users at this event
Status of the project's goals
- Goal 1: All stakeholders satisfied with the course outcome
  - OK; the final points will tell
- Goal 2: Customer is satisfied with the product
  - OK; TMS will be used at Winter Assembly 2008
- Goal 3: Project organization works smoothly
  - OK; everyone is still happy
- Goal 4: Everyone reaches their personal learning goals on this course
  - OK
- Goal 5: Winning the quality award with a superior-quality product
  - OK; looks promising
- Goal 6: Creating interest in the Assembly organization among the group
  - OK; at least one developer is going to WA 2008 as an organizer
Status of the iteration's deliverables
- Project plan, QA plan, requirements document and technical specification
  - OK, updated
- Test cases, QA report and test logs
  - OK
- Software and online user manual
  - OK; known defects are listed in the customer's Trac
- Final report
  - OK
- T-76.5158: SEPA diaries
  - OK
Resource usage
- DevDays helped to utilize resources
- OK: the 5% error goal set for the iteration was reached

Original plan (at the beginning of the iteration), hours:

         PM   QM   SA  MaG  MiA  ToA  JaL  MaH  ErH   SUM
  PP     55   70   14   10    4    5    6   11    9   184
  I1     43   67   78   76   33   60   24   52   50   483
  I2     52   53   58  104   88   85  110   87   91   728
  Total 150  190  150  190  125  150    …    …    …  1405

Realization and updated plan (realized hours):

         PM   QM   SA  MaG  MiA  ToA  JaL  MaH  ErH   SUM
  PP     55   70   14   10    4    5    6   11    9   184
  I1     43   67   78   76   33   60   24   52   50   483
  I2     51   62   57  107   80   83  107   88   90   728
  Total 149  199  149  193  117  148  147  151  149  1402
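The 5% goal can be checked directly from the totals on the slide. A minimal sketch, using the planned total of 1405 h and the realized total of 1402 h (variable names are illustrative, not from the project's code):

```python
# Checking the iteration's effort-estimation error against the 5% goal.
# Figures are the SUM totals from the slide's tables.
planned_total = 1405   # hours planned at the start of I2
realized_total = 1402  # hours actually logged by the end of I2

error_pct = abs(realized_total - planned_total) / planned_total * 100
print(f"estimation error: {error_pct:.2f}%")  # well under the 5% goal
```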
Results of the iteration I2
- New major features developed during I2
  - Phase types: double-elimination, round-robin, single-result
  - Clan management
  - Seeding by rank
  - Online admin user's manual
  - New user interface, tested by the peer group
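Of the new phase types, round-robin is the easiest to illustrate: every participant meets every other exactly once. Below is a hedged sketch using the classic circle method; this is not TMS code, and the function name is hypothetical:

```python
# Sketch of round-robin pairing via the "circle method": one seat is fixed
# and the rest rotate each round, so every pair meets exactly once.
def round_robin_rounds(players):
    """Yield one list of (a, b) pairings per round; None marks a bye."""
    field = list(players)
    if len(field) % 2:
        field.append(None)  # odd field: add a bye slot
    n = len(field)
    for _ in range(n - 1):
        yield [(field[i], field[n - 1 - i]) for i in range(n // 2)]
        # rotate everyone except the first seat
        field = [field[0], field[-1]] + field[1:-1]

for rnd, pairs in enumerate(round_robin_rounds(["A", "B", "C", "D"]), 1):
    print(rnd, pairs)
```

With 4 players this produces 3 rounds of 2 matches each, covering all 6 possible pairings.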
System overview

PMS permissions

TMS access control

Sample tournament
Demo
- The actual software developed
- You may use the demo script to follow the presentation
Quality Assurance (1/3): Quality dashboard
- Administrative functionality (quality/confidence: 3)
  - Thoroughly tested by acceptance-level test suites and exploratory testing. Some minor usability issues remain, but the overall status is very good.
- Participant functionality (quality/confidence: 3)
  - Thoroughly tested by acceptance-level test suites and exploratory testing.
- Tournament phases and progression (quality/confidence: 2)
  - The tournament progression and phases contain a huge number of special cases. These could have been tested more thoroughly.
- Spectator functionality (quality/confidence: 3)
  - Thoroughly tested by acceptance-level test suites and exploratory testing.
Quality Assurance (2/3): Quality goals
- QG01: Functionality
  - Almost everything planned for I2 was implemented, although some of the less important nice-to-have features had to be dropped.
- QG02: Usability
  - Test results from the customer and our peer group indicate that the system is easy to use. There are some issues because the amount of functionality, especially in the admin part of the software, is very high. Nonetheless, the overall status of usability is good.
- QG03: Code correctness
  - We have extensive suites of automated unit and acceptance tests for all the important features.
- QG04: Security
  - We have automated unit tests and acceptance-level tests for the most important features. The PMS platform acts as a safety net.
- QG05: Maintainability
  - The code is not as clear and well commented as it could be.
Quality Assurance (3/3): CruiseControl and automated testing
- Automated unit tests (NUnit)
  - 45 test cases concentrating on the most important modules
- Automated acceptance tests (Selenium)
  - Thorough suite of 111 test cases
- Automated performance tests (JMeter)
- Code metrics and analysis
- Linking between failed acceptance tests and defect reports
- See http://pyppe.iki.fi:8880/ccnet/

Build statistics:
- Number of builds: 428
- Number of failed builds: 37 (84% of which were caused by failing unit tests)
- Build success rate: 91%
- Average build fix time: 80 minutes
- Median build fix time: 62 minutes
- Maximum build fix time: 5 hours 24 minutes
- Minimum build fix time: 4 minutes
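The headline CI figures above are easy to re-derive from the raw build counts. A small sketch, using the numbers from the slide (variable names are illustrative):

```python
# Re-deriving the CI summary figures reported on the slide.
total_builds = 428
failed_builds = 37

success_rate = (total_builds - failed_builds) / total_builds * 100
print(f"build success rate: {success_rate:.0f}%")  # ~91%, as reported

# 84% of the failed builds were caused by failing unit tests
unit_test_failures = round(failed_builds * 0.84)
print(f"builds broken by unit tests: ~{unit_test_failures}")
```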
Changes to the project
- Moved to regular development days in I2
  - The project is no longer so distributed
- Requirement for a new UI in sprint S2.2
  - More important to the customer than new functionality
Agile methods: used work practices
- Regular development days
  - Held 2 or 3 times a week
- Weekly status meetings
  - Held every Tuesday, with the customer present
- Version control and documentation on the customer's server
  - Helps the customer find the software and documentation
- Demo environment for external stakeholders
  - Running 24/7 since I1
- Automated testing
  - Unit and acceptance level
- The Carrots of Agility
Any questions?
Software Trickery would like to thank everyone!