1
Mastergoal Machine Learning Environment Phase III Presentation Alejandro Alliana CIS895 MSE Project – KSU
2
MMLE Project Overview Provide an environment to create, repeat, and save experiments for developing strategies for playing Mastergoal using machine learning (ML) techniques. Divided into 4 sub-projects: Mastergoal (MG), Mastergoal AI (MGAI), Mastergoal Machine Learning (MMLE), User Interface (UI).
3
Phase III Artifacts User Documentation Component Design Source Code (and exe) Assessment Evaluation Project Evaluation References
4
User Documentation MMLE UI: user manual provided, covering installation, common use cases, data field types and formats, and examples. MG, MGAI, and MMLE libraries: API documentation generated by Doxygen.
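As a hedged illustration of the kind of Doxygen comments such API documentation is generated from (the class and member names below are hypothetical, not taken from the actual MMLE source):

// strategy.h -- hypothetical sketch of a Doxygen-documented MMLE header.

class Board;  // defined elsewhere in the MG library

/// \brief Abstract base class for a Mastergoal playing strategy.
class Strategy {
public:
    virtual ~Strategy() {}

    /// \brief Evaluates a board position.
    /// \param board The position to evaluate.
    /// \return A score; higher values favor the player to move.
    virtual double Evaluate(const Board& board) const = 0;
};

Doxygen renders the \brief, \param, and \return tags into the online API pages mentioned above.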
5
Component Design Project divided into 4 main sub-projects. The UI uses MMLE, MGAI, and MG as libraries, as sketched below.
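A minimal compilable sketch of that layering, using stand-in class names (the slides do not name the actual API):

// All class names below are stand-ins for illustration only.
class Board {};                      // MG: game rules and board state
class Agent {};                      // MGAI: search algorithms and agents
class Trainer {                      // MMLE: drives learning experiments
public:
    void Run(Board&, Agent&) {}      // placeholder experiment driver
};

int main() {
    Board board;
    Agent agent;
    Trainer trainer;
    trainer.Run(board, agent);       // UI-level code only composes the three libraries
    return 0;
}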
6
Component Design – Design Patterns Used
Factory Method: Board, Search Algorithm.
Prototype: Strategy.
Singleton: All factories, Terms, Fitness Functions, Selection Criteria, Termination Criteria.
Template Method: Terms, Strategy.
Strategy: Search Algorithms – Agents, Strategy – Search Algorithms.
Observer: GameSubject – GameListener, TrainSubject – TrainListener (sketched below).
Proxy: TrainBridge, UIAgentProxy.
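A minimal sketch of the Observer pair named above; only the GameSubject and GameListener names come from the slide, the member names are assumptions:

#include <vector>
#include <algorithm>

class GameListener {
public:
    virtual ~GameListener() {}
    virtual void OnMovePlayed() = 0;   // hypothetical notification hook
};

class GameSubject {
public:
    void Attach(GameListener* l) { listeners_.push_back(l); }
    void Detach(GameListener* l) {
        listeners_.erase(std::remove(listeners_.begin(), listeners_.end(), l),
                         listeners_.end());
    }
protected:
    void Notify() {                    // called after the game state changes
        for (size_t i = 0; i < listeners_.size(); ++i)
            listeners_[i]->OnMovePlayed();
    }
private:
    std::vector<GameListener*> listeners_;
};

The TrainSubject – TrainListener pair would follow the same shape for training events.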
7
Component Design Deployment Diagrams show packages and files.
8
Component Design Deployment diagrams, class diagrams, sequence diagrams, and object diagrams. Short descriptions of classes and links to the online API documentation.
9
Source Code Kept in an SVN repository (7 projects): 4 sub-projects and 3 test sub-projects. Metrics were taken weekly and are discussed later in the presentation.
10
Installer and executable. Installer created with NSIS (Nullsoft Scriptable Install System). UI created with the Windows Forms GUI API available in the .NET Framework. All other sub-projects are coded in (unmanaged) C++ and are available as libraries.
11
MMLE Demonstration.
12
Assessment Evaluation I used the CPPUnit framework to perform unit testing on the projects: MastergoalTest, MastergoalAiTest, Mmle-test. Assertions were used to test for pre- and post-conditions. I used the Visual Leak Detector system to detect memory leaks.
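A minimal sketch of a CPPUnit fixture of the kind these test projects would contain; the Board class and its assertion are hypothetical, and including vld.h is the standard way to enable Visual Leak Detector in a debug build:

#include <cppunit/extensions/HelperMacros.h>
// #include <vld.h>  // Visual Leak Detector: include in debug builds to report leaks

class Board {                        // hypothetical stand-in for the MG class under test
public:
    bool IsBallAtCenter() const { return true; }
};

class BoardTest : public CppUnit::TestFixture {
    CPPUNIT_TEST_SUITE(BoardTest);
    CPPUNIT_TEST(testInitialPosition);
    CPPUNIT_TEST_SUITE_END();
public:
    void setUp() { board_ = new Board(); }   // precondition: fresh board per test
    void tearDown() { delete board_; }       // avoids leaks that VLD would flag

    void testInitialPosition() {
        // Assumed postcondition of construction: the ball starts at the center.
        CPPUNIT_ASSERT(board_->IsBallAtCenter());
    }
private:
    Board* board_;
};

CPPUNIT_TEST_SUITE_REGISTRATION(BoardTest);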
13
Assessment Evaluation Test Plan All tests passed* CPPUnit Regression Bugs. Coding of test cases. Documenting and debugging test cases. Memory Leak Bugs.
14
Assessment Evaluation – Defect Log.
15
Size of the test projects. Overall, the three test projects total 1,125 lines of code.
16
Assessment Evaluation - Metrics
18
Project Evaluation
20
Metrics 533 hours (13.3 weeks, or 3 months of effort over a period of 10 months) and 11 KLOC. Estimations: FP: time 10.79 months, 2.79 KLOC. COCOMO: time 9.24 months, 7.5 KLOC. COCOMO II: time 9.54 months, 7.5 KLOC.
21
Project Evaluation - FP Real: 11 KLOC and 3 months. Function Point estimates: size 2.79 KLOC, time 10.79 months. Reasons for the gap: lack of experience using FP; some of the user interfaces were more complex than previously thought; no .NET conversion rates were available; a big part of the project is the user interface, which contains automatically generated code; algorithms are not well represented by FP.
22
Project Evaluation - COCOMO Real: 11 KLOC and 3 months. COCOMO estimates: size 7.5 KLOC, time 9.25 months (the size estimate was arbitrary, based on experience). Sources of error: inexperience in C++/.NET; conversion rates of the languages. COCOMO II estimates: Application Composition model, 5.57 person-months (Object Points / Productivity); Post-Architecture model, 9.54 months (7.5 KLOC).
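For reference, basic COCOMO has the shape shown in this sketch; the organic-mode coefficients are the textbook defaults, and the slides do not say which mode or effort multipliers were actually applied, so the numbers differ from the 9.25-month figure above:

#include <cmath>
#include <cstdio>

int main() {
    // Basic COCOMO, organic mode (textbook defaults: a=2.4, b=1.05, c=2.5, d=0.38).
    double kloc = 7.5;                               // estimated size from the slide
    double effort = 2.4 * std::pow(kloc, 1.05);      // person-months, ~19.9
    double schedule = 2.5 * std::pow(effort, 0.38);  // calendar months, ~7.8
    std::printf("effort = %.1f PM, schedule = %.1f months\n", effort, schedule);
    return 0;
}

Effort multipliers for factors such as the C++/.NET inexperience noted above would scale the effort term and move the schedule toward the reported estimate.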
23
Project Evaluation - Time spent at each phase.
24
Project Evaluation
25
Project Evaluation - Lessons Learned Implementation: C++ language, memory management, implementation of design patterns. Tools and libraries (NSIS, CPPUnit, VLD, Doxygen). Design: design patterns.
26
Project Evaluation - Lessons Learned Experience with various estimation models. Measurement tools (CCCC, Process Dashboard). Testing: CPPUnit framework, VLD. Process: iterative process, artifacts.
27
Project Evaluation - Future work Improve the performance of the search algorithm and add new algorithms. Add more functionality to the game-playing library and UI. Add more selection mechanisms to the GA experiments. Add more learning algorithms. Distributed computation to speed up training. Refactoring of some classes. Add test classes for each feature.
28
Project Evaluation Results
31
Tools Used MS Visual Studio 2005 BoUml Rational Software Architect NSIS (Nullsoft Scriptable Install System) CCCC (C and C++ Code Counter) TinyXML Visual Leak Detector Doxygen Process Dashboard TortoiseSVN
32
References
Gamma, Erich; Richard Helm; Ralph Johnson; John Vlissides (1995). Design Patterns: Elements of Reusable Object-Oriented Software. Addison-Wesley. ISBN 0-201-63361-2.
Mitchell, Tom (1997). Machine Learning. McGraw Hill. ISBN 0-07-042807-7.
BoUML http://bouml.free.fr/
Rational Software Architect http://www-306.ibm.com/software/awdtools/architect/swarchitect/
NSIS (Nullsoft Scriptable Install System) http://nsis.sourceforge.net/Main_Page
CCCC (C and C++ Code Counter) http://sourceforge.net/projects/cccc
CPPUnit http://cppunit.sourceforge.net/doc/lastest/cppunit_cookbook.html
TinyXML http://www.grinninglizard.com/tinyxml/ (project page: http://sourceforge.net/projects/tinyxml/)
Visual Leak Detector http://dmoulding.googlepages.com/vld
Doxygen http://www.stack.nl/~dimitri/doxygen/
Process Dashboard http://processdash.sourceforge.net/
TortoiseSVN http://tortoisesvn.tigris.org/