MSE Presentation 3
By Padmaja Havaldar, Graduate Student
Under the guidance of Dr. Daniel Andresen (Major Advisor), Dr. Scott Deloach (Committee Member), and Dr. William Hankley (Committee Member)
Introduction
Overview
Revised Artifacts
Testing Evaluation
Project Evaluation
Problems Encountered
Lessons Learned
User Manual
Conclusion
Demonstration
Overview
Objective: to develop a web-based Statistical Analysis Tool (SAT) based on the statistics alumni information. Four types of tests were used for the analysis: regression analysis, correlation analysis, hypothesis testing, and the chi-square test.
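As a taste of the arithmetic behind one of these tests, here is a minimal Java sketch of the Pearson correlation coefficient; the class, method, and sample data are hypothetical illustrations, not the project's actual code.

```java
/** Minimal sketch of a correlation computation (hypothetical helper,
 *  not the project's actual session-bean code). */
public final class CorrelationSketch {

    /** Pearson correlation coefficient r for paired samples x and y. */
    public static double pearson(double[] x, double[] y) {
        if (x.length != y.length || x.length < 2) {
            throw new IllegalArgumentException("need paired samples, n >= 2");
        }
        int n = x.length;
        double sumX = 0, sumY = 0, sumXY = 0, sumX2 = 0, sumY2 = 0;
        for (int i = 0; i < n; i++) {
            sumX  += x[i];
            sumY  += y[i];
            sumXY += x[i] * y[i];
            sumX2 += x[i] * x[i];
            sumY2 += y[i] * y[i];
        }
        double cov  = n * sumXY - sumX * sumY;
        double varX = n * sumX2 - sumX * sumX;
        double varY = n * sumY2 - sumY * sumY;
        return cov / Math.sqrt(varX * varY);
    }

    public static void main(String[] args) {
        double[] gpa    = {3.2, 3.8, 2.9, 3.5};   // made-up data
        double[] salary = {52, 68, 45, 60};        // in $1000s, made-up data
        System.out.println("r = " + pearson(gpa, salary));
    }
}
```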
Revised Artifacts: Object Model
Revised Artifacts: Formal Requirement Specification
The USE model was refined with the changes suggested during Presentation 2.
Components
J2EE Application Server
Enterprise JavaBeans
Java Servlets
XML
HTML
Java Applets
Component Design: Servlet
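A minimal sketch of what the servlet tier might look like, assuming a hypothetical AnalysisServlet that receives the form input and delegates to the business tier; the class and parameter names are illustrative, not the project's actual code.

```java
import java.io.IOException;
import java.io.PrintWriter;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

/** Hypothetical servlet: reads the selected test from the form,
 *  delegates the computation, and renders the result page. */
public class AnalysisServlet extends HttpServlet {
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        String test = req.getParameter("test");   // e.g. "regression"
        // In the real design this call would go through a session bean.
        String result = runAnalysis(test);
        resp.setContentType("text/html");
        PrintWriter out = resp.getWriter();
        out.println("<html><body><h1>" + test + " result</h1>");
        out.println("<p>" + result + "</p></body></html>");
    }

    private String runAnalysis(String test) {
        return "placeholder result for " + test;   // stand-in for the bean call
    }
}
```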
Component Design: Entity Bean
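A minimal EJB 2.x-style sketch of an entity bean's client view, assuming a hypothetical Alumni entity; the interfaces and field names are assumptions, not the project's actual object model.

```java
import java.util.Collection;
import javax.ejb.EJBLocalObject;
import javax.ejb.EJBLocalHome;
import javax.ejb.FinderException;

/** Hypothetical local client view of an alumni-record entity bean. */
interface AlumniLocal extends EJBLocalObject {
    String getDegree();      // e.g. "MS" or "PhD"
    double getSalary();
    boolean isCitizen();
}

/** Hypothetical local home interface used to look up alumni records. */
interface AlumniLocalHome extends EJBLocalHome {
    AlumniLocal findByPrimaryKey(String id) throws FinderException;
    Collection findByDegree(String degree) throws FinderException;
}
```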
Component Design: Session Beans
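A minimal sketch of a stateless session bean that would hold the analysis logic, with a simple mean as a stand-in for the statistical computations; the bean and method names are assumptions, not the project's code.

```java
import javax.ejb.SessionBean;
import javax.ejb.SessionContext;

/** Hypothetical stateless session bean carrying the analysis logic. */
public class AnalysisBean implements SessionBean {

    /** Business method: mean of the sample, standing in for the
     *  regression/correlation/hypothesis/chi-square computations. */
    public double mean(double[] sample) {
        double sum = 0;
        for (int i = 0; i < sample.length; i++) sum += sample[i];
        return sum / sample.length;
    }

    // EJB 2.x lifecycle callbacks (no state to manage here).
    public void ejbCreate() {}
    public void ejbRemove() {}
    public void ejbActivate() {}
    public void ejbPassivate() {}
    public void setSessionContext(SessionContext ctx) {}
}
```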
Testing Evaluation: Registration Form
All inputs to the form fields were tested.
Functionality of the tests: each statistical test was exercised with test cases, and the tool's output was checked against Excel's output. Some of the test cases are listed below (one is automated in the sketch after this list).
Regression test: fewer than 3 members; no MS members; no PhD members
Chi-square test: no citizens; no international students; no person with a job within 3 months of graduation; no person without a job within 3 months of graduation
Hypothesis test: no MS alumni; no PhD alumni
Correlation: no members
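A sketch of how one such edge case could be automated, reusing the hypothetical CorrelationSketch helper from the overview; JUnit 3 style, not the project's actual test harness.

```java
import junit.framework.TestCase;

/** Hypothetical edge-case test mirroring "Correlation: no members". */
public class AnalysisEdgeCaseTest extends TestCase {

    public void testCorrelationWithNoMembersIsRejected() {
        try {
            CorrelationSketch.pearson(new double[0], new double[0]);
            fail("expected an error for an empty sample");
        } catch (IllegalArgumentException expected) {
            // the tool should show an error message instead of a result
        }
    }
}
```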
Testing Evaluation: Testing Using JMeter
A stress/performance test was conducted with JMeter, varying the number of simultaneous users accessing the site, and the results were plotted as graphs.
Throughput depends on many factors, such as network bandwidth, network congestion, and the amount of data passed.
Deviation measures how far individual response times stray from the average; for good results it should be as small as possible.
Average is the mean time required to access the questions page.
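To make the three metrics concrete, here is a small sketch that recomputes them from raw response times; the numbers are made up, not actual JMeter output, and JMeter's reported Deviation is taken to be the standard deviation.

```java
/** Hypothetical recomputation of JMeter-style summary metrics. */
public class MetricsSketch {
    public static void main(String[] args) {
        // Made-up response times for one run, in milliseconds.
        long[] times = {650, 700, 720, 680, 790};
        long testMillis = 4000;   // made-up wall-clock span of the run

        double avg = 0;
        for (int i = 0; i < times.length; i++) avg += times[i];
        avg /= times.length;

        // "Deviation": standard deviation of the response times.
        double var = 0;
        for (int i = 0; i < times.length; i++) {
            var += (times[i] - avg) * (times[i] - avg);
        }
        double deviation = Math.sqrt(var / times.length);

        // Throughput: completed requests per minute of test time.
        double perMin = times.length / (testMillis / 60000.0);

        System.out.println("average ms     = " + avg);
        System.out.println("deviation ms   = " + deviation);
        System.out.println("throughput/min = " + perMin);
    }
}
```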
Testing Evaluation
The values appear high because the data is passed to the bean and many calculations are performed on it. The servlet then uses the result to display the graphs as applets, along with some tabular representations.
Testing Evaluation: JMeter Results

             10 users/s (optimal)   30 users/s (average)   45 users/s (worst)
Deviation    248 ms                 559 ms                 1542 ms
Throughput   755/min                981/min                824/min
Average      709 ms                 1619 ms                2998 ms

After careful consideration, it was judged close to impossible for more than 30 simultaneous users to hit the site with no lag between them, so tests were run at the three load levels above. The response times are higher than for plain text web sites: performance is best with few simultaneous users and deteriorates as the number of users grows.
Testing Evaluation: Testing Using Microsoft Application Center Test
Test type: Dynamic
Test duration: 00:00:05:00
Test iterations: 227
Total number of requests: 4,093
Total number of connections: 4,093
Average requests per second: 13.64
Average time to last byte (msecs): 72.39
Number of unique requests made in test: 12
Number of unique response codes: 2
Error counts: DNS: 0, Socket: 0
Average bandwidth (bytes/sec): 134,048.33
Number of bytes sent (bytes): 1,434,357
Number of bytes received (bytes): 38,780,141
Average rate of sent bytes (bytes/sec): 4,781.19
Average rate of received bytes (bytes/sec): 129,267.14
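The averaged figures follow directly from the totals over the 300-second run; a quick arithmetic check (the class and variable names here are mine, not ACT's):

```java
/** Recomputing the averaged ACT figures from the reported totals. */
public class ActSanityCheck {
    public static void main(String[] args) {
        int durationSec = 5 * 60;         // test duration 00:00:05:00
        long requests   = 4093;
        long bytesSent  = 1434357;
        long bytesRecv  = 38780141;

        System.out.println("req/s  = " + (double) requests / durationSec);   // ~13.64
        System.out.println("send/s = " + (double) bytesSent / durationSec);  // ~4,781.19
        System.out.println("recv/s = " + (double) bytesRecv / durationSec);  // ~129,267.14
        System.out.println("bw/s   = " + (double) (bytesSent + bytesRecv) / durationSec); // ~134,048.33
    }
}
```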
Testing Evaluation: Scalability, Portability, Robustness
Scalability
Database: The Oracle database is highly scalable. The number of users does not affect its performance, since the database holds only one table of users, which is used to retrieve user records.
Application: Tests with 200 simultaneous users also gave reasonable results: the average time for each user to access the questions page was 5 seconds, with a deviation of 2 seconds.
Portability
Since the J2EE architecture is based on the Java framework, the application can be used across many enterprise platforms.
Robustness
Client-side scripting and error checking in the middle tier make the application robust against invalid data. The application went through many iterations of unit testing, and the worst-case JMeter tests also gave reasonable results.
Formal Technical Inspection
The SRS was inspected by Laksmikanth Ghanti and Divyajyana Nanjaiah. The inspection results rated the SRS 99% satisfactory. The minor issues were corrected by adding a section on anticipated future changes in version 2.0 of the SRS and by making provision for additional error messages.
User Manual
An installation guide and a detailed walkthrough of the project are provided in the user manual.
Project Evaluation: Project Duration

Phase       Start Time   Finish Time (Expected)   Finish Time (Actual)
Phase I     03/15/03     06/30/03
Phase II    07/01/03     7/28/                    /27/03
Phase III   10/28/03     27/11/                   /09/03
Project Evaluation: Lines of Code
Estimate in the first phase = 4636
Actual lines of code:
Entity Java Beans = 1869
Servlet = 1040
XML = 120
Total = 3029 lines of code
Problems Encountered
Learning curve: J2EE and the deploy tool
  Does not update files automatically
  Not well suited to unit testing or standard development practices
  EJB packaging errors
Alumni data
Lessons Learned
Methodology: the usefulness of methodologies
Reviews: the feedback during reviews was very helpful
Technology: the J2EE architecture and deploy tool
Conclusion
SAT was implemented using the J2EE architecture.
JMeter and Microsoft ACT were used to stress test the application, and the performance was found to be satisfactory.
The SAT is extensible.
Demonstration