LECTURE 20 – 26/11/15
Summary – Testing
◦ Testing affects all stages of the software engineering cycle
◦ One strategy is a bottom-up approach: class, integration, validation, and system-level testing
◦ XP advocates test-driven development: plan tests before you write any code, then test any changes
◦ Other techniques:
  ◦ white box (examine the internal technical details)
  ◦ black box (view the external behaviour)
  ◦ debugging (a systematic cause-elimination approach is best)
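The test-driven development idea above can be sketched in a few lines: the test is written first and pins down the external (black-box) behaviour before any implementation exists. The function name `apply_discount` and its rules are hypothetical, chosen only for illustration.

```python
def apply_discount(price, percent):
    """Return price reduced by the given percentage."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return price * (1 - percent / 100)

def test_apply_discount():
    # Black-box style: we check only the external behaviour.
    assert apply_discount(100, 20) == 80
    assert apply_discount(50, 0) == 50
    # Invalid input must be rejected, not silently accepted.
    try:
        apply_discount(100, 150)
    except ValueError:
        pass
    else:
        raise AssertionError("expected ValueError for invalid percent")

test_apply_discount()
```

In a real TDD cycle the test would be run (and fail) before `apply_discount` is written, then the minimal code is added to make it pass.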
Summary – Testing
◦ Testing is a balance of art, science, and luck
◦ Testing may be conducted for different reasons
◦ Quality assurance testing looks for potential problems in a proposed design
◦ Develop a test plan for locating and removing bugs
◦ A test plan offers a road map for testing activity; it should state the test objectives and how to meet them
◦ The plan need not be very large; in fact, devoting too much time to the plan can be counterproductive
◦ There are no magic tricks to debugging; however, with appropriate testing strategies, a sound test plan, and debugging tools, you can locate the errors in your system and fix them
Software Testing and QA
◦ Each time we reuse a class, we further test all its methods, uncovering some old errors
◦ Each reuse therefore also benefits the earlier programs that used the class
◦ The stability and trustworthiness of the class progressively improve
◦ Better traceability from analysis to design to coding: corrections to the requirements or design point us directly to the affected code, and vice versa
Software Quality
◦ In general, quality means customer satisfaction
◦ So the software must satisfy the users’ actual requirements
◦ For software, quality is often interpreted as the ability to conform to requirements
◦ …and to execute without failure
◦ Most software fails in its first production run
◦ Mission-critical software applications require extremely high levels of quality
Verification and Validation
◦ Verification – meeting functional and non-functional requirements without failure
  ◦ Use cases
  ◦ Robustness, speed, scalability, etc.
◦ Testing – identifying logical errors
◦ Debugging – finding the root cause of errors
◦ Validation (usability and user satisfaction) relates to:
  ◦ functionality coverage – priorities being met
  ◦ ease of use
  ◦ the perception of the user
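The verification bullets above can be made concrete with a small sketch: one check verifies a functional requirement (a use case produces the right result) and another verifies a non-functional robustness requirement (malformed input fails gracefully instead of crashing). The `parse_age` helper is a hypothetical example, not part of any real system.

```python
def parse_age(text):
    """Parse a user-entered age; return None for invalid input."""
    try:
        age = int(text)
    except ValueError:
        return None
    return age if 0 <= age <= 150 else None

# Functional verification: the use case "enter a valid age" works.
assert parse_age("42") == 42

# Robustness verification: bad input must not raise, just fail cleanly.
assert parse_age("forty-two") is None
assert parse_age("-5") is None
```

Validation, by contrast, cannot be automated this way: whether the form is easy to use and matches the users' priorities is judged by the users themselves.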
Errors and Debugging
Error Types
◦ Logic errors
◦ Language (syntax) errors
◦ Run-time errors
Debugging
◦ The first step in debugging is recognising that a bug exists
◦ Sometimes it is obvious: the first time you run the application, the bug shows itself
◦ Other bugs might not surface until a method receives a certain value, or until you take a closer look at the output
◦ Select appropriate testing strategies
◦ Develop test cases and a sound test plan
◦ Use debugging tools
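The three error types above can be illustrated in a short sketch (the function names are hypothetical). Note that only the logic error is silent: it is exactly the kind of bug that "might not surface until you take a closer look at the output".

```python
# 1. Syntax (language) error: caught before the program even runs, e.g.
#        if x > 0 print(x)      # missing colon -> SyntaxError

# 2. Run-time error: valid syntax, but fails during execution.
def average(values):
    return sum(values) / len(values)   # ZeroDivisionError on an empty list

# 3. Logic error: runs without crashing but gives the wrong answer.
def buggy_average(values):
    return sum(values) / (len(values) + 1)   # off-by-one in the divisor

try:
    average([])                    # the run-time error surfaces here
except ZeroDivisionError:
    print("run-time error caught")

print(buggy_average([2, 4, 6]))    # prints 3.0, but the true mean is 4.0
```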
Some Answers
◦ The user ID field accepts special characters.
◦ The confirm-password field does not show its content in encrypted (masked) mode.
◦ The name field does not seem to have any validation on the number of characters.
◦ The captcha is not readable at all.
◦ There is no way for the user to reload the captcha.
◦ The Register button should be at the bottom rather than at the side.
◦ The Register button’s label has “r” instead of “R”.
◦ There is no Cancel button available if the user wants to cancel the procedure.
◦ There is no Close button available if the user wants to close the page.
◦ The page title shows the wrong spelling of “Registration”.
◦ Password-selection guidelines should be provided (e.g., that the password should be alphanumeric), or a password-strength indicator should be present.
◦ The page title should be “New User Registration” rather than “New Registration”.
◦ Field lengths and labels should be consistent across the whole page/form.
◦ The country field should show “Select” by default rather than defaulting to a chosen value.
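Findings like the first one in the list above can be turned into automated test cases. The sketch below assumes a hypothetical validator, `is_valid_user_id`; the actual registration form's rules are not given in the exercise, so the accepted character set here is an illustrative choice.

```python
import re

def is_valid_user_id(user_id):
    """Accept only letters, digits, and underscores, 3-20 characters."""
    return re.fullmatch(r"[A-Za-z0-9_]{3,20}", user_id) is not None

# Test cases derived from the finding "the user ID field accepts
# special characters": these should fail against the buggy form.
assert is_valid_user_id("alice_01")        # well-formed ID accepted
assert not is_valid_user_id("bob!@#")      # special characters rejected
assert not is_valid_user_id("ab")          # too short
```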
IS2204 Module Overview
◦ Is the IS development project feasible?
◦ What should/does the system do?
◦ How should the system be designed to serve the needs of the user(s)?
◦ Does the system measure up? Are the users satisfied? Does the system do what it is supposed to do?
◦ How do we gather the requirements to inform system design?
◦ What are the alternative approaches to Information System development?
◦ How do we manage an IS development project efficiently and effectively?
Summary
◦ Several features of object-oriented languages and programs impact testing, from encapsulation and state-dependent structure to generics and exceptions
◦ These impacts are felt mainly at the unit and subsystem levels; the fundamental principles of testing still apply
◦ The basic approach still applies: techniques for each major issue (e.g., exception handling, generics, inheritance, ...) can be applied incrementally and independently
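One of the OO-specific issues named above, exception handling, can be tested independently of the others, as the summary suggests. The sketch below uses a hypothetical `Account` class; the test checks both that the exception is raised and that the object's state is unchanged afterwards (the state-dependent aspect of OO testing).

```python
class InsufficientFunds(Exception):
    pass

class Account:
    def __init__(self, balance=0):
        self.balance = balance

    def withdraw(self, amount):
        if amount > self.balance:
            raise InsufficientFunds(f"balance is {self.balance}")
        self.balance -= amount

def test_withdraw_raises_on_overdraft():
    acct = Account(balance=10)
    try:
        acct.withdraw(25)
    except InsufficientFunds:
        assert acct.balance == 10   # state unchanged after the exception
    else:
        raise AssertionError("expected InsufficientFunds")

test_withdraw_raises_on_overdraft()
```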
IS2204 Summary
◦ Alternatives to Systems Building Approaches
  ◦ SDLC
  ◦ Agile, XP and Scrum
◦ Requirements Gathering / Fact Finding
◦ Role of the Systems Analyst
◦ Cross-Lifecycle Activities
  ◦ Feasibility Analysis
  ◦ Project Management
  ◦ The characteristics of a Project Manager
  ◦ Project Failure
  ◦ Risk Analysis
  ◦ FoxMeyer Case Study
◦ Introduction to UML – Use Case Diagrams
◦ Interface Layer in OO and the GUI design process
◦ Software Quality Assurance
  ◦ Verification
  ◦ Validation
◦ Software Testing
  ◦ Types of Testing
  ◦ Approaches to software testing
Resources
◦ Object-Oriented Systems Analysis and Design (Ashrafi and Ashrafi)
  ◦ Chapters 4, 6, 7
  ◦ User interface design: Chapters 10, 11, 12
◦ Essentials of Systems Analysis and Design / Joseph S. Valacich, Joey F. George, Jeffrey A. Hoffer
  ◦ Chapter 5 – Determining Systems Requirements
  ◦ Appendix A – OOA and D