Validata Performance Tester: Deliver flexible, scalable performance testing across platforms


Agenda:
- Validata Advanced Testing Suite
- Validata Testing Approach
- Validata Testing Methodology
- Validata Performance Tester: Overview
- Benefits
- Business Challenges
- Validata Performance Tester: Case Study

Testing the performance of web-based applications can easily miss the mark. It is easy to design unrealistic scenarios, easy to collect and measure irrelevant performance data, and, even with sound scenarios and the right data, easy to use the wrong statistical methods to summarize and present the results. Traditional approaches involve performance testing teams very late in the implementation lifecycle: applications are tested and tuned in the final stages of the project, while business needs go unmet because the environment changes constantly. Deep, flexible and efficient test coverage cannot be achieved with traditional testing tools.
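The danger of the "wrong statistical methods" is easy to demonstrate: the arithmetic mean hides a slow tail that percentiles expose. A minimal sketch in plain Python (the response times are invented for illustration and have nothing to do with the product):

```python
import statistics

# Hypothetical response times in ms: most requests are fast, a few are very slow.
response_times = [120, 130, 125, 118, 122, 127, 121, 124, 3000, 2800]

mean = statistics.mean(response_times)
median = statistics.median(response_times)
# quantiles(n=20) yields 19 cut points; index 18 is the 95th percentile.
p95 = statistics.quantiles(response_times, n=20)[18]

print(f"mean   = {mean:.0f} ms")    # inflated by the slow tail
print(f"median = {median:.0f} ms")  # typical user experience
print(f"p95    = {p95:.0f} ms")     # what the slowest users see
```

Reporting only the mean here would suggest every user waits over half a second, while the median shows most waits near 125 ms and the p95 exposes the multi-second outliers — three very different stories from the same data.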

Project Success:
- Validata Advanced Testing Suite (ATS) provides a full end-to-end automated testing capability that adapts easily to changes in the application under test, ensuring higher quality and reducing cost and effort.
- Validata ATS is a truly integrated business process management solution.
- Validata ATS is the first model-driven test automation tool for functional, technical and continuous regression testing.
- Validata focuses on analytics (the context and the content), providing root-cause analysis that links requirements and testing. Full reporting is available on demand from the Executive Dashboard module.

Efficient Testing
- Reduced testing time: less time to develop, a shortened application lifecycle and faster time to market.
- Reduced QA cost: the upfront cost of automated testing is easily recovered over the lifetime of the product. Automated tests cost much less to run, execute many times faster and make fewer errors.

Effective Testing
- Greater coverage: the productivity gains delivered by automated testing enable more, and more complete, testing. Greater coverage reduces the risk of malfunctioning or non-compliant software.
- Improved testing productivity: test suites can be run earlier and more often.

Improved Process
- Consistent test procedures: ensures process repeatability and resource independence, and eliminates manual errors.
- Replicated testing: running the same tests across different platforms is easier with automation.
- Results reporting: automated testing produces convenient reporting and analysis with standardized measures, allowing more accurate interpretation.

Better Use of Resources
- Testing is a repetitive activity; automation lets machines complete the tedious, repetitive work while people are freed for other tasks.
- Test team members can focus on quality.

Testing techniques:
- Model-driven
- Data-driven
- Keyword-driven

Validata Performance Tester fulfills organizations' needs for performance testing of web-based applications. It is fully integrated with Validata ATS, incorporates SWIFT, ARC IB and other Internet Banking applications, and is designed to deliver a faster and more cost-effective approach to testing the reliability and scalability of critical IT systems. The specialized T24 adapters and the library of pre-built test scenarios accelerate the performance testing element of a project by 75%.

Objectives of performance testing:
- Ensure that the system provides adequate response times (verify performance requirements)
- Determine the maximum number of concurrent users (current system capacity)
- Meet end-user expectations
- Determine the optimal hardware and application configuration
- Identify performance bottlenecks
- Verify the scalability of the system
- Assess the impact of hardware or software changes, new features or new functions on site performance

Validata ATS can perform parallel testing on multiple environments using the unique Performance Tester test engine adapter.
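Outside the tool itself, the core mechanic behind these objectives — driving many concurrent virtual users and collecting their response times — can be sketched in a few lines of plain Python. This is an illustration only, not the product's API; `do_transaction` is an invented stub standing in for a real request to the system under test:

```python
import time
import statistics
from concurrent.futures import ThreadPoolExecutor

def do_transaction():
    """Stand-in for one business transaction (e.g. an HTTP request)."""
    start = time.perf_counter()
    time.sleep(0.01)  # simulate server work
    return time.perf_counter() - start

def run_load(virtual_users, transactions_per_user):
    """Run transactions concurrently and return elapsed times in seconds."""
    with ThreadPoolExecutor(max_workers=virtual_users) as pool:
        futures = [pool.submit(do_transaction)
                   for _ in range(virtual_users * transactions_per_user)]
        return [f.result() for f in futures]

times = run_load(virtual_users=10, transactions_per_user=5)
print(f"{len(times)} transactions, "
      f"avg {statistics.mean(times) * 1000:.1f} ms, "
      f"max {max(times) * 1000:.1f} ms")
```

Raising `virtual_users` while watching the average and maximum elapsed times is exactly the "determine current system capacity" objective in miniature.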

Load Testing
The expected number of users, with average user interaction times, over a short period, under the load conditions that will occur in a live production environment. Focuses on:
- The number of users accessing the server
- The combination of business transactions executed
- The impact on different environment components

Stress Testing
Worst-case scenarios for a short period of time. Focuses on:
- Locating the point at which server performance breaks down
- Steadily increasing the number of simulated users until a breaking point is reached
- Identifying performance issues that might not otherwise be seen
- Verifying that the web site/application will perform as expected under peak conditions
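The stress-testing loop described above — steadily increase simulated users until a breaking point is reached — can be sketched as follows. The `measure_error_rate` function is a made-up capacity model standing in for real measurements against a server; the ramp logic is the point:

```python
def measure_error_rate(users, capacity=500):
    """Toy stand-in for a real measurement: beyond the server's
    capacity, the fraction of failing requests grows linearly."""
    if users <= capacity:
        return 0.0
    return min(1.0, (users - capacity) / capacity)

def find_breaking_point(start=50, step=50, max_error_rate=0.05):
    """Steadily increase virtual users until the error rate exceeds
    the acceptable threshold; return the last sustainable load."""
    users = start
    while measure_error_rate(users + step) <= max_error_rate:
        users += step
    return users

print("breaking point at", find_breaking_point(), "virtual users")
```

In a real run the step size and the exit criterion (error rate, response-time ceiling, or both) come from the test plan's entry and exit criteria.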

Online Testing
- Socket-based transactions (interfaces): ATM, POS and mobile
- Browser-based transactions (HTTP): module executions, mixed executions
- IB application transactions (HTTP): module executions, mixed executions

Batch (Offline) Testing
- Batch file transactions
- COB testing (daily, monthly, quarterly): report generation and interest accruals, account statements
- Noise transaction generation (T24 Browser)

Requirements Analysis
- Identify the required stakeholders, business analysts and infrastructure managers
- Organize and gather the business requirements
- Convert the business requirements into performance requirements and metrics
- Run workshops for knowledge transfer

Test Planning
- Collect the business-critical transactions
- Determine the required test volumes
- Prepare entry and exit criteria
- Prepare the testing schedules and estimates
- Check infrastructure availability

Test Design
- Identify the pre-test and post-test procedures
- Determine the test customization requirements and prerequisites
- Isolate the monitoring requirements and the metrics to be collated
- Create and review the test cases
- Create and review the workload scenarios

Test Execution
- Execute smoke testing
- Set up the required environment monitors
- Execute the test and collate the results
- Share the test results with the project team
- Schedule the next execution cycle after the issues are resolved

Reporting
- Correlate test results from different test cycles
- Prepare the test summary document
- Present the test summary to the stakeholders
- Sign off

Performance testing per application:
- Mix of critical transactions for performance testing
- Creation of cycles per transaction or group of transactions
- Cloning of cycles for multiple executions
- Execution of cycles with Validata's pre-built T24 adapters
- Pre-built performance test cases

Transaction-Based Metrics
- Throughput (per sec): transactions per second
- Response times / elapsed times: time taken to process a transaction
- Types of errors: totals for each error type in a particular test
- Count of errors: total errors for a particular test
- Transaction count: total transactions processed in the time period

Server-Based Metrics
- CPU usage: user %, system %, idle %, wait %, logical CPUs
- Memory usage: used %, used in GB, memory available
- Disk I/O: disk read KB/sec, disk write KB/sec, I/O operations/sec
- Network activity: MB/sec, packets/sec, packet size, bandwidth used
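The transaction-based metrics in the table can all be derived from a raw execution log. A minimal sketch (the record format is invented for illustration, not the tool's actual log format):

```python
from collections import Counter

# Each record: (elapsed_seconds, error_type or None) — invented format.
log = [(0.12, None), (0.30, None), (0.25, "timeout"),
       (0.11, None), (0.40, "http_500"), (0.22, None)]
test_duration = 3.0  # seconds covered by the log

transaction_count = len(log)
throughput = transaction_count / test_duration           # transactions/sec
avg_response = sum(t for t, _ in log) / transaction_count
errors = Counter(err for _, err in log if err is not None)

print(f"throughput        = {throughput:.2f} tps")
print(f"avg response time = {avg_response * 1000:.0f} ms")
print(f"error count       = {sum(errors.values())} ({dict(errors)})")
```

The same pass over the log yields both the per-type error totals and the overall count, so the two error rows in the table come from a single `Counter`.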

- Hard disk usage
- Transactions per second and requests per second
- Error count
- Available memory

Case Study: Mauritius Commercial Bank (MCB)
With a network of over 40 branches and many Internet Banking customers, Mauritius Commercial Bank (MCB) required a robust environment to continue providing its customers with the level of service they enjoyed prior to the implementation of Temenos T24. The bank therefore identified the need for a performance testing tool to help validate the configuration of the environment.

Challenges
- An aggressive project plan
- Performance testing of a constantly changing environment
- The need for performance test cases that could easily be updated to reflect the new environment

Solution Outline
- Developed a performance testing strategy and plan covering all aspects of environment testing
- Designated Validata resources to collaborate with those of the bank and Temenos in managing the process, development and execution of the tests
- Deployed the Validata Performance Tester solution to produce all management reporting for project progress and defect management
- Delivered the full solution on a fixed-fee basis

Benefits Realized
- Easy management of all performance testing processes from start to finish
- Identification of design issues and performance bottlenecks, and the means to overcome them
- Effective concentration of the bank's resources by exploiting the product and resources provided by Validata
- Efficient management of monthly testing costs
- A cost-efficient solution incorporating both resources and product

Distributions and scalability executions:
- Scalability from 50 to 2,500 virtual users
- Sustained executions of 1 hour
- Executions per module
- Mixed-transaction executions

- Achieve full test coverage
- Decrease total performance testing time by up to 60-70%
- Maximum reusability of test assets, with minimum maintenance effort
- Scriptless creation of scenarios, achieving 100% automation
- Truly de-skilled, reducing turnaround time by 50%
- Less preparation time; faster time to market by 50%
- On the cloud: remote access and multiple-site support

Do you have any questions? We would be happy to help.