Silk Performer POC.

Silk Performer POC: Agenda
Defining Load Test Projects
Creating Test Scripts
Analyzing Test Scripts
Customizing Test Scripts
Defining User Profiles
Identifying Baseline Performance
Setting Up Monitoring Templates
Defining Workload
Running & Monitoring Tests
Exploring Test Results

Defining Load Test Projects
Procedure: To define a load test project, click the Start here button on the Silk Performer Workflow bar.

Defining Load Test Projects
The Workflow - Outline Project dialog opens.
1. Enter the project name in the Project Name field.
2. Enter a description for the project in the Project description field.
3. Select Web business transaction (HTML/HTTP) in the Application type field.
4. Click OK to create the project based on your settings.

Creating Test Scripts
Procedure: To model a load test script, click the Model Script button on the Silk Performer Workflow bar.

Creating Test Scripts
The Workflow - Model Script dialog appears.
1. Select Record in the Script area of the dialog.
2. From the Select application profile drop-down list, select a profile for the client application you wish to use in your load test (e.g., Internet Explorer). If a profile has not yet been set up for the application, click the button to the right of the drop-down list to set one up now.
3. In the URL field, enter the URL to be recorded (http://EBII479F0:9090/LabOld).
4. Click OK. The Silk Performer Recorder opens in minimized form and a browser is launched with the URL you specified for recording.

Creating Test Scripts
Note: To see a report of the actions that occur during recording, maximize the Recorder dialog by clicking the Change GUI size button.
Browse the application. Once finished, click the Stop Recording button. The Save As dialog appears; save the script with a meaningful name.

Creating Test Scripts
The newly generated script is displayed.
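For orientation, a recorded Web script looks roughly like the BDL sketch below. This is an illustrative reconstruction, not the recorder's exact output; the transaction names, the browser constant, and the page title are assumptions.

```bdl
// Sketch of a recorder-generated load test script (illustrative only)
benchmark SilkPerformerRecorder

use "WebAPI.bdh"

dcluser
  user
    VUser
  transactions
    TInit : begin;  // runs once when the virtual user starts
    TMain : 1;      // the recorded business transaction

dcltrans
  transaction TInit
  begin
    WebSetBrowser(WEB_BROWSER_MSIE7);  // replay with the recorded browser profile
  end TInit;

  transaction TMain
  begin
    WebPageUrl("http://EBII479F0:9090/LabOld", "LabOld");  // recorded page download
  end TMain;
```

The dcluser section declares the VUser user group that the Try Script dialog later refers to.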

Analyzing Test Scripts: Trying Out a Generated Script
Once you have initiated a Try Script run for a recorded test script, you analyze the results of the run using TrueLog Explorer. Test script analysis with TrueLog Explorer involves the following tasks:
- Visual analysis with TrueLog Explorer
- Viewing a summary report
- Finding errors
- Viewing page statistics
- Comparing record and replay TrueLogs

Analyzing Test Scripts: Trying Out a Generated Script
Once you have generated a test script, you should determine whether it runs without error via a Try Script run. In a Try Script run, only a single virtual user is run and the stress test option is enabled, so there is no think time or delay between transactions. Try Script runs are viewed in Silk Performer's TrueLog Explorer, which helps you find replay errors quickly.
Procedure: To try out a load test script, click the Try Script button on the Silk Performer Workflow bar.

Analyzing Test Scripts
The Try Script dialog appears. The active profile is already selected in the Profile drop-down list, the script you created is selected in the Script drop-down list, and the VUser virtual user group is selected in the User group area.
1. To view the data that is actually downloaded from the Web server during the Try Script run, select the Animated checkbox. Note: To test an application other than a Web application, disable the Animated option.
2. Click Run. Note: You are not running an actual load test here, only a test run to see whether your script requires debugging.

Analyzing Test Scripts
The Try Script run begins. The Monitor window opens, giving you detailed information about the run's progress. TrueLog Explorer opens, showing you the data that is actually downloaded during the Try Script run. If any errors occur during the run, TrueLog Explorer will assist you in locating them.

Analyzing Test Scripts: Visual Analysis with TrueLog Explorer
TrueLog Explorer's interface consists of the following sections:
- Workflow bar: the primary interface as you work with TrueLog Explorer.
- Tree List view: the left side of the interface, which allows you to expand and collapse TrueLog data downloaded during load tests.
- Source window: views of both the raw HTML source code and the rendered HTML content.
- Information view: displays data regarding load test scripts and test runs, including general information about the loaded TrueLog file, the selected API node, the BDL script, HTML references, and HTTP header data.

Analyzing Test Scripts: Visual Analysis with TrueLog Explorer
(Screenshot: Workflow bar, tree view of the accessed API nodes, rendered HTML view, HTML source view tab, Information window.)

Analyzing Test Scripts: Analyze Test
The Analyze Test dialog offers you three options:
- View a virtual user summary report
- Look for errors in the TrueLog
- Compare the replay test run to the record test run

Analyzing Test Scripts: Analyze Test
Procedure: To analyze the results of a test run:
1. Load a TrueLog of interest into TrueLog Explorer.
2. With the TrueLog of the Try Script run you executed previously loaded, click the Analyze Test button on the Workflow bar.
3. The Analyze Test dialog appears, offering three options: view a virtual user summary report, look for errors in the TrueLog, or compare the replay test run to the record test run.

Analyzing Test Scripts: View a Virtual User Summary Report
To display a virtual user summary report for a test run:
1. With the TrueLog you generated in the preceding tutorial loaded into TrueLog Explorer, click the Analyze Test button.
2. Select the Show the virtual user summary report link.

Analyzing Test Scripts: View a Virtual User Summary Report
Because virtual user summary reports require significant processing, they are not generated by default. To enable the automatic display of virtual user reports at the end of animated Try Script runs, or when clicking the root node of a TrueLog file in Tree List view, select the Display virtual user report checkbox in the TrueLog Explorer settings (Settings/Options/Workspace/Reports).

Analyzing Test Scripts: View a Virtual User Summary Report
Virtual user summary reports include details regarding:
- Virtual users
- Uncovered errors
- Response time information tracked for each transaction defined in a load test script
- Page timer measurements for each downloaded Web page
- Measurements related to each Web form declared in a load test script, including response time measurements and throughput rates for form submissions using the POST, GET, and HEAD methods
- Individual timers and counters used in scripts (Measure functions)

Analyzing Test Scripts View a virtual user summary report

Analyzing Test Scripts: Finding Errors
TrueLog Explorer helps you find errors quickly after Try Script runs.
Procedure: To find replay errors in a TrueLog:
1. With the TrueLog you generated in the preceding tutorial loaded into TrueLog Explorer, click the Analyze Test button.
2. Click the Find errors link on the dialog.

Analyzing Test Scripts: Finding Errors
The Step through TrueLog dialog appears with the Errors option selected. Click the Find Next button to step through TrueLog result files one error at a time. A generated error report is displayed.

Analyzing Test Scripts: Viewing Page Statistics
After verifying the accuracy of a test run, you can analyze the performance of your application under "no-load" conditions via page statistics. Overview pages detail total page response times, document download times (including server busy times), and the time elapsed for receipt of embedded objects. Detailed Web page statistics show exact response times for individual Web page components, allowing you to easily pinpoint the root causes of errors and slow page downloads.

Analyzing Test Scripts: Viewing Page Statistics
Detailed Web page drill-down results include the following data for each page component:
- DNS lookup time
- Connection time
- SSL handshake time
- Send request time
- Server busy time
- Response receive time
- Cache statistics

Analyzing Test Scripts: Viewing Page Statistics
Procedure: To view an Overview page:
1. Select an API node in Tree List view for which you would like to view statistics.
2. Select the Whole pages option on the Step through TrueLog dialog.
3. Click the Statistics tab to open Statistics view.

Analyzing Test Scripts: Viewing Page Statistics
Select specific components listed in the URL column for detailed analysis and page drill-down.

Analyzing Test Scripts: Comparing Record and Replay TrueLogs
Compare view shows the differences between the record and replay TrueLogs. Such differences are typically caused by session-relevant data, such as session IDs; see "Customizing Session Handling" for details.
Procedure: To compare a replay TrueLog with its corresponding record TrueLog:
1. Click the Analyze Test button on the Workflow bar. The Workflow - Analyze Test dialog appears.
2. Click the Compare your test run button.

Analyzing Test Scripts: Comparing Record and Replay TrueLogs
The corresponding record TrueLog opens in Compare view and the Step through TrueLog dialog appears with the Whole pages option selected, allowing you to run a node-by-node comparison of the TrueLogs. Click the Find Next button to step through TrueLog result files one page at a time.
Note: Windows displaying content from the replayed session have green triangles in their upper left corners; windows displaying content from the recorded session have red triangles.

Analyzing Test Scripts: Comparing Record and Replay TrueLogs
(Screenshot: exploring the differences visually.)

Customizing Test Scripts
Customizing test scripts covers three main areas:
- Customizing session handling
- Customizing user data
- Adding verifications

Customizing Test Scripts: Customizing Session Handling
TrueLog Explorer allows you to replace static session IDs in scripts with dynamic session IDs, and thereby maintain state information for successful load test runs. Session handling customization is not required for most applications, so if you do not detect any problems when you analyze your test, you can skip it.
Procedure: To customize session handling:
1. Click the Customize Session Handling button on the Workflow bar. The Workflow - Customize Session Handling dialog opens.
2. Click Find differences to view a differences table (Source Differences view).
3. Use the Step through TrueLog dialog to step through the HTML server responses, with the recorded responses displayed alongside the corresponding replayed responses.

Customizing Test Scripts: Customizing Session Handling
(Screenshot: Customize Session Handling, Find Differences.)

Customizing Test Scripts: Customizing Session Handling
- An indicator tells you that a difference may be session information.
- The number of occurrences in the HTML code is listed.
- Double-click a listed difference to customize it.
- A difference can only be customized if it occurs in the BDL code.

Customizing Test Scripts: Customizing Session Handling
Double-click a difference listed in Source Differences view. The Insert Parsing Functions dialog appears with the boundary values and variable name already inserted; there is no need to locate and enter boundary values manually. Click OK on the Insert Parsing Function dialog to insert the required parsing function into the BDL script.
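The effect of the inserted parsing function can be sketched as follows. Treat this as an illustration only: the boundary strings, variable name, and parameter order are assumptions, and the dialog generates the exact call for your application.

```bdl
// Illustrative session handling customization (boundaries are hypothetical)
dcltrans
  transaction TMain
  var
    sSessionId : string;
  begin
    // Parse the dynamic session ID out of the next server response,
    // delimited by the (assumed) boundaries "sid=" and "\""
    WebParseDataBound(sSessionId, sizeof(sSessionId), "sid=", "\"");
    WebPageUrl("http://EBII479F0:9090/LabOld", "Login");
    // Replay the parsed value in place of the recorded static session ID
    WebPageUrl("http://EBII479F0:9090/LabOld/main?sid=" + sSessionId, "Main");
  end TMain;
```

The parsing function is placed before the page call whose response contains the session ID, so the value is captured before it is needed.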

Customizing Test Scripts: Customizing Session Handling
Once the script has been modified successfully, initiate a new Try Script run to see whether the script runs correctly now that session handling has been customized.

Customizing Test Scripts: Customizing User Data
With user data customization you can make your test scripts more realistic by replacing static recorded user input data with dynamic, parameterized user data that changes with each transaction.
Procedure: To customize user input data for an HTML form field:
1. Click the Customize User Data button on the Workflow bar. The Customize User Data dialog appears.
2. Click Customize.

Customizing Test Scripts: Customizing User Data
Use the Find Next and Find Previous buttons on the Step through TrueLog dialog to browse through all WebPageSubmit calls in the TrueLog (these are the calls that are candidates for user data customization). Highlighted HTML controls in Post Data view identify form fields that can be customized. You can replace the recorded values with various types of input data (including predefined values from files and generic random values) and generate code into your test script that substitutes the recorded input data with your customizations.

Customizing Test Scripts: Customizing User Data
Right-click a form control that you wish to customize and select Customize Value to open the Parameter Wizard. Select the Create new parameter radio button and click Next to create a new parameter.

Customizing Test Scripts: Customizing User Data
The Create New Parameter dialog appears. Select the Parameter from Multi Column Data file radio button and click Next. Create a new file or integrate an existing file, select the column, and click Next.

Customizing Test Scripts: Customizing User Data
Click Finish. Then go to Customize User Data and initiate a Try Script run; the script now executes based on your runtime data.
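The code the Parameter Wizard generates follows roughly this pattern: the recorded literal in the form declaration is replaced by a parameter drawn from the data file. The file name, form and field names, and the random-variable declaration below are assumptions for illustration.

```bdl
// Illustrative user data parameterization (names and file are hypothetical)
dclrand
  rsUsername : RndFile("users.csv", 64);  // one value drawn per access

dclform
  LOGIN_FORM:
    "username" := rsUsername,  // recorded static value replaced by the parameter
    "password" := "secret";    // still static; could be parameterized the same way

dcltrans
  transaction TMain
  begin
    WebPageSubmit("Login", LOGIN_FORM, "Welcome");  // posts the parameterized form
  end TMain;
```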

Customizing Test Scripts: Adding Verifications
TrueLog Explorer allows you to easily add content checks to your test scripts to verify whether content that is to be sent by servers is in fact received by clients under real-world conditions.
Procedure: To define content verifications for a Web page:
1. With a TrueLog loaded into TrueLog Explorer, navigate to a page of interest.
2. Select the text to be verified (this step is not required for page title and page digest verifications).
3. Click Add Verifications.
Alternative: You may also right-click content and select verification functions from a context menu.

Customizing Test Scripts Adding Verifications

Customizing Test Scripts: Adding Verifications
Select a pre-enabled verification from the Workflow - Add Verifications screen:
- Verify the page title
- Verify the selected text
- Verify the selected text in an HTML table
- Verify the digest
Click OK to accept the verification settings. Once the BDL script has been successfully modified, repeat this process for each verification you wish to add. You can see the changes in the script.
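A text verification added this way ends up in the BDL script as a verification call alongside the page download. The call below is a sketch: the verified string and the parameters are illustrative, and the exact function and arguments generated depend on the verification type you selected.

```bdl
// Illustrative content verification (parameters are a sketch)
dcltrans
  transaction TMain
  begin
    WebPageUrl("http://EBII479F0:9090/LabOld", "LabOld");
    // Verify that the selected text occurs in the downloaded HTML;
    // replay reports an error if the check fails
    WebVerifyHtml("Welcome to the Lab", 1);
  end TMain;
```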

Customizing Test Scripts: Adding Verifications
Once all verification steps are in place, run the Try Script run again and confirm that the verifications pass successfully. API nodes that include verifications are indicated by blue "V" symbols, and the results appear in the verifications report.

Defining User Profiles
Project profiles contain important project-specific settings. A project may contain as many profiles as required, each with unique settings. New profiles can easily be added to projects, and existing profiles can be copied, renamed, and deleted.
Procedure: To define a custom user profile, click the Customize Test button on the Silk Performer Workflow bar.

Defining User Profiles
The Workflow - Customize Test dialog appears. In the Profile drop-down list, the currently active profile is selected; this is the default profile.
1. To add a new profile to your workload, click the New Profile button. The New Profile dialog appears.
2. Enter a name (e.g., IE6_DSL) for the new profile and click OK.
3. In the Project tab of the tree-view area of the main Silk Performer window, expand the Profiles folder.
4. Right-click the newly created profile name in the Project window and select Edit Profile from the context menu to display the Edit Profile dialog.

Defining User Profiles
From the Edit Profile dialog you can configure numerous settings related to your project's user profile. Click OK to accept the profile settings.

Identifying Baseline Performance
Baseline tests establish baseline performance for load tests using specific user types. For baseline tests, only one virtual user per user type is executed. The Find Baseline dialog allows you to define multiple user types (unique combinations of script, user group, and profile). The following options are set automatically for baseline tests:
- Baseline report files are automatically created
- The Stop virtual users after simulation time (Queuing Workload) option is activated
- The Random think time option is deactivated
- The Load test description field is set to "Base Line Test"
- The Display All Errors Of All Users option in the Monitor window is activated
- The Virtual user output files (.wrt) option is activated
- The Virtual user report files (.rpt) option is activated

Identifying Baseline Performance
Procedure: To identify a test baseline:
1. Click the Find Baseline button on the Silk Performer Workflow bar. The Workflow - Find Baseline dialog appears.
2. Select the user types you wish to have in the baseline test. One virtual user from each user type will be executed.

Identifying Baseline Performance
If you want to add new user types to your load test, click the Add button, select a unique combination of script, profile, and user group from the Add User Type dialog, and click OK. Click Run to run the baseline test. The Monitor window opens, giving you detailed information about the progress of the test.

Confirming a Baseline
The next step in conducting a Silk Performer load test is to confirm that the baseline established by the test actually reflects the desired performance of the application under test. If the baseline report is satisfactory, it can be stored for further processing.
Procedure: To view a baseline report, click the Confirm Baseline button on the Silk Performer Workflow bar.

Confirming a Baseline
The Workflow - Confirm Baseline dialog opens. Click Baseline Report to display the baseline report for the current test. Baseline reports comprise the following elements:
- General information
- User types
- Summary tables
- Transaction response times
- HTML page timers
- Web form measurements
Assuming you are satisfied with the test results and wish to save them for further processing (e.g., calculation of the number of concurrent virtual users and the network bandwidth required for the load test), click the Accept Baseline button.

Confirming a Baseline

Confirming a Baseline Click Yes.

Confirming a Baseline
Once you have accepted the baseline, you can set response time thresholds for the selected timers. MeasureSetBound functions will be generated into the script to set these thresholds. Click the Set response time thresholds button to display the Automatic Threshold Generation dialog.

Confirming a Baseline
1. Select the timers for which you want to generate thresholds.
2. Specify multipliers for the lower and upper bounds. The lower bound represents good response times; the upper bound separates bad response times from acceptable ones.
3. Specify whether a message should be raised and the severity to use.
4. Specify minimum threshold values.
5. Click OK.
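The generated threshold code consists of MeasureSetBound calls; the timer name, the measure kind constant, and the bound values below are illustrative examples, not the values generated for your test.

```bdl
// Illustrative response time thresholds (names and values are examples)
dcltrans
  transaction TInit
  begin
    // Bound 1: upper limit for "good" response times, in seconds
    MeasureSetBound("LabOld main page", MEASURE_PAGE_PAGETIME, 1, 2.5);
    // Bound 2: the limit separating acceptable from bad response times
    MeasureSetBound("LabOld main page", MEASURE_PAGE_PAGETIME, 2, 5.0);
  end TInit;
```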

Setting Up Monitoring Templates
Silk Performer offers server-side and client-side monitoring during load tests, enabling you to view a live graphical display of server performance while tests run. Custom server monitoring templates can be set up, or you can use the pre-installed templates (available for virtually all application types).
Procedure: To set up a template for server monitoring, click the Confirm Baseline button on the Silk Performer Workflow bar. The Workflow - Confirm Baseline dialog opens; click Monitoring template.

Setting Up Monitoring Templates
The Profile Settings dialog opens, showing the Monitoring tab of the Results category.
1. In the Monitoring options area, select the Automatically start monitoring option to automatically launch Performance Explorer monitoring each time a load test begins. Performance Explorer displays server performance data that is relevant to the server type under test.
2. Select the Use custom monitoring template radio button to create a custom monitor template.
3. Enter a name for the custom template file and click the Create Custom Monitor Template button. Performance Explorer appears.
4. Close any monitor windows that are not relevant to the template.
5. Click the Monitor Server button on the Performance Explorer Workflow bar.

Setting Up Monitoring Templates
(Screenshot: Monitor Server button.)

Setting Up Monitoring Templates
The Data Source Wizard appears. Click the Select from predefined Data Sources radio button to select a specific data source provided by the server. Note: Performance Explorer can also scan servers for available data sources. Click Next. In the tree view on the System selection screen, expand the folder that corresponds to the operating system on which the server and the application under test run.

Setting Up Monitoring Templates
The Connection Parameters screen appears. In the Hostname edit field, enter connection parameters such as the host name or IP address of the computer that hosts the application, the connection port, the user name, and the password. The data required here varies based on the operating system run by the monitored computer. Click Next. The Select displayed measures screen appears. Expand the tree view, select the measurements you wish to have monitored, and click Finish.

Setting Up Monitoring Templates A monitor window appears, with the elements you specified shown in a live, color-coded server performance graph. Beneath the graph is a list of included elements, along with a color-coding key, and performance information for each element. Select Clone Monitor Report from the Performance Explorer Monitor menu to write monitor results to a time series data (.tsd) file.

Setting Up Monitoring Templates
A monitor report appears containing the same performance measurements that were selected for the graph. Monitoring information appears in the report in tabular format. To save the monitoring report so that it can later be compared with load test results during results exploration, select Write Monitor Data from the Performance Explorer Monitor menu.

Defining Workload
Silk Performer offers different workload models that can be used as the basis for load tests. You must select the workload model that best meets your needs prior to the execution of a load test. The number of concurrent virtual users per user type, the duration, and the agents involved must also be configured when defining the workload. The following workload models are available:
- Increasing workload: At the beginning of the load test, Silk Performer simulates not the total number of defined users but only a specified number. The workload is then increased gradually until all users in the user list are running.
- Steady state workload: The same number of virtual users is employed throughout the test.

Defining Workload
- Dynamic workload: You can manually change the number of virtual users in a test while the test runs. The maximum number of virtual users is set; within this limit, the number can be increased and decreased at any time during the test.
- All day workload: You can assign different numbers of virtual users to any interval of the load test. Each user type can use a different load distribution.
- Queuing workload: Transactions are scheduled following a prescribed arrival rate. This rate is a random value based on an average interval calculated from the simulation time and the number of transactions specified in the script (dcluser section: number of transactions per user).
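The per-user transaction count that the queuing model derives its arrival rate from comes from the script's dcluser section; for example (the counts shown are illustrative):

```bdl
// Illustrative dcluser section for a queuing workload
dcluser
  user
    VUser
  transactions
    TInit : begin;  // run once when the user starts
    TMain : 5;      // each user schedules 5 TMain transactions; the queuing
                    // model derives its average arrival interval from this
                    // count and the configured simulation time
```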

Defining Workload
Procedure: To specify the workload of a load test:
1. Click the Adjust Workload button on the Silk Performer Workflow bar.
2. Select the workload model that most closely meets your needs.
3. Click Workload Wizard.
4. Specify the simulation times and click Next.

Defining Workload
Enter the number of virtual users you wish to run, or calculate the number using session time data from the baseline test. In the latter case the number of virtual users is calculated using the formula Vusers = Session Time [s] * Sessions Per Peak Hour / 3600; for example, a 180 s session at 1,000 sessions per peak hour yields 180 * 1000 / 3600 = 50 virtual users. After completing your specifications, the All Day Workload Configuration dialog appears; click OK. Check the TrueLog On Error option if you want Silk Performer to generate TrueLog files for transactions that contain errors.

Defining Workload Click the User Distribution Overview button to view the assignment of virtual users to the agent computers that are currently available. Click OK to save your changes. Click Run to execute the test or click Connect to initialize the agent connection and start the test manually from the monitor view by clicking the Start all button.

Running & Monitoring Tests
- Running a load test
- Monitoring a test
- Monitoring a server

Running & Monitoring Tests: Running a Load Test
Procedure: To start a load test:
1. Activate the workload model you wish to use for the test. To activate a workload, right-click it and select Activate from the context menu.
2. Click the Run Test button on the Silk Performer Workflow bar.

Running & Monitoring Tests The Workload Configuration dialog appears. Confirm all workload settings that you wish to use for the load test. Click Run to start the load test.

Running & Monitoring Tests
To monitor a specific virtual user, right-click the virtual user in the list and select its Show Output entry (e.g., Show Output of VirtualUser_5); the output specific to that virtual user is displayed. To customize the columns that are displayed, right-click a window header and select Columns from the context menu. On the Select Monitor Columns window, select the columns you wish to display.

Exploring Test Results
- Working with TrueLog On Error: TrueLog On Error files provide complete histories of the erroneous transactions uncovered during load tests, enabling you to drill down through real content to analyze error conditions.
- Viewing an overview report: Once a load test is complete, Performance Explorer provides an overview report for the load test. These reports include the most important test results in tabular and graphical form.
- Viewing a graph: Performance Explorer offers a comprehensive array of graphing features for displaying test results.

Exploring Test Results: Working with TrueLog On Error
Procedure: To analyze erroneous transactions uncovered during a load test, click Silk Performer Workbench's Explore Results button after the completion of the load test.
Note: TrueLog On Error files are generated when Silk Performer's Generate TrueLog On Error option is enabled.

Exploring Test Results: Working with TrueLog On Error
The Explore Results dialog appears. Click TrueLog Explorer.
Note: The TrueLog Explorer button is disabled when no errors were detected during the test or when TrueLog On Error is not enabled. In such cases, proceed directly to Performance Explorer.

Exploring Test Results
Procedure: To view an overview summary report:
1. Click the Explore Results button on the Silk Performer Workflow bar. The Workflow - Explore Results dialog appears.
2. Click the Performance Explorer button. Performance Explorer opens and displays an overview summary report for the most recent load test.
Note: The Generate overview report automatically option must be set on the Settings/Options/Reporting dialog for overview reports to be displayed.

Exploring Test Results
You can edit the title of the report and add custom charts. The report contains the following main sections:
- General Information
- Summary Tables
- Ranking
- User Types
- Custom Charts
- Custom Tables
- Detailed Charts

Exploring Test Results General Information Report

Exploring Test Results Summary Tables Report

Exploring Test Results Ranking Report

Exploring Test Results User Types Report

Exploring Test Results Custom Chart Report

Exploring Test Results Custom Table Report