Fine Tuning WebFOCUS for the IBM Mainframe (zSeries, System z9)
Mark Nesson, June 2008
Slide 2: Why WebFOCUS for z
- Runs natively on MVS and Linux
- New IBM specialty engines you can take advantage of
- Ability to create partitions on z to centralize business intelligence on a single server, where the databases and applications reside
Slide 3: Information Builders Products Used in the Benchmark
- WebFOCUS
- iWay Software's iWay Service Manager: an Enterprise Service Bus (ESB) that can be invoked as Web services to provide event-driven integration and B2B interaction management
Slide 4: There's an easier way!
[Architecture diagram: HTTP clients -> Web server (ibi_html, ibi_bid) -> app server/servlet container (approot, ibi_apps, rcaster, basedir, worp) -> WebFOCUS Reporting Server (adapters, focexecs, synonyms, data) plus ReportCaster Distribution Server and RC Repository (reports) -> RDBMS/DB servers; connections over HTTP/HTTPS, proprietary TCP, TCP via client driver, and TCP/JDBC]
Slide 5: Benchmark Objectives
- Test WebFOCUS on the proven strengths of System z hardware, running both the Linux open-source OS with UDB and z/OS with DB2.
- Evaluate the scalability and performance of WebFOCUS and iWay Service Manager in each operating-system environment on the IBM z server, and the benefit of using the various specialty engines.
- All test configurations accurately and faithfully replicated prior benchmarks run on other UNIX and Windows platforms, so the results represent the true performance of the WebFOCUS workload on this hardware.
- Testing was done at the IBM Washington Systems Center in Gaithersburg in November 2006 by a combined team from IBM and Information Builders.
Slide 6: UDB and DB2 Database Size
- Linux system IBILIN03 (under z/VM) and z/OS system IBI1 host the test databases in the various benchmark configurations.
- Two databases are defined on each of IBILIN03 and IBI1, with multiple tables defined in each database: one holds 2 million rows of data, the other 7 million rows.
- Each row is 256 bytes long.
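Appendix A shows the workload queries running against a table named TST2MLN with columns F3SSN, F3ALPHA10, F3INTEGER5, F3FLOAT9X8, and F3DBL15X2. The deck does not give the DDL or the load procedure, so the following Java/JDBC loader is only a sketch: the column data types, filler column, connection URL, and credentials are assumptions chosen to match the stated 256-byte row length and 2-million-row count.

import java.math.BigDecimal;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.Statement;

// Hypothetical loader for a TST2MLN-style benchmark table. Column names come
// from Appendix A; everything else (types, filler width, host, credentials)
// is assumed for illustration only.
public class LoadTestData {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                "jdbc:db2://ibilin03:50000/TESTDB", "db2user", "secret")) {
            con.setAutoCommit(false);
            try (Statement s = con.createStatement()) {
                // FILLER pads each row toward the 256 bytes stated on slide 6.
                s.execute("CREATE TABLE TST2MLN (" +
                        "F3SSN CHAR(9) NOT NULL, F3ALPHA10 CHAR(10), " +
                        "F3INTEGER5 INTEGER, F3FLOAT9X8 DOUBLE, " +
                        "F3DBL15X2 DECIMAL(15,2), FILLER CHAR(200))");
            }
            try (PreparedStatement ps = con.prepareStatement(
                    "INSERT INTO TST2MLN VALUES (?, ?, ?, ?, ?, ?)")) {
                for (int i = 1; i <= 2_000_000; i++) {   // the 2-million-row table
                    ps.setString(1, String.format("%09d", i));
                    ps.setString(2, "ROW" + (i % 10_000));
                    ps.setInt(3, i % 100_000);
                    ps.setDouble(4, i * 1e-8);
                    ps.setBigDecimal(5, BigDecimal.valueOf(i, 2));
                    ps.setString(6, " ");                // CHAR column is blank-padded
                    ps.addBatch();
                    if (i % 10_000 == 0) { ps.executeBatch(); con.commit(); }
                }
                ps.executeBatch();
                con.commit();
            }
        }
    }
}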
Slide 7: Benchmark Test Workloads
- Workload-Small: query retrieving 61 rows of data
- Workload-Large: query retrieving 3,000 rows of data
- Workload-Complex: CPU-intensive query involving a 4-table join, retrieving 5,118 rows
(The SQL for each workload is listed in Appendix A.)
Slide 8: How the IBI Tests Were Measured
For each test configuration, Information Builders used the same parameter settings (interval time, keep-alive time) to run the Small, Large, and Complex workloads, varied the number of concurrently active users, and measured end-to-end user response time. A sketch of this style of driver appears below.
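The deck does not name the load-generation tool, so the following Java sketch is only an illustration of the closed-loop measurement just described: a fixed pool of concurrent users, a think-time interval between requests (the spec slides show Interval .05, assumed here to mean 0.05 seconds), and an averaged end-to-end response time. The report URL, host name, and run duration are hypothetical.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Duration;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicLong;

// Closed-loop driver: N simulated users each issue a request, wait out the
// think-time interval, and repeat; end-to-end latency is averaged at the end.
public class LoadDriver {
    public static void main(String[] args) throws Exception {
        int users = Integer.parseInt(args[0]);      // e.g. 10, 25, 50, ..., 500
        long intervalMs = 50;                       // the ".05" interval setting
        Duration runFor = Duration.ofMinutes(5);    // assumed run length
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest req = HttpRequest.newBuilder(   // hypothetical report URL
                URI.create("http://ibilin01:8080/ibi_apps/run?report=small")).build();

        AtomicLong totalNanos = new AtomicLong();
        AtomicLong requests = new AtomicLong();
        long deadline = System.nanoTime() + runFor.toNanos();
        ExecutorService pool = Executors.newFixedThreadPool(users);
        for (int u = 0; u < users; u++) {
            pool.submit(() -> {
                while (System.nanoTime() < deadline) {
                    long t0 = System.nanoTime();
                    try {
                        client.send(req, HttpResponse.BodyHandlers.discarding());
                        totalNanos.addAndGet(System.nanoTime() - t0);
                        requests.incrementAndGet();
                        Thread.sleep(intervalMs);   // think time between requests
                    } catch (Exception e) {
                        return;                     // drop this user on failure
                    }
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(runFor.toSeconds() + 60, TimeUnit.SECONDS);
        System.out.printf("users=%d  avg end-to-end response=%.3f s%n",
                users, totalNanos.get() / 1e9 / Math.max(1, requests.get()));
    }
}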
Slide 9: Test Environment – 1 (all on Linux)
Slide 10: Test Environment – 1 (Linux) Scenarios
Scenario 1:
- IBILIN02: 2/4/8 CP, 2 GB, WebFOCUS 7.6 Reporting Server, DB2 Connect
- IBILIN03: 2/4 CP, 2 GB, UDB 8.2
Scenario 2:
- IBILIN01: 2/4 CP, 2/4 GB, WAS, WebFOCUS Client 7.6
- IBILIN02: 2/4/8 CP, 2 GB, WebFOCUS 7.6 Reporting Server, DB2 Connect
- IBILIN03: 2/4 CP, 2 GB, UDB 8.2
Scenario 3:
- IBILIN02: 2/4/8 CP, 2 GB, iWay Service Manager, DB2 JDBC Type 4
- IBILIN03: 2/4 CP, 2 GB, UDB 8.2
(A sketch of the Scenario 3 JDBC Type 4 access path follows.)
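Unlike the CLI path in Scenarios 1 and 2, which goes through DB2 Connect, the Type 4 driver in Scenario 3 is pure Java and talks to UDB directly over TCP. Below is a minimal sketch of such a connection, assuming IBM's standard jcc driver (db2jcc4.jar) is on the classpath; the host, port, database name, credentials, and predicate value are hypothetical (the real literal is blank in Appendix A).

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

// Direct Type 4 connection: no DB2 Connect gateway or native CLI layer.
public class Type4Example {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:db2://ibilin03:50000/TESTDB";   // hypothetical host/port/db
        try (Connection con = DriverManager.getConnection(url, "db2user", "secret");
             PreparedStatement ps = con.prepareStatement(
                     "SELECT F3SSN, F3ALPHA10, F3INTEGER5, F3FLOAT9X8, F3DBL15X2 " +
                     "FROM TST2MLN WHERE F3SSN <= ? FOR FETCH ONLY")) {
            ps.setString(1, "000000061");                  // hypothetical Workload-Small cutoff
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getString("F3SSN") + " " + rs.getString("F3ALPHA10"));
                }
            }
        }
    }
}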
Slide 11: Test Env. – 1 (Linux), Scenario – 1 (WebFOCUS Reporting Server, DB2 Connect, Workload-Small)
[Table: response time in seconds by number of concurrent users, for 2, 4, and 8 CPs; Small workload]

Slide 12: WebFOCUS 7.6 Performance Statistics for z-Linux
Filters and specs:
  Operating System: SUSE SLES 9 under z/VM 5.2
  Memory: 2 GB
  RDBMS: DB2
  Rows Returned: 61
  Number of CPUs: 2, 4, 8
  Protocol: TCP
  Access Method: CLI
  Concurrent Users: 10, 25, 50, 75, 100, 200, 500
  Keep Alive: 60
  Interval: .05
[Chart: Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users]

Slide 13: Test Env. – 1 (all on Linux), Scenario – 1 (WebFOCUS Reporting Server, DB2 Connect, Workload-Small)
Slide 14: Test Env. – 1 (all on Linux), Scenario – 1 (WebFOCUS Reporting Server, DB2 Connect, Workload-Large)
[Table: response time in seconds by number of concurrent users, for 2, 4, and 8 CPs; Large workload]
** Linux system swap occurred

Slide 15: WebFOCUS 7.6 Performance Statistics for z-Linux
Filters and specs:
  Operating System: SUSE SLES 9 under z/VM 5.2
  Memory: 2 GB
  RDBMS: DB2
  Rows Returned: 3000
  Number of CPUs: 2, 4, 8
  Protocol: TCP
  Access Method: CLI
  Concurrent Users: 10, 25, 50, 75, 100, 200, 500
  Keep Alive: 60
  Interval: .05
[Chart: Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users]

Slide 16: Test Env. – 1 (all on Linux), Scenario – 1 (WebFOCUS Reporting Server, DB2 Connect, Workload-Large)
Slide 17: Test Env. – 1 (all on Linux), Scenario – 1 (WebFOCUS Reporting Server, DB2 Connect, Workload-Complex)
[Table: response time in seconds by number of concurrent users, for 2, 4, and 8 CPs; Complex workload]

Slide 18: WebFOCUS 7.6 Performance Statistics for z-Linux
Filters and specs:
  Operating System: SUSE SLES 9 under z/VM 5.2
  Memory: 2 GB
  CPU Speed: 550 MIPS
  Rows Returned: 5118
  Number of CPUs: 2, 4, 8
  Protocol: TCP
  Access Method: CLI
  Concurrent Users: 10, 25, 50, 75, 100, 200, 500
  Keep Alive: 60
  Interval: .05
[Chart: Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users]

Slide 19: Test Env. – 1 (all on Linux), Scenario – 1 (WebFOCUS Reporting Server, DB2 Connect, Workload-Complex)
Slide 20: Test Env. – 1 (all on Linux), Scenario – 2 (WAS, WebFOCUS Reporting Server, DB2 Connect, Workload-Small)
[Table: response time in seconds by number of concurrent users, for 2, 4, and 8 CPs; Small workload]
** Switched to use JLINK

Slide 21: Filters and specs:
  Operating System: SUSE SLES 9 under z/VM 5.2
  Memory: 2 GB
  Rows Returned: 61
  Number of CPUs: 2, 4, 8
  Protocol: SERVLET
  Access Method: CLI
  Concurrent Users: 10, 25, 50, 75, 100, 200, 500
  Keep Alive: 60
  Interval: .05
[Chart: Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users]

Slide 22: Test Env. – 1 (all on Linux), Scenario – 2 (WAS, WebFOCUS Reporting Server, DB2 Connect, Workload-Small)
Slide 23: Test Env. – 1 (all on Linux), Scenario – 2 (WAS, WebFOCUS Reporting Server, DB2 Connect, Workload-Large)
[Table: response time in seconds by number of concurrent users, for 2, 4, and 8 CPs; Large workload]
** Switched to use JLINK

Slide 24: [Chart: Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users]

Slide 25: Test Env. – 1 (all on Linux), Scenario – 2 (WAS, WebFOCUS Reporting Server, DB2 Connect, Workload-Large)
Slide 26: Test Env. – 1 (all on Linux), Scenario – 2 (WAS, WebFOCUS Reporting Server, DB2 Connect, Workload-Complex)
[Table: response time in seconds by number of concurrent users, for 2, 4, and 8 CPs; Complex workload]

Slide 27: [Chart: Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users]

Slide 28: Test Env. – 1 (all on Linux), Scenario – 2 (WAS, WebFOCUS Reporting Server, DB2 Connect, Workload-Complex)
Slide 29: Test Env. – 1 (all on Linux), Scenario – 3 (iWay Service Manager, DB2 JDBC Type 4, Workload-Small)
[Table: response time in seconds by number of concurrent users, for 2, 4, and 8 CPs; Small workload]
* JVM size 1024 MB

Slide 30: [Chart: Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users]

Slide 31: Test Env. – 1 (all on Linux), Scenario – 3 (iWay Service Manager, DB2 JDBC Type 4, Workload-Small)
Slide 32: Test Env. – 1 (all on Linux), Scenario – 3 (iWay Service Manager, DB2 JDBC Type 4, Workload-Large)
[Table: response time in seconds by number of concurrent users, for 2, 4, and 8 CPs; Large workload]
* JVM size 1024 MB

Slide 33: [Chart: Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users]

Slide 34: Test Env. – 1 (all on Linux), Scenario – 3 (iWay Service Manager, DB2 JDBC Type 4, Workload-Large)
Slide 35: Benchmark Test Environment – 2 (App and driver on Linux, DB on z/OS)
Slide 36: Benchmark Test Environment – 2 Scenarios
Scenario 1:
- IBILIN02: 2/4/8 CP, 2 GB, WebFOCUS Reporting Server 7.6, DB2 Connect, native data driver (CLI)
- IBI1 (z/OS): 8 CP, 8 GB, DB2, 2 tables
Slide 37: Test Env. – 2, Scenario – 1 (App on Linux, DB on z/OS) (WebFOCUS Reporting Server, DB2, WL=Small)
[Table: response time in seconds by number of concurrent users, for 2, 4, and 8 CPs; Small workload]

Slide 38: [Chart: Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users]

Slide 39: Test Env. – 2, Scenario – 1 (App on Linux, DB on z/OS) (WebFOCUS Reporting Server, DB2, WL=Small)
Slide 40: Test Env. – 2, Scenario – 1 (App on Linux, DB on z/OS) (WebFOCUS Reporting Server, DB2, WL=Large)
[Table: response time in seconds by number of concurrent users, for 2, 4, and 8 CPs; Large workload]

Slide 41: [Chart: Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users]

Slide 42: Test Env. – 2, Scenario – 1 (App on Linux, DB on z/OS) (WebFOCUS Reporting Server, DB2, WL=Large)
Slide 43: Benchmark Test Environment – 3 (WAS, WebFOCUS Reporting Server, iWay Service Manager – IBI2; DB on z/OS – IBI1)
Slide 44: Benchmark Test Environment – 3 Scenarios
Scenario 1:
- IBI2: 2/4/8 CP, 8 GB, iWay Service Manager 5.5, JDBC Type 4 driver
- IBI1: 2/4/8 CP, 8 GB, DB2, 2 tables
Scenario 2:
- IBI2: 2/4/8 CP, 8 GB, WAS 6.1, WebFOCUS Reporting Server 7.6, WebFOCUS Client, CLI
- IBI1: 2/4/8 CP, 8 GB, DB2, 6 tables
- IBI2 communicates with IBI1 via HiperSockets; 1 zIIP engine
Scenario 3:
- IBI2: 2/4/8 CP, 8 GB, WebFOCUS Reporting Server 7.6, CLI
- IBI1: 2/4/8 CP, 8 GB, DB2, 6 tables
- IBI2 communicates with IBI1 via HiperSockets; 1 zIIP engine
Slide 45: Test Env – 3 (2 separate z/OS), Scenario – 1 (ISM, JDBC Type 4, WL=Small, 1 zIIP)
[Table: response time in seconds by number of concurrent users, for 2, 4, and 8 CPs; Small workload]

Slide 46: [Chart: Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users]

Slide 47: Test Env – 3 (2 separate z/OS), Scenario – 1 (ISM, JDBC Type 4, WL=Small, 1 zIIP)
Slide 48: Test Env – 3 (2 separate z/OS), Scenario – 1 (ISM, JDBC Type 4, WL=Large, 1 zIIP)
[Table: response time in seconds by number of concurrent users, for 2, 4, and 8 CPs; Large workload]

Slide 49: [Chart: Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users]

Slide 50: Test Env – 3 (2 separate z/OS), Scenario – 1 (ISM, JDBC Type 4, WL=Large, 1 zIIP)
Slide 51: Test Env – 3 (2 separate z/OS), Scenario – 2 (WAS, WebFOCUS Reporting Server, CLI, WL=Small, 1 zIIP)
[Table: response time in seconds by number of concurrent users, for 2, 4, and 8 CPs; Small workload]
* 4 Clustered WAS Application Servers

Slide 52: [Chart: Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users]

Slide 53: Test Env – 3 (2 separate z/OS), Scenario – 2 (WAS, WebFOCUS Reporting Server, CLI, WL=Small, 1 zIIP)
Slide 54: Test Env – 3 (2 separate z/OS), Scenario – 2 (WAS, WebFOCUS Reporting Server, CLI, WL=Large, 1 zIIP)
[Table: response time in seconds by number of concurrent users, for 2, 4, and 8 CPs; Large workload]

Slide 55: [Chart: Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users]

Slide 56: Test Env – 3 (2 separate z/OS), Scenario – 2 (WAS, WebFOCUS Reporting Server, CLI, WL=Large, 1 zIIP)
Slide 57: Test Env – 3 (2 separate z/OS), Scenario – 3 (WebFOCUS Reporting Server, CLI, WL=Small, 1 zIIP)
[Table: response time in seconds by number of concurrent users, for 2, 4, and 8 CPs; Small workload]

Slide 58: [Chart: Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users]

Slide 59: Test Env – 3 (2 separate z/OS), Scenario – 3 (WebFOCUS Reporting Server, CLI, WL=Small, 1 zIIP)
Slide 60: Test Env – 3 (2 separate z/OS), Scenario – 3 (WebFOCUS Reporting Server, CLI, WL=Large, 1 zIIP)
[Table: response time in seconds by number of concurrent users, for 2, 4, and 8 CPs; Large workload]

Slide 61: [Chart: Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users]

Slide 62: Test Env – 3 (2 separate z/OS), Scenario – 3 (WebFOCUS Reporting Server, CLI, WL=Large, 1 zIIP)
Slide 63: Test Env – 3 (2 separate z/OS), Scenario – 3 (WebFOCUS Reporting Server, CLI, WL=Complex, 1 zIIP)
[Table: response time in seconds by number of concurrent users, for 2, 4, and 8 CPs; Complex workload]
* RMF report indicated that 1 zIIP engine utilization was at 100%

Slide 64: [Chart: Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users]

Slide 65: Test Env – 3 (2 separate z/OS), Scenario – 3 (WebFOCUS Reporting Server, CLI, WL=Complex, 1 zIIP)
Slide 66: Proven Value of the zIIP Specialty Engine
[Two SDSF DA thread-summary screens for system IBI1, captured ten minutes apart (06/11/17 01:04 and 01:14), listing the DB2 address spaces DSN8MSTR, DSN8DIST, DSN8IRLM, DSN8DBM1, and DSN8SPAS with real storage, paging, SCPU%, ECPU-Time, ECPU%, and SzIIP% columns; the nonzero SzIIP% values show DB2 work being redirected to the zIIP engine]
Slide 67: zIIP Actual Versus Projected
CP configuration: IBI1 - 8 CPs, 1 zIIP; IBI2 - 8 CPs, 0 zIIP
[Two tables, one for the WebFOCUS Large workload and one for the Complex Query, reporting by number of users: IBI1 time, CPU%, zIIP%, IIPCP, and I/O rate alongside IBI2 CPU% and I/O rate]
Slide 68: Test Env – 3 (2 separate z/OS), Scenario – 3b (WebFOCUS Reporting Server, CLI, WL=Complex, 8 CP, vary zIIPs)
[Table: response time in seconds by number of concurrent users for the Complex workload, with no zIIP, 1 zIIP, 3 zIIPs, and 6 zIIPs]

Slide 69: Test Env – 3 (2 separate z/OS), Scenario – 3b (WebFOCUS Reporting Server, CLI, WL=Complex, 8 CP, vary zIIPs)
Slide 70: WSC Benchmark Team
Mary Hu, John Bishop, Richard Lewis, Joe Consorti, Jennie Liang, John Goodyear, Kenneth Hain, Glenn Materia, Dennis McDonald
Slide 71: Questions?
Slide 72: Appendix A: Workload-Small Query

SELECT T1."F3SSN", T1."F3ALPHA10", T1."F3INTEGER5",
       T1."F3FLOAT9X8", T1."F3DBL15X2"
FROM TST2MLN T1
WHERE (T1."F3SSN" <= ' ')
FOR FETCH ONLY;
Slide 73: Appendix A: Workload-Large Query

SELECT T1."F3SSN", T1."F3ALPHA10", T1."F3INTEGER5",
       T1."F3FLOAT9X8", T1."F3DBL15X2"
FROM TST2MLN T1
WHERE (T1."F3SSN" <= ' ')
FOR FETCH ONLY;
Slide 74: Appendix A: Workload-Complex Query

SELECT T1."M_MKTING_NBR", T1."M_TRIP", T1."M_STAT",
       T1."M_VISIT_POINTER", T4."M_C_USE", T4."M_C_CONC_CD",
       T4."EFFECT_DATE", T4."EXPIRE_DATE",
       MAX(T1."TRIP_CODE"), MAX(T3."TRIP_NAME"),
       MAX(T1."M_DELVRY_DATE"), MAX(T1."DELVRY_YEAR"),
       SUM(T4."CUSTOMER_COUNT")
FROM ( ( ( MKTING_GLOBAL T1
           INNER JOIN TRIP_GLOBAL T2
                   ON T2."VYD_TRIP" = T1."M_TRIP" )
         LEFT OUTER JOIN TRANSPORT_NAME_GLOBAL T3
                      ON T3."TRANSPORT_CODE" = T1."TRANSPORT_CODE" )
       INNER JOIN MKTING_CUSTMR_SURVEY T4
               ON T4."M_MKTING_NBR" = T1."M_MKTING_NBR"
              AND T4."M_TRIP" = T1."M_TRIP" )
WHERE (T1."BUSS_CODE" = 'A')
  AND (T2."VYD_TRIP_STAT" IN ('A', 'H'))
  AND (T2."REPORT_YEAR" = '2006')
  AND (T4."EXPIRE_DATE" > ' ')
  AND (T4."EFFECT_DATE" <= ' ')
  AND (T4."M_C_CUSTMR_STAT" = 'C')
  AND (((T1."M_GROUP_TYPE" = 'H') AND (T4."M_C_CUSTMR_ID" = '2'))
       OR ((T1."M_GROUP_TYPE" <> 'H') OR T1."M_GROUP_TYPE" IS NULL))
GROUP BY T1."M_MKTING_NBR", T1."M_TRIP", T1."M_STAT", T1."M_VISIT_POINTER",
         T4."M_C_USE", T4."M_C_CONC_CD", T4."EFFECT_DATE", T4."EXPIRE_DATE"
ORDER BY T1."M_MKTING_NBR", T1."M_TRIP", T1."M_STAT",
         T1."M_VISIT_POINTER", T4."M_C_USE", T4."M_C_CONC_CD",
         T4."EFFECT_DATE", T4."EXPIRE_DATE"
FOR FETCH ONLY;