Fine Tuning WebFOCUS for the IBM Mainframe (zSeries, System z9)
Mark Nesson, June 2008
Copyright 2007, Information Builders. Slide 2 Why WebFOCUS for z
- Runs natively on MVS, Linux
- IBM has brand-new specialty engines you can take advantage of
- Ability to create partitions on z to centralize business intelligence on a single server – where the databases and applications reside
Copyright 2007, Information Builders. Slide 3 Information Builders products used in benchmark
- WebFOCUS
- iWay Software: iWay Service Manager, a unique and powerful Enterprise Service Bus (ESB) that can be invoked as Web services to provide event-driven integration and B2B interaction management
Copyright 2007, Information Builders. Slide 4 There’s an easier way!
[Architecture diagram: HTTP clients reach a Web server (ibi_html, ibi_bid, approot) and an app server/servlet container (ibi_apps, rcaster, basedir, worp) over HTTP/HTTPS; those tiers talk to the WebFOCUS Reporting Server (adapters, focexecs, synonyms, data, reports) over HTTP/HTTPS or a proprietary TCP protocol; the Reporting Server, the ReportCaster Distribution Server, and the RC Repository reach the RDBMS/DB servers via TCP with a client driver or JDBC.]
Copyright 2007, Information Builders. Slide 5 Benchmark Objectives
- Test WebFOCUS on the proven strengths of System z hardware running the Linux open source OS with UDB, and z/OS with DB2.
- Evaluate the scalability and performance of WebFOCUS and iWay Service Manager in all operating system environments on the IBM z server, and the benefit of using the various specialty engines on the IBM z server.
- All test configurations accurately and faithfully replicated prior benchmarks run on other UNIX and Windows platforms. Test results therefore represent the true performance of the WebFOCUS workload on that hardware vendor's machine.
- Testing was done at IBM Gaithersburg (Washington Systems Center) in November 2006 by a combined team from IBM and Information Builders.
Copyright 2007, Information Builders. Slide 6 UDB and DB2 database size
- Linux system IBILIN03 (under z/VM) and z/OS system IBI1 are used in the various benchmark configurations to host the test databases.
- Two databases are defined on IBILIN03 and IBI1, with multiple tables defined in each database: one with 2 million rows of data and one with 7 million rows of data.
- Each row is 256 bytes long, so the two tables hold roughly 512 MB and 1.8 GB of raw data, respectively.
Copyright 2007, Information Builders. Slide 7 Benchmark Test Workload used
- Workload-small: query retrieving 61 rows of data
- Workload-large: query retrieving 3000 rows of data
- Workload-complex: CPU-intensive query involving a 4-table join, retrieving 5118 rows
(See Appendix A for the SQL behind each workload.)
Copyright 2007, Information Builders. Slide 8 How IBI tests were measured
For each given test configuration, Information Builders used the same parameter settings (e.g., interval time and keep-alive time) to run the small, large, and complex workloads while varying the number of concurrent active users, and then measured the end-to-end user response time. A minimal sketch of such a measurement loop follows.
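The deck does not show the actual IBI harness, so the following is a minimal, hypothetical Java sketch of the loop just described: a fixed pool of concurrent users repeatedly requests a report URL, pacing requests by an interval, and the average end-to-end response time is reported. REPORT_URL, USERS, INTERVAL_MS, and REQUESTS_PER_USER are illustrative placeholders, not the benchmark's settings.

// Hypothetical load-measurement sketch -- not the actual IBI harness.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.LongAdder;

public class LoadSketch {
    static final String REPORT_URL = "http://ibilin01:8080/ibi_apps/run";  // placeholder URL
    static final int USERS = 200;           // concurrent active users (varied per run)
    static final long INTERVAL_MS = 50;     // pacing interval between requests ("Interval: .05")
    static final int REQUESTS_PER_USER = 20;

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();  // reuses connections (keep-alive) by default
        ExecutorService pool = Executors.newFixedThreadPool(USERS);
        LongAdder totalMs = new LongAdder();
        LongAdder count = new LongAdder();
        for (int u = 0; u < USERS; u++) {
            pool.submit(() -> {
                for (int r = 0; r < REQUESTS_PER_USER; r++) {
                    try {
                        long t0 = System.nanoTime();
                        HttpRequest req = HttpRequest.newBuilder(URI.create(REPORT_URL)).GET().build();
                        client.send(req, HttpResponse.BodyHandlers.ofString());
                        totalMs.add((System.nanoTime() - t0) / 1_000_000);  // end-to-end time
                        count.increment();
                        Thread.sleep(INTERVAL_MS);
                    } catch (Exception e) { /* a real harness would record errors */ }
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.HOURS);
        System.out.printf("avg end-to-end response: %.3f s%n",
                totalMs.doubleValue() / count.doubleValue() / 1000.0);
    }
}

Varying USERS across runs (50, 100, 200, 500) and recording the average would reproduce the shape of the response-time tables that follow.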
Copyright 2007, Information Builders. Slide 9 Test Environment – 1 (all on Linux)
Copyright 2007, Information Builders. Slide 10 Test Environment – 1 (Linux) Scenarios
Scenario 1:
- IBILIN02: 2/4/8 CP, 2 GB, WebFOCUS 7.6 Reporting Server, DB2 Connect
- IBILIN03: 2/4 CP, 2 GB, UDB 8.2
Scenario 2:
- IBILIN01: 2/4 CP, 2/4 GB, WAS 6.0.2.17, WebFOCUS Client 7.6
- IBILIN02: 2/4/8 CP, 2 GB, WebFOCUS 7.6 Reporting Server, DB2 Connect
- IBILIN03: 2/4 CP, 2 GB, UDB 8.2
Scenario 3:
- IBILIN02: 2/4/8 CP, 2 GB, iWay Service Manager, DB2 JDBC Type 4
- IBILIN03: 2/4 CP, 2 GB, UDB 8.2
Copyright 2007, Information Builders. Slide 11 Test Env. – 1 (Linux), Scenario – 1 (WebFOCUS Reporting Server, DB2 Connect, Workload-small)
Response time in seconds:
# users | 2 CP  | 4 CP  | 8 CP  | Workload Type
50      | 0.616 | 0.407 | ----- | Small
100     | 1.205 | 0.435 | 0.223 | Small
200     | 2.388 | 0.793 | 1.321 | Small
500     | 3.726 | 1.946 | ----- | Small
Copyright 2007, Information Builders. Slide 12 WebFOCUS 7.6 Performance Statistics for z-Linux: Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users
Filters and Specs:
- Operating System: SUSE SLES 9 under z/VM 5.2
- Memory: 2 GB
- RDBMS: DB2
- Rows Returned: 61
- Number of CPUs: 2, 4, 8
- Protocol: TCP
- Access Method: CLI
- Concurrent Users: 10, 25, 50, 75, 100, 200, 500
- Keep Alive: 60
- Interval: .05
Copyright 2007, Information Builders. Slide 13 Test Env. – 1 (all on Linux), Scenario – 1 (WebFOCUS Reporting Server, DB2 Connect, Workload-small)
Copyright 2007, Information Builders. Slide 14 Test Env. – 1 (all on Linux), Scenario – 1 (WebFOCUS Reporting Server, DB2 Connect, WL-large)
Response time in seconds:
# users | 2 CP   | 4 CP  | 8 CP    | Workload Type
50      | 1.829  | 0.941 | -----   | Large
100     | 3.529  | 1.867 | 1.013   | Large
200     | 6.095  | 3.323 | 6.883** | Large
500     | 17.032 | 5.487 | -----   | Large
** Linux system swap occurred
Copyright 2007, Information Builders. Slide 15 WebFOCUS 7.6 Performance Statistics for z-Linux: Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users
Filters and Specs:
- Operating System: SUSE SLES 9 under z/VM 5.2
- Memory: 2 GB
- RDBMS: DB2
- Rows Returned: 3000
- Number of CPUs: 2, 4, 8
- Protocol: TCP
- Access Method: CLI
- Concurrent Users: 10, 25, 50, 75, 100, 200, 500
- Keep Alive: 60
- Interval: .05
Copyright 2007, Information Builders. Slide 16 Test Env.– 1 (all on Linux), Scenario – 1 (WebFOCUS Reporting Server, DB2 Connect, Workload-Large)
Copyright 2007, Information Builders. Slide 17 Test Env. – 1 (all on Linux), Scenario – 1 (WebFOCUS Reporting Server, DB2 Connect, WL-Complex)
Response time in seconds:
# users | 2 CP   | 4 CP   | 8 CP   | Workload Type
10      | 4.676  | 3.133  | -----  | Complex
25      | 11.295 | 5.813  | -----  | Complex
50      | 24.443 | 11.333 | 8.963  | Complex
100     | 40.467 | 25.175 | 14.92  | Complex
200     | 80.928 | 60.784 | -----  | Complex
Copyright 2007, Information Builders. Slide 18 WebFOCUS 7.6 Performance Statistics for z-Linux: Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users
Filters and Specs:
- Operating System: SUSE SLES 9 under z/VM 5.2
- Memory: 2 GB
- CPU Speed: 550 MIPS
- Rows Returned: 5118
- Number of CPUs: 2, 4, 8
- Protocol: TCP
- Access Method: CLI
- Concurrent Users: 10, 25, 50, 75, 100, 200, 500
- Keep Alive: 60
- Interval: .05
Copyright 2007, Information Builders. Slide 19 Test Env. – 1 (all on Linux), Scenario – 1 (WebFOCUS Reporting Server, DB2 Connect, WL-Complex)
Copyright 2007, Information Builders. Slide 20 Test Env. – 1 (all on Linux), Scenario – 2 (WAS, WebFOCUS Reporting Server, DB2 Connect, WL-small)
Response time in seconds:
# users | 2 CP  | 4 CP    | 8 CP  | Workload Type
50      | 0.799 | -----   | ----- | Small
100     | 1.378 | 0.798   | 0.624 | Small
200     | 2.594 | -----   | 2.065 | Small
500     | 5.417 | 3.659** | ----- | Small
** Switched to use JLINK
Copyright 2007, Information Builders. Slide 21 Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users
Filters and Specs:
- Operating System: SUSE SLES 9 under z/VM 5.2
- Memory: 2 GB
- Rows Returned: 61
- Number of CPUs: 2, 4, 8
- Protocol: SERVLET
- Access Method: CLI
- Concurrent Users: 10, 25, 50, 75, 100, 200, 500
- Keep Alive: 60
- Interval: .05
Copyright 2007, Information Builders. Slide 22 Test Env. – 1 (all on Linux), Scenario – 2 (WAS, WebFOCUS Reporting Server, DB2 Connect, WL-small)
Copyright 2007, Information Builders. Slide 23 Test Env. – 1 (all on Linux), Scenario – 2 (WAS, WebFOCUS Reporting Server, DB2 Connect, WL-large)
Response time in seconds:
# users | 2 CP   | 4 CP     | 8 CP  | Workload Type
50      | 1.913  | -----    | ----- | Large
100     | 3.702  | 2.45     | ----- | Large
200     | 7.279  | -----    | ----- | Large
500     | 14.966 | 11.857** | ----- | Large
** Switched to use JLINK
Copyright 2007, Information Builders. Slide 24 Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users
Copyright 2007, Information Builders. Slide 25 Test Env. – 1 (all on Linux), Scenario – 2 (WAS, WebFOCUS Reporting Server, DB2 Connect, WL-large)
Copyright 2007, Information Builders. Slide 26 Test Env. – 1 (all on Linux), Scenario – 2 (WAS, WebFOCUS Reporting Server, DB2 Connect, WL-Complex)
Response time in seconds:
# users | 2 CP   | 4 CP   | 8 CP   | Workload Type
10      | 11.578 | 2.151  | -----  | Complex
25      | 28.438 | 5.047  | -----  | Complex
50      | -----  | 10.826 | 7.121  | Complex
75      | -----  | 15.312 | -----  | Complex
100     | -----  | 19.791 | 14.056 | Complex
Copyright 2007, Information Builders. Slide 27 Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users
Copyright 2007, Information Builders. Slide 28 Test Env. – 1 (all on Linux), Scenario – 2 (WAS, WebFOCUS Reporting Server, DB2 Connect, WL-Complex)
Copyright 2007, Information Builders. Slide 29 Test Env. – 1 (all on Linux), Scenario – 3 (iWay Service Manager, DB2 JDBC Type 4, WL-Small)
Response time in seconds:
# users | 2 CP   | 4 CP   | 8 CP   | Workload Type
25      | 1.731  | 0.95*  | 0.734* | Small
50      | 3.707  | 1.796* | 1.221* | Small
100     | 7.59   | 3.451* | 2.62*  | Small
200     | 11.799 | 6.255* | 4.71*  | Small
500     | 18.333 | 9.807* | 7.34*  | Small
* JVM size 1024 MB
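Scenario 3 drives UDB through the pure-Java JDBC Type 4 driver instead of the CLI/DB2 Connect path of Scenarios 1 and 2. As a hedged illustration of what that access path looks like (host, port, database name, and credentials below are placeholders, not the benchmark's values; the query reuses the Workload-small SQL from Appendix A):

// Illustrative DB2 JDBC Type 4 connection sketch; placeholder connection details.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class Type4Sketch {
    public static void main(String[] args) throws Exception {
        // The Type 4 driver speaks DRDA directly to the database over TCP,
        // so no local DB2 Connect gateway is needed (unlike the CLI path).
        String url = "jdbc:db2://ibilin03:50000/TESTDB";   // placeholder host/port/db
        try (Connection con = DriverManager.getConnection(url, "db2user", "secret");
             PreparedStatement ps = con.prepareStatement(
                 "SELECT F3SSN, F3ALPHA10 FROM TST2MLN WHERE F3SSN <= ? FOR FETCH ONLY")) {
            ps.setString(1, "000000061");   // Workload-small predicate (see Appendix A)
            try (ResultSet rs = ps.executeQuery()) {
                int rows = 0;
                while (rs.next()) rows++;
                System.out.println("rows fetched: " + rows);
            }
        }
    }
}

Running this requires the IBM JCC driver (db2jcc4.jar) on the classpath; a heap setting such as -Xmx1024m would mirror the "JVM size 1024 MB" footnote above.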
Copyright 2007, Information Builders. Slide 30 Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users
Copyright 2007, Information Builders. Slide 31 Test Env. – 1 (all on Linux), Scenario – 3 (iWay Service Manager, DB2 JDBC type 4, WL-Small)
Copyright 2007, Information Builders. Slide 32 Test Env. – 1 (all on Linux), Scenario – 3 (iWay Service Manager, DB2 JDBC Type 4, WL-Large)
Response time in seconds:
# users | 2 CP     | 4 CP     | 8 CP     | Workload Type
25      | 81.522   | 44.842*  | 32.34*   | Large
50      | 188.558  | 90.319*  | 74.42*   | Large
100     | 369.617* | 235.208* | 182.265* | Large
200     | -----    | -----    | -----    | Large
500     | -----    | -----    | -----    | Large
* JVM size 1024 MB
Copyright 2007, Information Builders. Slide 33 Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users
Copyright 2007, Information Builders. Slide 34 Test Env. – 1 (all on Linux), Scenario – 3 (iWay Service Manager, DB2 JDBC type 4, WL-Large)
Copyright 2007, Information Builders. Slide 35 Benchmark Test Environment – 2 (App and driver on Linux, DB on z/OS)
Copyright 2007, Information Builders. Slide 36 Benchmark Test Environment – 2 Scenarios
Scenario 1:
- IBILIN02: 2/4/8 CP, 2 GB, WebFOCUS Reporting Server 7.6, DB2 Connect, Native Data Driver (CLI)
- IBI1 (z/OS): 8 CP, 8 GB, DB2, 2 tables
Copyright 2007, Information Builders. Slide 37 Test Env. – 2, Scenario – 1 (App on Linux, DB on z/OS) (WebFOCUS Reporting Server, DB2, WL=Small)
Response time in seconds:
# users | 2 CP  | 4 CP  | 8 CP  | Workload Type
50      | 0.896 | 0.444 | 0.214 | Small
100     | 1.78  | 0.875 | 0.435 | Small
200     | 3.276 | 1.885 | 0.721 | Small
500     | 5.166 | 2.807 | 1.50  | Small
Copyright 2007, Information Builders. Slide 38 Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users
Copyright 2007, Information Builders. Slide 39 Test Env. – 2, Scenario – 1 (App on Linux, DB on z/OS) (WebFOCUS Reporting Server, DB2, WL=Small)
Copyright 2007, Information Builders. Slide 40 Test Env. – 2, Scenario – 1 (App on Linux, DB on z/OS) (WebFOCUS Reporting Server, DB2, WL=Large)
Response time in seconds:
# users | 2 CP   | 4 CP   | 8 CP   | Workload Type
50      | 18.417 | 9.812  | 4.998  | Large
100     | 33.183 | 17.145 | 10.278 | Large
200     | 49.23  | 34.74  | -----  | Large
500     | -----  | -----  | 18.369 | Large
Copyright 2007, Information Builders. Slide 41 Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users
Copyright 2007, Information Builders. Slide 42 Test Env. – 2, Scenario – 1 (App on Linux, DB on z/OS) (WebFOCUS Reporting Server, DB2, WL=Large)
Copyright 2007, Information Builders. Slide 43 Benchmark Test Environment – 3 (WAS, WebFOCUS Reporting Server, iWay Service Manager – IBI2, DB on z/OS – IBI1)
Copyright 2007, Information Builders. Slide 44 Benchmark Test Environment – 3 Scenarios
Scenario 1:
- IBI2: 2/4/8 CP, 8 GB, ISM 5.5, JDBC Type-4 Driver
- IBI1: 2/4/8 CP, 8 GB, DB2, 2 tables
Scenario 2:
- IBI2: 2/4/8 CP, 8 GB, WAS 6.1, WebFOCUS Reporting Server 7.6, WF Client, CLI
- IBI1: 2/4/8 CP, 8 GB, DB2, 6 tables
- IBI2 communicates with IBI1 via HiperSockets; 1 zIIP engine
Scenario 3:
- IBI2: 2/4/8 CP, 8 GB, WebFOCUS Reporting Server 7.6, CLI
- IBI1: 2/4/8 CP, 8 GB, DB2, 6 tables
- IBI2 communicates with IBI1 via HiperSockets; 1 zIIP engine
Copyright 2007, Information Builders. Slide 45 Test Env – 3 (2 separate z/OS), Scenario – 1 (ISM, JDBC Type4, WL=Small, 1 zIIP)
Response time in seconds:
# users | 2 CP  | 4 CP  | 8 CP  | Workload Type
25      | 2.032 | 0.716 | 0.56  | Small
50      | 3.22  | 1.329 | 1.11  | Small
100     | 6.782 | 2.715 | 2.4   | Small
200     | 15.34 | 5.864 | 5.2   | Small
500     | 22.54 | 7.54  | 6.391 | Small
Copyright 2007, Information Builders. Slide 46 Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users
Copyright 2007, Information Builders. Slide 47 Test Env – 3 (2 separate z/OS), Scenario – 1 (ISM, JDBC Type4, WL=Small, 1 zIIP)
Copyright 2007, Information Builders. Slide 48 Test Env – 3 (2 separate z/OS), Scenario – 1 (ISM, JDBC Type4, WL=Large, 1 zIIP)
Response time in seconds:
# users | 2 CP   | 4 CP    | 8 CP  | Workload Type
25      | 58.95  | 39.179  | 35.0  | Large
50      | 121.98 | 80.675  | 74.17 | Large
100     | 292.22 | 210.811 | 181.3 | Large
200     | -----  | -----   | ----- | Large
500     | -----  | -----   | ----- | Large
Copyright 2007, Information Builders. Slide 49 Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users
Copyright 2007, Information Builders. Slide 50 Test Env – 3 (2 separate z/OS), Scenario – 1 (ISM, JDBC Type4, WL=Large, 1 zIIP)
Copyright 2007, Information Builders. Slide 51 Test Env – 3 (2 separate z/OS), Scenario – 2 (WAS, WebFOCUS Reporting Server, CLI, WL=Small, 1 zIIP)
Response time in seconds:
# users | 2 CP  | 4 CP  | 8 CP  | Workload Type
25      | ----- | ----- | ----- | Small
50      | 3.639 | ----- | ----- | Small
100     | 4.616 | ----- | ----- | Small
200     | 9.224 | ----- | ----- | Small
500     | 21.20 | ----- | ----- | Small
* 4 Clustered WAS Application Servers
Copyright 2007, Information Builders. Slide 52 Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users
Copyright 2007, Information Builders. Slide 53 Test Env – 3 (2 separate z/OS), Scenario – 2 (WAS, WebFOCUS Reporting Server, CLI, WL=Small, 1 zIIP)
Copyright 2007, Information Builders. Slide 54 Test Env – 3 (2 separate z/OS), Scenario – 2 (WAS, WebFOCUS Reporting Server, CLI, WL=Large, 1 zIIP)
Response time in seconds:
# users | 2 CP   | 4 CP  | 8 CP  | Workload Type
25      | -----  | ----- | ----- | Large
50      | 6.353  | ----- | ----- | Large
100     | 11.756 | ----- | ----- | Large
200     | 22.226 | ----- | ----- | Large
500     | 35.249 | ----- | ----- | Large
Copyright 2007, Information Builders. Slide 55 Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users
Copyright 2007, Information Builders. Slide 56 Test Env – 3 (2 separate z/OS), Scenario – 2 (WAS, WebFOCUS Reporting Server, CLI, WL=Large, 1 zIIP)
Copyright 2007, Information Builders. Slide 57 Test Env – 3 (2 separate z/OS), Scenario – 3 (WebFOCUS Reporting Server, CLI, WL=Small, 1 zIIP)
Response time in seconds:
# users | 2 CP  | 4 CP  | 8 CP  | Workload Type
25      | ----- | ----- | ----- | Small
50      | 0.566 | 0.313 | 0.14  | Small
100     | 1.118 | 0.65  | 0.278 | Small
200     | 2.207 | 1.377 | 0.749 | Small
500     | 7.786 | 5.413 | 1.847 | Small
Copyright 2007, Information Builders. Slide 58 Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users
Copyright 2007, Information Builders. Slide 59 Test Env – 3 (2 separate z/OS), Scenario – 3 (WebFOCUS Reporting Server, CLI, WL=Small, 1 zIIP)
Copyright 2007, Information Builders. Slide 60 Test Env – 3 (2 separate z/OS), Scenario – 3 (WebFOCUS Reporting Server, CLI, WL=Large, 1 zIIP)
Response time in seconds:
# users | 2 CP   | 4 CP   | 8 CP  | Workload Type
25      | -----  | -----  | ----- | Large
50      | 5.575  | 3.001  | 1.755 | Large
100     | 9.615  | 6.756  | 3.674 | Large
200     | 18.874 | 12.159 | 4.824 | Large
500     | 33.525 | 27.501 | 7.582 | Large
Copyright 2007, Information Builders. Slide 61 Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users
Copyright 2007, Information Builders. Slide 62 Test Env – 3 (2 separate z/OS), Scenario – 3 (WebFOCUS Reporting Server, CLI, WL=Large, 1 zIIP)
Copyright 2007, Information Builders. Slide 63 Test Env – 3 (2 separate z/OS), Scenario – 3 (WebFOCUS Reporting Server, CLI, WL=Complex, 1 zIIP)
Response time in seconds:
# users | 2 CP  | 4 CP  | 8 CP     | Workload Type
25      | ----- | ----- | -----    | Complex
50      | ----- | ----- | 21.084   | Complex
100     | ----- | ----- | 42.295   | Complex
200     | ----- | ----- | 82.053*  | Complex
500     | ----- | ----- | 167.605* | Complex
* RMF report indicated that 1 zIIP engine utilization was at 100%
Copyright 2007, Information Builders. Slide 64 Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users
Copyright 2007, Information Builders. Slide 65 Test Env – 3 (2 separate z/OS), Scenario – 3 (WebFOCUS Reporting Server, CLI, WL=Complex, 1 zIIP)
Copyright 2007, Information Builders. Slide 66 Proven Value of zIIP Specialty Engine
Two SDSF DA snapshots of system IBI1 under load. DSN8DIST is the DB2 DDF address space that services the distributed DRDA requests – the category of work that is zIIP-eligible, as the SzIIP% column shows.

06/11/17 01:04  SDSF DA IBI1  PAG 0  CPU/L 100/100  PREFIX=DSN*
JOBNAME   SP  ASIDX  Real  Paging  SPAG  SCPU%  ECPU-Time  ECPU%  SzIIP%  Racf Id
DSN8MSTR  1   004D   3683  0.00    0     100    18.30      0.00   100     DB2USER
DSN8DIST  1   004F   11T   0.00    0     100    3590.05    95.80  100     DB2USER
DSN8IRLM  1   0050   1266  0.00    0     100    5.51       0.08   100     DB2USER
DSN8DBM1  1   0052   263T  0.00    0     100    349.40     1.66   100     DB2USER
DSN8SPAS  1   0053   1184  0.00    0     100    0.15       0.00   100     DB2USER

06/11/17 01:14  SDSF DA IBI1  PAG 0  CPU/L 100/100  PREFIX=DSN*
JOBNAME   SP  ASIDX  Real  Paging  SPAG  SCPU%  ECPU-Time  ECPU%  SzIIP%  Racf Id
DSN8MSTR  1   004D   3683  0.00    0     100    18.42      0.01   100     DB2USER
DSN8DIST  1   004F   12T   0.00    0     100    6366.16    95.70  100     DB2USER
DSN8IRLM  1   0050   1271  0.00    0     100    6.75       0.03   100     DB2USER
DSN8DBM1  1   0052   416T  0.00    0     100    401.31     3.36   100     DB2USER
DSN8SPAS  1   0053   1184  0.00    0     100    0.15       0.00   100     DB2USER
Copyright 2007, Information Builders. Slide 67 zIIP Actual Versus Projected
CP configuration: IBI1 – 8 CPs, 1 zIIP; IBI2 – 8 CPs, 0 zIIP

WebFOCUS Large:
Users | Time  | IBI1 CPU% | IBI1 zIIP% | IIPCP | IBI1 I/O Rate | IBI2 CPU% | IBI2 I/O Rate
50    | 00:45 | 3.2       | 14.04      | 0.94  | 11.1          | 71.3      | 2548
100   | 00:49 | 3.1       | 14.03      | 0.78  | 10.2          | 74.5      | 3651
200   | 00:53 | 3.6       | 16.86      | 1.21  | 10.8          | 85.3      | 2506
500   | 00:58 | 3.3       | 14.64      | 0.87  | 10.3          | 76.4      | 3437

Complex Query:
Users | Time  | IBI1 CPU% | IBI1 zIIP% | IIPCP  | IBI1 I/O Rate | IBI2 CPU% | IBI2 I/O Rate
50    | 01:05 | 84.3      | 90.22      | 290.43 | 1159          | 53.8      | 180.2
100   | 01:10 | 97.8      | 95.33      | 336.71 | 1941          | 51.0      | 200.0
200   | 01:16 | 92.0      | 93.20      | 305.94 | 2697          | 63.1      | 247.5
500   | 01:23 | 100.0     | 95.51      | 337.83 | 3117          | 48.5      | 111.1
Copyright 2007, Information Builders. Slide 68 Test Env – 3 (2 separate z/OS), Scenario – 3b (WebFOCUS Reporting Server, CLI, WL=Complex, 8 CP, Vary zIIPs)
Response time in seconds (Workload Type: Complex):
# users | NO zIIP | 1 zIIP  | 3 zIIP | 6 zIIP
25      | -----   | -----   | -----  | -----
50      | 22.105  | 21.084  | 18.814 | 17.413
100     | 44.095  | 42.295  | 37.786 | 33.351
200     | 94.362  | 82.053  | 79.838 | 68.262
500     | -----   | 167.605 | -----  | -----
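To make the zIIP effect concrete, the relative improvements at 200 concurrent users work out as simple ratios over the table above:

\[
\frac{94.362 - 82.053}{94.362} \approx 13.0\% \;\; (\text{no zIIP} \to 1\ \text{zIIP}), \qquad
\frac{82.053 - 68.262}{82.053} \approx 16.8\% \;\; (1\ \text{zIIP} \to 6\ \text{zIIPs})
\]

The response-time gain is modest because, per slide 67, the main benefit of the zIIP is offloading eligible CP time rather than shortening each request.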
Copyright 2007, Information Builders. Slide 69 Test Env – 3 (2 separate z/OS), Scenario – 3b (WebFOCUS Reporting Server, CLI, WL=Complex, 8 CP, Vary zIIPs)
Copyright 2007, Information Builders. Slide 70 WSC Benchmark Team
Mary Hu, John Bishop, Richard Lewis, Joe Consorti, Jennie Liang, John Goodyear, Kenneth Hain, Glenn Materia, Dennis McDonald
Copyright 2007, Information Builders. Slide 71 Questions?
Copyright 2007, Information Builders. Slide 72 Appendix A: Workload-Small Query
SELECT T1."F3SSN", T1."F3ALPHA10", T1."F3INTEGER5", T1."F3FLOAT9X8", T1."F3DBL15X2"
FROM TST2MLN T1
WHERE (T1."F3SSN" <= '000000061')
FOR FETCH ONLY;
Copyright 2007, Information Builders. Slide 73 Appendix A: Workload-Large Query
SELECT T1."F3SSN", T1."F3ALPHA10", T1."F3INTEGER5", T1."F3FLOAT9X8", T1."F3DBL15X2"
FROM TST2MLN T1
WHERE (T1."F3SSN" <= '000003000')
FOR FETCH ONLY;
Copyright 2007, Information Builders. Slide 74 Appendix A: Workload-Complex Query
SELECT T1."M_MKTING_NBR", T1."M_TRIP", T1."M_STAT", T1."M_VISIT_POINTER",
       T4."M_C_USE", T4."M_C_CONC_CD", T4."EFFECT_DATE", T4."EXPIRE_DATE",
       MAX(T1."TRIP_CODE"), MAX(T3."TRIP_NAME"), MAX(T1."M_DELVRY_DATE"),
       MAX(T1."DELVRY_YEAR"), SUM(T4."CUSTOMER_COUNT")
FROM ( ( ( MKTING_GLOBAL T1
  INNER JOIN TRIP_GLOBAL T2
    ON T2."VYD_TRIP" = T1."M_TRIP" )
  LEFT OUTER JOIN TRANSPORT_NAME_GLOBAL T3
    ON T3."TRANSPORT_CODE" = T1."TRANSPORT_CODE" )
  INNER JOIN MKTING_CUSTMR_SURVEY T4
    ON T4."M_MKTING_NBR" = T1."M_MKTING_NBR" AND T4."M_TRIP" = T1."M_TRIP" )
WHERE (T1."BUSS_CODE" = 'A')
  AND (T2."VYD_TRIP_STAT" IN ('A', 'H'))
  AND (T2."REPORT_YEAR" = '2006')
  AND (T4."EXPIRE_DATE" > '2007-02-17')
  AND (T4."EFFECT_DATE" <= '2006-10-17')
  AND (T4."M_C_CUSTMR_STAT" = 'C')
  AND (((T1."M_GROUP_TYPE" = 'H') AND (T4."M_C_CUSTMR_ID" = '2'))
    OR ((T1."M_GROUP_TYPE" <> 'H') OR T1."M_GROUP_TYPE" IS NULL))
GROUP BY T1."M_MKTING_NBR", T1."M_TRIP", T1."M_STAT", T1."M_VISIT_POINTER",
         T4."M_C_USE", T4."M_C_CONC_CD", T4."EFFECT_DATE", T4."EXPIRE_DATE"
ORDER BY T1."M_MKTING_NBR", T1."M_TRIP", T1."M_STAT", T1."M_VISIT_POINTER",
         T4."M_C_USE", T4."M_C_CONC_CD", T4."EFFECT_DATE", T4."EXPIRE_DATE"
FOR FETCH ONLY;