
Investigating QoS of Web Services by Distributed Evaluation Zibin Zheng Feb. 8, 2010 Department of Computer Science & Engineering.


1 Investigating QoS of Web Services by Distributed Evaluation
Zibin Zheng (zbzheng@cse.cuhk.edu.hk)
Feb. 8, 2010
Department of Computer Science & Engineering
The Chinese University of Hong Kong, Hong Kong, China

2 Outline
Background
Evaluation Results on Web Services

3 Web Applications
Web applications are becoming more and more important!

4 The Age of Web 2.0
Web 1.0: static HTML Web pages
Web 2.0: enriched Web applications (mash-able)
Web-Oriented Architecture (WOA): lightweight, consumer, REST, AJAX
Service-Oriented Architecture (SOA): enterprise, WSDL, SOAP, UDDI

5 Trends
Applications: desktop → Web
–Personal applications: Google Docs
–Enterprise systems: Salesforce
Data: local → online
–Public data: movies, software, music
–Private data: Google Docs, Salesforce

6 Next-Generation Web
–Web pages + APIs
–More and more Web services (Web APIs)
Performance investigation of Web services is important.

7 Quality-of-Service
Quality-of-Service (QoS): non-functional performance.
–User-independent QoS properties: price, popularity. No evaluation needed.
–User-dependent QoS properties: failure probability, response time, throughput. Different users observe different performance.
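The user-dependent QoS properties above can be derived from raw invocation records. A minimal sketch of these computations (the class, record, and method names are illustrative assumptions, not the evaluation code used in the study):

```java
import java.util.List;

// Illustrative sketch: computing user-dependent QoS properties
// (failure probability, response time, throughput) from the raw
// invocation records one user collects for one Web service.
public class QosMetrics {

    // One observed invocation of a Web service by one user.
    public record Invocation(boolean failed, double responseTimeSec, long responseBytes) {}

    // Failure probability: fraction of invocations that failed.
    public static double failureProbability(List<Invocation> invocations) {
        long failures = invocations.stream().filter(Invocation::failed).count();
        return (double) failures / invocations.size();
    }

    // Average response time over successful invocations (seconds).
    public static double avgResponseTime(List<Invocation> invocations) {
        return invocations.stream()
                .filter(inv -> !inv.failed())
                .mapToDouble(Invocation::responseTimeSec)
                .average()
                .orElse(Double.NaN);
    }

    // Average throughput over successful invocations (kbps):
    // bits transferred divided by elapsed response time.
    public static double avgThroughputKbps(List<Invocation> invocations) {
        return invocations.stream()
                .filter(inv -> !inv.failed())
                .mapToDouble(inv -> inv.responseBytes() * 8.0 / 1000.0 / inv.responseTimeSec())
                .average()
                .orElse(Double.NaN);
    }

    public static void main(String[] args) {
        List<Invocation> obs = List.of(
                new Invocation(false, 0.5, 2000),  // 2 KB in 0.5 s
                new Invocation(false, 1.0, 4000),  // 4 KB in 1.0 s
                new Invocation(true, 0.0, 0));     // a failed call
        System.out.println("failure probability = " + failureProbability(obs));
        System.out.println("avg response time   = " + avgResponseTime(obs) + " s");
        System.out.println("avg throughput      = " + avgThroughputKbps(obs) + " kbps");
    }
}
```

Because these metrics depend on the client's own network path, the same service yields different numbers for different users, which is exactly why they must be evaluated per user.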

8 QoS-Driven Approaches
Web service selection
Web service composition
Fault-tolerant Web services
Web service ranking
Web service recommendation
Problem: there are no real-world, large-scale Web service QoS datasets collected from distributed locations.

9 Real-World Web Service Evaluation
Obtain a list of publicly available Web services.
Write a large number of client programs.
–In total, 235,262,555 lines of Java code were produced in our experiments.
Monitor a large number of Web services.
Collect performance data from different locations.
Large-scale real-world Web service evaluation is not an easy task!

10 Contributions
21,358 Web service addresses were obtained by crawling Web service information from the Internet.
Two large-scale distributed evaluations were conducted, and first-hand experiences are reported.
–Dataset 1: 150 users × 100 Web services
–Dataset 2: 339 users × 5,825 Web services
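Each dataset can be viewed as a user-by-service matrix of observed QoS values, with entries missing where an invocation could not be measured. A minimal sketch of that representation (the class name and the use of -1 as a missing-value marker are assumptions for illustration):

```java
// Illustrative sketch: a user-by-service QoS matrix, the shape of the
// collected datasets (e.g. 150 users x 100 Web services for Dataset 1).
// Using -1 as the missing-value marker is an assumption, not a convention
// from the original datasets.
public class QosMatrix {
    public static final double MISSING = -1.0; // entry not observed

    private final double[][] values; // values[user][service]

    public QosMatrix(int users, int services) {
        values = new double[users][services];
        for (double[] row : values) java.util.Arrays.fill(row, MISSING);
    }

    public void set(int user, int service, double qos) { values[user][service] = qos; }
    public double get(int user, int service) { return values[user][service]; }

    // Density: fraction of user-service pairs actually observed.
    public double density() {
        int observed = 0, total = 0;
        for (double[] row : values)
            for (double v : row) { total++; if (v != MISSING) observed++; }
        return (double) observed / total;
    }

    public static void main(String[] args) {
        // Dataset 1 shape: 150 users x 100 Web services.
        QosMatrix rt = new QosMatrix(150, 100);
        rt.set(0, 0, 0.35); // user 0 observed 0.35 s response time on service 0
        System.out.println("density = " + rt.density());
    }
}
```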

11 Location Information
Among all 89 countries, the top 3 provide 55.5% of the 21,358 obtained Web services:
United States: 8,867 Web services
United Kingdom: 1,657 Web services
Germany: 1,246 Web services

12 Obtaining Web Service Addresses
Web service portals and directories:
–xmethods.net
–webservicex.net
–webservicelist.com
Web service search engines:
–seekda.com
–esynaps.com
In total, 21,358 addresses of WSDL files were obtained.
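Links crawled from such portals and search engines have to be filtered down to actual WSDL addresses. A minimal sketch of one plausible filter (the heuristics are assumptions for illustration, not the crawler used in the study):

```java
import java.util.List;
import java.util.stream.Collectors;

// Illustrative sketch: filtering crawled links down to likely WSDL
// addresses. These heuristics are assumptions, not the authors' crawler.
public class WsdlFilter {

    // Treat a link as a WSDL address if it ends in ".wsdl" or uses the
    // common "?wsdl" query-string convention.
    public static boolean looksLikeWsdl(String url) {
        String u = url.toLowerCase();
        return u.endsWith(".wsdl") || u.endsWith("?wsdl");
    }

    public static List<String> filter(List<String> crawled) {
        return crawled.stream()
                .filter(WsdlFilter::looksLikeWsdl)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> crawled = List.of(
                "http://example.com/service.wsdl",
                "http://example.com/Service.asmx?WSDL",
                "http://example.com/index.html");
        System.out.println(filter(crawled)); // keeps only the first two
    }
}
```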

13 WSDL File Information

14 Java Code Generation
Axis2 generated Java client code for 13,108 Web services.
In total, 235,262,555 lines of Java code were produced.

15 Dataset 1: Failure Probability

16 Overall Performance
The average failure probabilities of all 100 Web services and all 150 service users are larger than 0.
The standard deviation first grows as the mean increases, then begins to decrease after a certain threshold.
The standard deviations are large, indicating that the performance of the same Web service observed by different service users can vary widely.
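The statistics behind these observations are the mean and standard deviation of one service's failure probability across its users. A minimal sketch of that computation (a sketch for illustration, not the authors' analysis code; population standard deviation is assumed):

```java
// Illustrative sketch: mean and standard deviation of a Web service's
// failure probability as observed by different users. A large standard
// deviation means users see very different reliability for the service.
public class FailureStats {

    // Mean of the per-user failure probabilities of one Web service.
    public static double mean(double[] perUserFailureProb) {
        double sum = 0;
        for (double p : perUserFailureProb) sum += p;
        return sum / perUserFailureProb.length;
    }

    // Population standard deviation across users.
    public static double stdDev(double[] perUserFailureProb) {
        double m = mean(perUserFailureProb);
        double sq = 0;
        for (double p : perUserFailureProb) sq += (p - m) * (p - m);
        return Math.sqrt(sq / perUserFailureProb.length);
    }

    public static void main(String[] args) {
        // Failure probabilities of one service as seen by four users.
        double[] observed = {0.0, 0.1, 0.1, 0.4};
        System.out.println("mean = " + mean(observed)); // 0.15
        System.out.println("std  = " + stdDev(observed)); // 0.15
    }
}
```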

17 Failure Types
(1) Web service invocations can fail easily.
(2) Providing reliable Web services alone is not enough for building reliable service-oriented systems.
(3) Since Web service invocation failures are unavoidable in the unpredictable Internet environment, service fault tolerance approaches are becoming important.
(4) Service fault tolerance mechanisms should be deployed at the client side.

18 Dataset 2: Response Time
Web services with poor average response time tend to show large performance variance across different users.
A large response time for a Web service can be caused by long data-transfer time or long request-processing time at the server side.
Influenced by client-side network conditions, different service users observe quite different average response times for the same Web services.

19 Dataset 2: Throughput
Influenced by poor server-side network conditions, a small fraction of Web services provide very poor average throughput (<1 kbps).
Service users with large average throughput values are more likely to observe large throughput variance across different Web services.

20 Conclusion and Discussion
Investigated QoS of Web services by distributed evaluation.
The experimental observations are not very surprising, but the datasets provide important real-world QoS data of Web services for future research.
Further information and the detailed Web service QoS datasets are available at http://www.wsdream.net

