Automated Benchmarking Of Local Authority Web Sites
Brian Kelly, UK Web Focus
UKOLN, University of Bath, Bath, BA2 7AY
Email: B.Kelly@ukoln.ac.uk
URL: http://www.ukoln.ac.uk/
UKOLN is supported by:
2 Contents
Background
WebWatch Project
Current Approach
Pilot UK Local Authority Survey
Other Approaches
Discussion
Conclusions and Recommendations
3 The Problem
Background
  Local and central government organisations are developing Web-based services
  There is a need to audit the services in order to measure compliance with standards and guidelines, coverage, usability, etc.
Aim Of This Talk
  This talk describes experiences in the use of Web-based auditing services and summarises the benefits and limitations of this approach
NOTE
  The talk does not provide detailed results of a survey of UK local government Web sites
  The talk does not cover manual evaluation of Web sites
4 Web Site Benchmarking
Why benchmark Web sites?
  To monitor compliance with standards and community guidelines
  To inform the Web community (e.g. W3C) on the uptake of Web standards and protocols
  To monitor developments across a community
  To allow funders to observe developments
  To allow members of a community to see how the community is developing and how their own sites compare with it
5 Benchmarking Examples
Examples:
  Do local government Web sites comply with W3C WAI guidelines?
  How large are the entry points to local government Web sites? (see the size sketch below)
  Do the Web sites comply with HTML, CSS, XML, etc. standards?
  Does it appear, for example, that the importance of accessibility and standards compliance has been accepted, or does compliance seem too difficult to achieve?
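As an illustration of the size question, the following is a minimal sketch (Python, standard library only) of measuring the byte size of a site's entry point. The URL is a placeholder, and embedded images, style sheets and scripts are not counted, so the figure is a lower bound on page weight.

```python
import urllib.request

def entry_point_size(url):
    # Fetch the entry page and return its size in bytes; embedded images,
    # style sheets and scripts are not followed.
    with urllib.request.urlopen(url, timeout=10) as response:
        return len(response.read())

if __name__ == "__main__":
    url = "http://www.anytown.gov.uk/"   # placeholder entry point
    print("%s: %.1f KB" % (url, entry_point_size(url) / 1024.0))
```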
6 WebWatch Project
WebWatch project:
  Funded for one year by the British Library
  Started in 1997
  Software developer recruited
  Development and use of robot software to monitor Web sites across communities
Several surveys carried out:
  UK Public Library Web sites
  UK University Web sites
  …
See
7 WebWatch Mark II
By 1999:
  Funding had finished
  Software developer left
  Realisation that:
    Development of in-house software was expensive
    Web site auditing tools were becoming available
    Web site auditing Web services were becoming available
Since 1999:
  Use of (mainly) freely available Web services to benchmark various public sector Web communities
  Regular columns in Ariadne e-journal
  Experience gained in issues of Web site benchmarking
8 Benchmarking Web Sites
Bobby is an example of a Web-based benchmarking service which provides information on compliance with W3C WAI guidelines
http://www.cast.org/bobby/
9 Use Of The Services
The benchmarking Web sites are normally designed for interactive (manual) use
However the input to the Web sites can be managed automatically, which speeds up the submission process (a sketch of this is given below)
It would be possible to automate processing of the results, but this hasn't (yet) been done:
  Lack of software developer resources
  Quality of output needs to be determined
  It should be the responsibility of the service provider to provide output in a reusable format
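A sketch of what "managing the input automatically" can look like: candidate URLs are fed to a checking service by building its query string rather than pasting them into the form by hand. The endpoint and parameter name below are hypothetical placeholders, not Bobby's or NetMechanic's actual interface.

```python
import urllib.parse
import urllib.request

# Hypothetical checking-service endpoint; the real service's form action
# and field names would be substituted here.
CHECKER = "http://checker.example.org/check"

def check(candidate_url):
    # Build the query string the interactive form would have submitted
    # and fetch the resulting report page.
    query = urllib.parse.urlencode({"url": candidate_url})
    with urllib.request.urlopen("%s?%s" % (CHECKER, query), timeout=30) as resp:
        return resp.read().decode("utf-8", errors="replace")

sites = ["http://www.anytown.gov.uk/", "http://www.othershire.gov.uk/"]  # placeholders
for site in sites:
    report = check(site)
    # The report is still the human-readable HTML page; automating its
    # processing is the step that has not yet been done.
    print(site, len(report), "bytes of report HTML")
```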
10 Displaying Results
The input to the benchmarking Web services and a summary of the results are provided as a Web resource. This provides:
  Openness of methodology
  Ability to compare your Web sites with those published
11 Use of Bobby
Analysis of UKOnline appears to show a compliant site, 0.5K in size. Examination shows that this is an analysis of a redirect page.
Analysis of the destination shows lack of compliance with WAI guidelines and a size of 1.17K. Further examination shows that this is an analysis of a frames page.
Analysis of the individual frames shows:
  A file size of 24.8K for one frame
  The other frame could not be analysed due to lack of support for cookies in Bobby
Bobby analysis of
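A sketch of how the first two problems can be worked around in an automated survey: follow any redirect to the real entry page, then enumerate the frame pages so each can be submitted for analysis separately. The UKOnline address is assumed, and the frame extraction is deliberately crude.

```python
import re
import urllib.parse
import urllib.request

def frame_pages(url):
    # urlopen follows HTTP redirects by default, so geturl() gives the
    # final destination rather than the redirect stub.
    with urllib.request.urlopen(url, timeout=30) as resp:
        final_url = resp.geturl()
        html = resp.read().decode("utf-8", errors="replace")
    # Crude frame extraction; a real tool would use an HTML parser.
    srcs = re.findall(r'<frame[^>]+src\s*=\s*["\']?([^"\' >]+)', html, re.I)
    return final_url, [urllib.parse.urljoin(final_url, s) for s in srcs]

# Assumed address of the UKOnline entry point at the time of the survey.
final, frames = frame_pages("http://www.ukonline.gov.uk/")
print("Entry point resolves to:", final)
for frame in frames:
    print("Frame to analyse separately:", frame)
```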
12 Benchmarking Services (2)
NetMechanic is another example of a Web-based Web site testing service
It can check:
  Links
  HTML and browser compatibility
  File sizes
  …
http://www.netmechanic.com/
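For comparison, a rough sketch of the kind of link check NetMechanic automates, using only the Python standard library. It simply reports anchors on a page that fail to fetch; it makes no attempt to reproduce NetMechanic's actual behaviour, and the URL is a placeholder.

```python
from html.parser import HTMLParser
import urllib.error
import urllib.parse
import urllib.request

class AnchorCollector(HTMLParser):
    # Collects the href attribute of every <a> tag on the page.
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def broken_links(page_url):
    with urllib.request.urlopen(page_url, timeout=30) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    parser = AnchorCollector()
    parser.feed(html)
    broken = []
    for href in parser.hrefs:
        target = urllib.parse.urljoin(page_url, href)
        if not target.startswith("http"):
            continue                      # skip mailto:, javascript:, fragments
        try:
            urllib.request.urlopen(target, timeout=15).close()
        except urllib.error.URLError:     # covers HTTPError as well
            broken.append(target)
    return broken

print(broken_links("http://www.anytown.gov.uk/"))   # placeholder URL
```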
13 Some Issues
When using Bobby and NetMechanic different results may be obtained. This may be due to:
  Analysing redirect pages vs following redirects
  Analysis of the frameset page but not the individual frame pages
  Not analysing images due to the Robot Exclusion Protocol
  Differences in covering external resources such as JavaScript files, CSS, etc.
  Splash screens
  …
14 Benchmarking Sites
It is possible to benchmark entire Web sites and not just individual pages, such as entry points:
  Nos. of links to the Web site
  Nos. of pages indexed
  Relationships with other Web sites
  …
You can also measure the server availability and uptime (e.g. using Netcraft)
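A minimal sketch of an availability probe of the sort Netcraft offers: time a request to the entry point and record the HTTP status and the Server header. Repeated runs over time would be needed to say anything about uptime; the URL is a placeholder.

```python
import time
import urllib.error
import urllib.request

def probe(url):
    # Time a single request and record the status and the Server header.
    start = time.time()
    try:
        with urllib.request.urlopen(url, timeout=15) as resp:
            return {"status": resp.status,
                    "server": resp.headers.get("Server", "unknown"),
                    "seconds": round(time.time() - start, 2)}
    except urllib.error.URLError as exc:
        return {"status": None, "error": str(exc),
                "seconds": round(time.time() - start, 2)}

print(probe("http://www.anytown.gov.uk/"))   # placeholder URL
```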
15 Standard Files
It is also possible to analyse a number of standard Web site files:
The robots.txt file
  Has one been created (to stop robots from indexing, say, pre-release information)?
  Is it valid?
The 404 error page
  Has a tailored 404 page been created or is the server default one used?
  Is it rich in functionality (search facility, links to appropriate help information, etc.)?
Note that manual observation of the functionality of these files is currently needed (a basic automated check is sketched below)
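A sketch of automating the basic parts of these two checks with the Python standard library. The "tailored 404" heuristic (just looking at the size of the error page) is an assumption; judging the page's real functionality still needs manual observation, as noted above. The site and path are placeholders.

```python
import urllib.error
import urllib.request
import urllib.robotparser

SITE = "http://www.anytown.gov.uk"        # placeholder site

# robots.txt: fetch and parse it with the standard-library parser.
robots = urllib.robotparser.RobotFileParser(SITE + "/robots.txt")
robots.read()
# The path below is a hypothetical example of pre-release material.
print("Robot may fetch /committees/draft.html?",
      robots.can_fetch("*", SITE + "/committees/draft.html"))

# 404 handling: request a page that should not exist and inspect the response.
try:
    urllib.request.urlopen(SITE + "/no-such-page-survey-test", timeout=15).close()
    print("No 404 returned; the server may be masking errors")
except urllib.error.HTTPError as err:
    body = err.read()
    print("Status %d, error page is %d bytes; inspect manually for a search"
          " facility, help links, etc." % (err.code, len(body)))
```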
16 Market For Benchmarking
There is increasing interest in Web site benchmarking:
  Industry (search for "Web auditing")
  Consortia
e.g. see the SOCITM "Will you be Better Connected in 2001?" service at:
  Visual impairment rating
  12-page report about your site
  Recommendations for improving the site
  £495 (subscribers) or £950 for the survey
17 Who Does The Work And Why?
Who should benchmark?
  The community itself (e.g. a national association)
    But how self-critical can it be?
  The funders
    But will they take on board the complexities?
  A neutral body
    But is there an obvious body to do the work?
What is the purpose of the benchmarking?
  Is it linked to funding, with penalty clauses for non-compliance?
  Is it to support the development of the community, by highlighting best practices?
18 Technical Issues
Web Services
  There is a need to develop from the use of interactive Web sites to services designed for machine use
  There may be a role for a "Web Service" approach in which a rich set of input can be provided (e.g. using SOAP)
EARL
  There is a need for a neutral and reusable output format from benchmarking services
  W3C's EARL (Evaluation and Report Language) may have a role to play
  As EARL is based on RDF it should be capable of describing the benchmarking environment in a rich and machine-understandable way
  See
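To illustrate, a sketch of a single benchmarking result expressed as an EARL assertion, built with the third-party rdflib package. The namespace and terms shown are those of the later W3C EARL drafts and are illustrative rather than the exact vocabulary available at the time of the talk; the subject and test URLs are placeholders.

```python
from rdflib import BNode, Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

# EARL namespace as published in later W3C drafts (illustrative here).
EARL = Namespace("http://www.w3.org/ns/earl#")

g = Graph()
g.bind("earl", EARL)

assertion = BNode()
result = BNode()
g.add((assertion, RDF.type, EARL.Assertion))
g.add((assertion, EARL.subject, URIRef("http://www.anytown.gov.uk/")))
g.add((assertion, EARL.test, URIRef("http://www.w3.org/TR/WCAG10/#gl-provide-equivalents")))
g.add((assertion, EARL.result, result))
g.add((result, RDF.type, EARL.TestResult))
g.add((result, EARL.outcome, EARL.failed))
g.add((result, EARL.info, Literal("Images on the entry page lack alt text")))

print(g.serialize(format="turtle"))
```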
19 Recommendations (1)
Standards Bodies (e.g. W3C and equivalents)
There is a clear need for rigorous definitions to assist in Web auditing, in order to ensure that valid comparisons can be made across auditing services
Examples:
  Definitions of a "page"
  Files which should be analysed
  How to handle the Robot Exclusion Protocol
  User-agent view
20 Recommendations (2)
Applications Developers
There is a need to ensure that Web-based benchmarking services can be tailored and that their output can be reused:
  Benchmarking services should be capable of emulating a range of user agents (a sketch follows below)
  Benchmarking services should provide user control over compliance with the Robot Exclusion Protocol
  Benchmarking services should provide user control over the definitions of files to be analysed
  Benchmarking services should provide user control over the definition of a page (e.g. include redirected pages, sum results of the original and redirected pages, etc.)
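A sketch of the user-agent emulation point: the same page is requested with different User-Agent strings so that the view served to each class of agent can be compared. The agent strings and URL are illustrative.

```python
import urllib.request

# Illustrative User-Agent strings for three classes of agent.
USER_AGENTS = {
    "graphical browser": "Mozilla/5.0 (compatible; benchmark-survey)",
    "text browser": "Lynx/2.8",
    "indexing robot": "benchmark-robot/0.1",
}

def fetch_as(url, agent_string):
    # Request the page while presenting the given User-Agent header.
    request = urllib.request.Request(url, headers={"User-Agent": agent_string})
    with urllib.request.urlopen(request, timeout=15) as resp:
        return resp.read()

for label, agent in USER_AGENTS.items():
    size = len(fetch_as("http://www.anytown.gov.uk/", agent))   # placeholder URL
    print("%s view: %d bytes" % (label, size))
```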
21 Recommendations (3)
There are benefits to communities in monitoring trends and sharing best practices which have been spotted in benchmarking work
Let's share the results and issues across our related communities
Let's share the approaches to benchmarking across bodies involved in benchmarking
22 Questions
Any questions?