Benchmark Methodology - Update


Benchmark Methodology - Update
February 2017
©2017 Dynatrace Systems Confidential

©2016 Keynote Systems Confidential

Agenda
- Current Approach and Challenges
- Benchmarking Evolved – Key Changes
  - Browser Processing
  - W3C Metrics
  - Visual Complete
  - Availability
  - Ranking Metrics
- Key Questions
  - How are sites selected?
  - How are pages/paths selected?
  - Why the use of Chrome? Why not multiple browsers?
  - Why the use of Mobile Emulation?

Current Approach and Challenges

Current approach:
- End to end is the core performance metric
- Browser Processing is not included
- Multiple browsers are used

Challenges:
- Modern sites cannot be accurately measured without including Browser Processing.
  - Dynatrace has found that many pages spend up to 50% of their time in browser execution. This is not surprising given the extremely high amounts of first-party and third-party JS overhead and the complexity of these web applications.
  - The current benchmark massively underreports load time by excluding this processing time.
- The end-to-end metric correlates poorly with actual perception of performance.
  - End to end is captured when the last network request has returned; it includes all third-party tags, all lazy-loaded content, and all content loaded below the fold.
  - Sites carry so much third-party overhead that in most cases it pushes out the end-to-end metric without any real impact on the site visitor.
  - The current benchmark overreports load time by using end to end alone.
- Multiple browsers are important for QA testing but do not necessarily reveal stark differences in performance.
  - The goal of the benchmark is to compare members fairly and to provide a view of a "typical" experience, not to capture every possible user.
  - To that end a single browser, especially the most widely used browser (Chrome), is a good proxy for this "typical" experience.

Benchmarking Evolved – Key Changes
- W3C (Navigation Timing) Metrics
- Browser Processing
- Visual Complete
- Single Browser
- Simpler Rankings

W3C (Navigation Timing) Metrics
- By capturing key Navigation Timing metrics, we can slice the data much more precisely than end to end.
- No single metric is perfect at capturing when a page is ready, but many are better than end to end.
- At minimum, looking at loadEventEnd (commonly called onload) allows us to exclude all lazy-loaded or post-loaded third parties.
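As a minimal sketch of how these marks separate, the helper below derives time to first byte and onload from Navigation Timing attributes. The `navigationStart`, `responseStart`, and `loadEventEnd` names follow the W3C `performance.timing` interface; `lastRequestEnd` is a hypothetical extra mark standing in for the benchmark's own end-to-end point, and all numbers are illustrative.

```python
def timing_metrics(t):
    """Derive load metrics from W3C Navigation Timing marks.

    `t` maps Navigation Timing attribute names (epoch milliseconds, as in
    the browser's performance.timing object) to values. `lastRequestEnd`
    is a hypothetical benchmark-specific mark, not part of the spec.
    """
    start = t["navigationStart"]
    return {
        # back-end + network time until the first response byte
        "ttfb_ms": t["responseStart"] - start,
        # the load (onload) event: excludes lazy-/post-loaded third parties
        "onload_ms": t["loadEventEnd"] - start,
        # end to end: when the last network request returned
        "end_to_end_ms": t["lastRequestEnd"] - start,
    }

sample = {
    "navigationStart": 0,
    "responseStart": 400,
    "loadEventEnd": 2500,
    "lastRequestEnd": 6800,  # third-party tags push this far past onload
}
metrics = timing_metrics(sample)
```

In this illustrative sample, ranking on onload (2.5 s) rather than end to end (6.8 s) removes 4.3 s of post-load third-party activity a visitor never notices.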

Browser Processing
- The site below is a common example of how much of a typical page load is tied to Browser Processing.
- Browser Processing in this case means excessive execution that prevents network fetching/threading from continuing.
- To be very clear, this is not the parallel processing that happens on every page and site, but excessive execution that serially interrupts network fetching/loading.
- If Browser Processing were excluded (as it is in the current benchmark data), these pages would appear much faster than they actually are (Home, for example, spends more time in execution than in network/back-end).
- Because the amount of processing depends entirely on the page's design, its impact on each page/site is unique.
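A rough sketch of the accounting involved: given the durations of script tasks that block the load (for example, as reported by a measurement agent), the share of total load time spent in browser execution falls out directly. The function and the sample numbers are illustrative, not the benchmark's actual formula.

```python
def processing_share(blocking_tasks_ms, load_time_ms):
    """Fraction of total load time spent in browser execution that
    serially blocks network fetching/loading."""
    busy_ms = sum(blocking_tasks_ms)
    return busy_ms / load_time_ms

# Illustrative numbers: a page whose scripts block for half of a 3 s load,
# like the Home page example described above.
share = processing_share([600, 450, 250, 200], 3000)
```

A benchmark that drops this 50% of the load (as the current one does) would report this page as twice as fast as a visitor experiences it.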

Visual Complete
- Even more useful than the W3C metrics is a measure of above-the-fold render: when is the page rendered above the fold and likely ready to use?
- This metric is "visual complete".
- Once available in Q3, this will become the primary metric for ranking in the benchmark results.

(Chart legend: Speed Index, Visually Complete, Response Time)
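Speed Index, one of the render metrics named on this slide, can be sketched from visual-progress samples (e.g. filmstrip screenshots) using its standard definition: the area above the visual-progress curve, so that pages which paint most of the viewport early score better. The step interpolation and sample values below are assumptions for illustration.

```python
def speed_index(samples):
    """Speed Index: the area (in ms) above the visual-progress curve.

    `samples` is a time-ordered list of (time_ms, fraction_complete)
    pairs; progress is treated as a step function between samples.
    Lower is better; it equals visually-complete time only when nothing
    paints until the very end.
    """
    area = 0.0
    for (t0, p0), (t1, _p1) in zip(samples, samples[1:]):
        area += (1.0 - p0) * (t1 - t0)
    return area

# Half the viewport painted at 1 s, visually complete at 2 s.
si = speed_index([(0, 0.0), (1000, 0.5), (2000, 1.0)])
```

Here Speed Index is 1500 ms even though visual complete is 2000 ms, rewarding the early above-the-fold paint.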

Simpler Rankings
- Data should be easy to understand and the method of calculation transparent.
- Each benchmark will have a single set of data; data will no longer be combined into a "meta" ranking.
- Those who purchase the benchmark will be able to sort this data (delivered as a CSV, part of the weekly publication) by several metrics: First Paint, Onload, Visual Render, End to End.
- Dynatrace will also rank based on what it considers the most accurate metric available at the time: likely Onload first, then moving to Visual Render.
- Variability rankings will be removed.
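Re-sorting the weekly CSV by a chosen metric is then a one-liner, as sketched below. The column names and site values are hypothetical, not the published schema.

```python
import csv
import io

def rank_sites(csv_text, metric):
    """Rank benchmark rows fastest-first by a numeric metric column.

    Column names here are illustrative stand-ins for the published CSV.
    """
    rows = csv.DictReader(io.StringIO(csv_text))
    return sorted(rows, key=lambda row: float(row[metric]))

weekly = """site,first_paint,onload,visual_render,end_to_end
alpha.example,900,3200,2100,7400
beta.example,700,2800,2600,6100
gamma.example,1100,2500,1900,8200
"""
by_onload = [row["site"] for row in rank_sites(weekly, "onload")]
```

Sorting the same rows by `first_paint` or `visual_render` produces a different order, which is exactly why exposing the raw CSV is more transparent than a combined "meta" ranking.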

Key Questions

How are sites selected?
- Dynatrace selects sites based on a mix of factors including brand presence, traffic volumes (where known), measurability, etc. Our hope is to represent the core sites in a vertical or industry.

How are pages/paths selected?
- Many of our benchmarks are Home Pages, as Home Pages are still a primary entry point to the site and can be indicative of overall site performance.
- For our transactional (multi-step) benchmarks, Dynatrace selects common and comparable sets of pages that represent the most important functionality in the measured industry.

Why the use of Chrome?
- Chrome continues to dominate in terms of market penetration.
- There are typically no major differences between browsers in terms of errors and performance, making Chrome a good proxy for a "typical user".

Why not multiple browsers?
- The common performance issues (back-end, network/infrastructure, third parties, core design, etc.) do not vary by browser.
- Similarly for errors: most are tied to infrastructure/application or third-party issues and will be captured by any browser.
- There are corner cases where errors (typically functional) or performance are unique to a specific browser version. The best way to capture these is with Real User Monitoring (RUM) data or functional testing, not with a benchmark.

Why the use of Mobile Emulation?
- The goal of the benchmark is to help our customers understand how they can improve their site. By removing the high levels of errors and performance variability seen in all mobile networks, we provide an accurate proxy for mobile load times without including errors and issues that are outside the site owner's control.