Performance Bottlenecks Evident in the Top 50 Sites of Multiple Industries


Web pages are trying to do too many things, and often failing to optimize along the way.

It’s not just ecommerce sites, either: for this quarter’s State of the Union Report, we looked at the top 50 websites in each of four key industries – news, travel, sports and ecommerce – to see which web performance bottlenecks were most prevalent, and we learned quite a bit.

Whether the cause is a steady uptick in the number of images, fancy JavaScript or other plugins in the push to create eye-catching pages, the slowdown is apparent, both in page load times and in how long it takes before a user can interact with a site.

As we’ve found in the past:

  • Pages keep getting bigger in total size
  • The number of resource requests is increasing dramatically
  • Performance takes a hit due to page complexity and large, unoptimized images

These trends threaten user retention and can have a huge impact on the bottom line.

People Don’t Like Waiting, and They Won’t Wait for Long

Three seconds or less – that’s long been the gold standard for getting pages in front of users. Past that point, a good number of them leave – potentially upwards of 50%, according to various studies.

What this translates to is abandoned shopping carts, travel bookings that don’t get made, articles that go unread, ads that don’t get seen, and missed opportunities to gain subscribers.

Consider this transactional equation*, where x = daily visitors and y = average revenue per customer, multiplied by 365 days, assuming a 2% conversion rate and a general website bounce rate of 40% when sites load slowly:

(0.40x)(0.02)(y)(365) = lost annual revenue

That’s potentially a huge sum of money – companies can’t afford these missed opportunities.

*Forty percent represents a conservative general estimate of bounces resulting from slow load times.
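Plugged into code, the back-of-envelope calculation above looks like this (a minimal sketch; the visitor count and revenue figure in the example are hypothetical):

```python
def lost_annual_revenue(daily_visitors, avg_revenue_per_customer,
                        bounce_rate=0.40, conversion_rate=0.02):
    """Estimate yearly revenue lost to slow-load bounces.

    bounce_rate: share of visitors who bounce because the page is slow
    conversion_rate: share of those visitors who would have converted
    """
    return (daily_visitors * bounce_rate * conversion_rate
            * avg_revenue_per_customer * 365)

# Hypothetical example: 10,000 daily visitors, $80 average revenue per customer
print(lost_annual_revenue(10_000, 80.0))  # roughly $2.34M a year
```

Even with conservative inputs, the annual figure adds up quickly.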

Our Purpose and Process for the State of the Union Report

As we’ve done since 2010, we measured and analyzed the performance of the top websites, looking at key web page metrics from load time to Time to Interact (TTI), page size and composition, and the adoption of performance best practices. We used widely available tools, including WebPagetest.org and HTTPArchive.org, to obtain real-world snapshots of these pages’ metrics (as well as of web pages in general). Our findings are detailed in our quarterly “state of the union” reports, some of which you can find here.
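WebPagetest exposes a simple HTTP API for kicking off test runs, so snapshots like these can be scripted rather than gathered by hand. A minimal sketch of building such a request (the API key is a placeholder, and the location and run-count choices are assumptions, not the settings used for this report):

```python
from urllib.parse import urlencode

WPT_ENDPOINT = "https://www.webpagetest.org/runtest.php"

def build_test_url(target_url, api_key="YOUR_API_KEY",
                   location="Dulles:Chrome", runs=3):
    """Build a WebPagetest 'runtest' request URL that returns JSON.

    Submitting this URL via HTTP GET queues a test; the JSON response
    contains a results URL to poll for metrics such as load time and
    the filmstrip frames used to judge Time to Interact.
    """
    params = {
        "url": target_url,
        "k": api_key,        # API key (placeholder)
        "location": location,
        "runs": runs,
        "f": "json",         # ask for a JSON response
        "video": 1,          # capture filmstrip frames
    }
    return f"{WPT_ENDPOINT}?{urlencode(params)}"

print(build_test_url("https://www.example.com"))
```

Polling the returned results URL until the test completes yields the same metrics you would read off the WebPagetest web UI.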

The difference for this report was that in evaluating the top 50 websites of four industries, as ranked by the web analytics company SimilarWeb, we also wanted to compare the page compositions and bottlenecks across the sectors’ sites.

The results were interesting.

Below are some of our key findings, and we will be revisiting each of the categories for a more granular look in the coming weeks as well.

Finding 1: Page Size, Composition and Optimization Levels Vary by Industry

Different industries build their respective homepages around what their users want to accomplish – and of course, what the respective industries want their users to do. Form serves function.

According to HTTP Archive, the average website in general looks like this:

However, the median size of websites in the four categories tested varied quite a bit:

  • Ecommerce sites had a median size of 1.4 MB (actually an improvement from our previous Summer 2015 ecommerce report)
  • News sites were slightly larger than ecommerce sites at 1.6 MB
  • Travel sites weighed in at 3.3 MB
  • And sports sites came in with a hefty median size of 4.2 MB. Ouch.

The complexity of the sites in each category also contributed to the slowdown.

  • Ecommerce sites had a median of 97 requests, with a mix of images and JavaScript being the most prominent sources
  • News sites were in the middle again with a median of just over 122 requests
  • Sports sites had the largest number of requests, with a median of 148
  • Travel sites had the lowest median of requests of the four categories, at 92.

If you examine both of the charts above, you might notice a theme: sports sites had the most requests and the biggest page size, while ecommerce was generally at the other end, with news and travel sites somewhere in the middle (although travel sites had slightly fewer requests than ecommerce sites by median count).

This factors directly into the resulting Time to Interact (TTI) – the point in time at which a page’s central content has rendered for the user – as can be seen in this timed filmstrip view generated by WebPagetest:

  • The median ecommerce site just missed the 3-second TTI target, at 3.1 seconds (still a marked improvement from the 5.5 seconds we noted in our Summer 2015 State of the Union report)
  • News and travel sites tied with a TTI of 4.1 seconds
  • And sports sites were well outside the ideal time at 5.2 seconds.

As requests pile up, they add round trips, and round trips add latency; the TTI numbers in the chart above reflect that.
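As a back-of-envelope illustration of how request counts compound into latency (the connection count and round-trip time below are assumptions, and real browsers cache, pipeline and multiplex far more cleverly than this):

```python
import math

def rough_network_time(num_requests, rtt_ms=50, parallel_connections=6):
    """Very rough lower bound on network time: requests are fetched in
    waves of `parallel_connections`, each wave costing one round trip."""
    waves = math.ceil(num_requests / parallel_connections)
    return waves * rtt_ms  # milliseconds

# Median request counts from the findings above
for label, requests in [("ecommerce", 97), ("news", 122),
                        ("sports", 148), ("travel", 92)]:
    print(label, rough_network_time(requests), "ms")
```

Even in this crude model, the gap between 92 and 148 requests translates into hundreds of extra milliseconds on the wire, consistent with sports sites posting the worst TTI.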

Finding 2: Load Times and Resources Even Vary Between Localized Sites from the Same Companies

In global ranked lists from SimilarWeb and Alexa, you’ll find geographically localized sites from some of the world’s major international players, such as Amazon.com. It’s important to note that these pages aren’t identical, however, even if they appear to be formatted the same.

Why?

Differences in the resources used – and possibly in the locations and capabilities of the servers provisioned for each site – lead to variations in load times and Time to Interact.

Looking at Amazon’s sites for the US, Germany, Japan, the UK, Spain and Canada, load times are generally good and within users’ expectations, thanks to the sites’ use of optimization – but there are some notable differences.

Consider these filmstrip views from the testing, as generated by WebPagetest.org:

There are differences in the number of HTML requests and other assets, which you can see in the following charts:

The takeaway here is that each localized site must be optimized individually, with effective use of resources and optimization techniques, to keep the user experience in the green.

Finding 3: Most Sites Still Fail to Employ Core Image Optimization Techniques

Across all the categories researched, core optimization techniques weren’t being utilized by a majority of websites, despite the availability of tools and resources.

As noted earlier, each industry featured websites with a different spread of resources, from images to HTML, JavaScript and other elements, but in every category the bulk of the payload came down to images, which generally make up 50-60% of a page’s weight.

Compressing image files lightens a web page’s overall payload. Fewer bytes mean reduced bandwidth and faster pages.

And yet, only a small minority of sites in each industry are earning an “A” for image compression from WebPagetest:

Grade   Ecommerce   Sports   News & Media   Travel & Hospitality
A       10%         10%      8%             6%
B       8%          6%       10%            8%
C       20%         12%      18%            12%
D       8%          4%       14%            12%
F       36%         46%      30%            26%
N/A     18%         22%      20%            36%

By utilizing image compression, site owners can improve their sites by:

  • Reducing the amount of time required for images to be transmitted or downloaded and
  • Increasing the number of images that can be stored in the browser cache, thereby improving page render time on repeat visits to the same page.
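A minimal sketch of the effect, using the Pillow imaging library to re-encode the same image at a lower JPEG quality setting (the synthetic image, dimensions and quality values are illustrative assumptions, not a production recipe):

```python
from io import BytesIO

from PIL import Image  # Pillow; assumed available

def jpeg_bytes(img, quality):
    """Re-encode an image as JPEG at the given quality and return the bytes."""
    buf = BytesIO()
    img.save(buf, format="JPEG", quality=quality)
    return buf.getvalue()

# Hypothetical example: a synthetic 1024x768 gradient image
img = Image.new("RGB", (1024, 768))
img.putdata([(x % 256, y % 256, (x + y) % 256)
             for y in range(768) for x in range(1024)])

original = jpeg_bytes(img, quality=95)
compressed = jpeg_bytes(img, quality=70)
print(f"quality 95: {len(original)} bytes, quality 70: {len(compressed)} bytes")
```

In practice, moderate quality reductions like this are visually near-lossless for most photographic content while cutting payload substantially.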

Optimize Your Site, and Your Customers Will Thank You

If you want to keep your users flying, reading, watching and buying, you need to optimize your sites so those users stick around.

To pull this off effectively, though, you’ll need an automation solution. While many performance optimization techniques can be performed manually by developers, hand-coding pages for performance is specialized, time-consuming work.

On highly dynamic sites that contain hundreds of objects per page, it’s a never-ending task, as both browser requirements and page requirements continue to evolve.

Automated front-end performance optimization solutions apply a range of performance techniques that deliver faster pages consistently and reliably across the entire site.

With automated WPO, pages can be optimized individually without the labor normally associated with this task, saving you time and money while keeping your users happy.

Get the report: 2016 State of the Union: Multi-Industry Web Performance (Desktop Edition)

Kent Alstad

Kent is an entrepreneur, software architect, and technology innovator. Before taking his former role of VP Acceleration at Radware, Kent was CTO at Strangeloop Networks, where he was instrumental in authoring all of Strangeloop’s issued and pending patents. Prior to helping create Strangeloop, he served as CTO at IronPoint Technology. Kent also founded Eclipse Software, a Microsoft Certified Solution Provider, which he sold to Discovery Software in 2001. In more than 25 years of professional development experience, Kent has served as architect and lead developer for successful production solutions with The Active Network, ADP, Lucent, Microsoft, and NCS. “Port View”, an application Kent architected for the Port of Vancouver, was honoured as “Best Administrative System” at the 1996 Windows World Open Competition.
