In our hurry to meet deadlines and satisfy all the different committees and audiences attached to our homepage, it’s easy to lose touch with the small things. And in missing the small things, we end up with a big thing: a big page. How much does a homepage require users to download before it’s too big? Are you hurting your user experience by making people wait? We looked at 100 .edu websites to see how things stack up in the area of site optimization. I’ll give you a hint: there’s good news, and bad news.
The data presented in this article was collected using Firebug in Firefox 3.6.8 with caching disabled. A few notes on methodology:
- Pages with autocycling centerpieces were allowed to load all items once for measurement.
- Externally linked assets were included in measurements (e.g. loading jQuery from the Google CDN).
- Values for “other” data were calculated as the total site size minus (images + CSS + JS). In the two instances where this came out negative, the value was set to 0 (and to 0.1 only for the purpose of display in the logarithmic chart below).
- Values over 1MB were rounded to the nearest tenth of a megabyte. This primarily affects sites with more than 1MB of images, and introduces a margin of error of +/- 99KB in the image and “other” values.
- To remove some of the effect of the fast connection I tested on, load times are weighed in proportion to each other. The baseline 0% load time was set to the average of all sites: 4.05 seconds on my [fast] connection.
- The 100 websites were selected from a mix of results from a Google search for "site:.edu" and recent listings on eduStyle. All sites were tested from their primary home page.
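To make the “other” calculation concrete, it amounts to the following (a quick sketch; the function name and sample numbers are mine, not from the study’s raw data):

```javascript
// "Other" data is whatever is left after subtracting the measured asset
// buckets from the total page size, clamped at zero for the cases where
// rounding pushed the difference negative. (Illustrative values only.)
function otherSize(totalKB, imagesKB, cssKB, jsKB) {
  return Math.max(0, totalKB - (imagesKB + cssKB + jsKB));
}

console.log(otherSize(1000, 600, 250, 100)); // → 50
console.log(otherSize(500, 300, 150, 100));  // → 0 (clamped)
```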
First, let’s throw around a little context, yeah? The challenge here is that there isn’t necessarily a “right” size for a page to be. In fact, how a page is optimized and how it loads can dramatically change what you can get away with. You can stand to have a huge page if you are progressively loading objects, or loading them in the background in a way that doesn’t interrupt the user experience. Pear Analytics wrote about how long a user will wait on a page back in ’09. Despite the age, the data is still solid (and likely overestimates a hair now, given that attention spans are trending downward - Jakob Nielsen noted back in ’93 that people could be expected to hold their attention for 10 seconds). Basically, count on 4 seconds (oddly enough, right at our overall average given my connection). From their article: “Akamai said in 2006 that you could lose up to 33% of your visitors if your page took more than 4 seconds to load on a broadband connection.”
Here’s how some big sites compare:
- Amazon: 126 requests - 691.6KB - 5.68s
- CNN: 115 requests - 968.1KB - 10.12s
- Facebook: 142 requests - 608.1KB - 12.02s
- Fox: 82 requests - 245.9KB - 0.8s
- Google: 10 requests - 181KB - 1.01s
Obviously, they’re a bit all over the map. CNN and Facebook both had longer load times, in part due to how their pages load media and make secondary calls. Facebook, for instance, takes time loading elements progressively across the page. None of the sites topped a megabyte total, though. Google was expectedly small, and Amazon was quite reasonable given the number of graphics its homepage displays.
Breaking Us Down
In the end, 40% of the sites weighed in at a full megabyte or more. I’m speaking qualitatively here, but even with fast load times, that feels like a heck of a lot of data to load for a university homepage. On the other hand, 53 of the samples were smaller than CNN, which is a bit more encouraging.
In the following two charts, we break down all 100 sites by asset type. The first looks at the data school by school; the second looks at the range of each asset type. Be aware that both graphs are logarithmic due to the range, and in both you can turn off the values you don’t want to see. Across the middle of the pack things are relatively stable, but at the far ends they go a bit wacky, which is to be expected in a bell-curve-like distribution such as this.
Naturally, all this data impacts your load times, which is what we are ultimately getting at. The average load time falls at a point with 67 schools at or under it, and 33 over it. This is partly because of the long tail of faster sites. About a second is as fast as they come, and from there to the 4 second mark (again, all these times are on my machine on a fast connection) is where a lot of folks fall. But the bad sites get really bad, which pulls the average away from the middle of the pack. In fact, the worst time came in at 290% over the baseline - 15.8 seconds by my timing.
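To make the relative numbers concrete, here’s how a load time converts to a percentage over the 4.05 second baseline (a sketch; the function name is mine):

```javascript
// Percent over (or under) the 4.05s baseline average; 0% is exactly average.
function overBaseline(seconds, baseline) {
  baseline = baseline || 4.05;
  return Math.round((seconds / baseline - 1) * 100);
}

console.log(overBaseline(15.8)); // → 290 (the slowest timing above)
console.log(overBaseline(4.05)); // → 0
```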
My big worry is the role my high-bandwidth connection played. Run the same test from home (or gods forbid, a mobile device) and you will absolutely see that 0% baseline drift substantially to the left. That’s bad, because it puts a large percentage of sites well beyond that golden 4 second mark. What’s really frightening is just how bad the load times get, in part due to sites with enormous Flash overhead, or sites forcing users to download movie files even when they aren’t playing.
Finally, the last big point to pull out is the number of HTTP requests a page has to make in order to build the site for the user. This isn’t directly related to overall size, mind you, but it can impact things like server load, as well as wait times if requests are backed up or delayed. It should go without saying that the more load you put on your server, the slower it will respond. The average number of requests was 55; the median was 109.
So, what can we do to improve load times? There are a number of options we could all take advantage of to some degree. A lot of this is a matter of good planning and good habits. Many of us are often forced to work for speed rather than quality, so while these tips may be obvious, they’re still worth paying attention to.
If you can save graphics as non-transparent, 8-bit PNGs, do it. This is roughly on par with using GIFs (which are equally fine for small images with limited colors), but the compressed size is a bit better for small graphics and icons/buttons. 24-bit PNGs can be great for high-quality images, but will cause the file size to balloon (be sure to catch this guide to the PNG format). For JPGs, don’t save at the maximum quality settings if you don’t need to. In some instances, simply turning down the quality settings can cut the file size by a factor of ten.
Also make sure - especially with photography - the image is saved at the native size at which it will be displayed. Don’t upload a 12 megapixel photo straight from a camera onto your homepage without shrinking it down first. And speaking of small, for things like icons and background gradients applied with CSS, consider using sprites. Sprites won’t save you much space, but they cut down the number of HTTP requests the page has to make, speeding up overall page loading.
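As a quick illustration of the sprite idea (the file name and class names here are hypothetical): a single combined image is downloaded once, and background-position picks out each icon.

```css
/* One request for a combined image instead of three separate icon files. */
.icon {
  background: url(icons-sprite.png) no-repeat; /* hypothetical sprite file */
  width: 16px;
  height: 16px;
  display: inline-block;
}
.icon-rss    { background-position: 0 0; }
.icon-email  { background-position: -16px 0; } /* second 16px tile */
.icon-search { background-position: -32px 0; } /* third 16px tile */
```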
Photoshop users should definitely read this guide to the Save for Web & Devices tool. It offers a number of suggestions to make sure you are putting your images out the best way possible for the web. And finally, there are a number of other image optimization tools that can meet different needs (I swear, that’s the last Six Revisions link I have).
Minify and Consolidate
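The idea here is simple: fewer, smaller files. As a rough sketch of both steps (the function names are mine, and a real project would reach for a dedicated minifier rather than hand-rolling one), you can concatenate your stylesheets into one bundle, then strip comments and extra whitespace:

```javascript
// Crude "minify" pass: strip comments and collapse whitespace. Real minifiers
// are far more careful; this just shows where the byte savings come from.
function minifyCss(text) {
  return text
    .replace(/\/\*[\s\S]*?\*\//g, "")   // drop /* ... */ comments
    .replace(/\s+/g, " ")               // collapse runs of whitespace
    .replace(/\s*([{}:;,])\s*/g, "$1")  // no spaces around punctuation
    .trim();
}

// Consolidate: one bundle means one HTTP request instead of several.
function consolidateCss(sheets) {
  return minifyCss(sheets.join("\n"));
}

console.log(consolidateCss([
  "/* reset */ body { margin: 0; }",
  "h1 {\n  color: #333;\n}"
]));
// → body{margin:0;}h1{color:#333;}
```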
Flash: Kill it With Fire
Sure, jQuery has overhead. But it’s also pretty ubiquitous, so if you load it from something like the Google CDN, you’re likely to save your users some download time. Flash, on the other hand, is in its sunset years. It’s a search problem, a maintenance problem, and an accessibility problem. More than that, complex video players, menu tools, and centerpiece rotators can add a ton of weight to a page - and most of it can be done with jQuery now.
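For reference, pulling jQuery from the Google CDN looks like this (the version number is just an example - use whatever release you’re targeting):

```html
<!-- Served from Google's CDN: users who visited another site referencing the
     same URL may already have the file cached, making the request free. -->
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
```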
Look for Random Crap
In many cases, the stuff slowing down your pages can be completely random, unexpected things. This is especially true if you use a lot of third-party widgets or code that pulls from outside your site. Last year, Google announced that page load times can now affect your search ranking. So it’s important to identify the sources of that slowdown, since a slow page can not only drive away users, but in extreme cases could keep them from finding you in the first place (relevancy still plays the biggest part in ranking).
The Google Webmaster Central Blog offers a few recommended tools for tracking down these elements so that you can address them. In fact, they have an entire site devoted to speed, which has even more information and tools.
So, the good news is that I generally feel we are a bit better off than I expected when I started this project. The bad news is that the signs of data bloat are already poking their heads up around our sites. As we strive to make more dynamic, complex sites, we’re doing it at the risk of optimization. I’m as guilty as anyone of working in haste to get something done and not going back to clean it up and make it faster. Shame on me.
In the end, it’s a quality-of-web-life issue. Sure, a lot of folks have faster connections now, and that’s great. But just because we have faster cars today doesn’t mean we load them down with a couple extra tons of steel - and we’ve made huge strides in car features and safety at the same time. In that same way, we can still make our sites better and more feature-rich, and do it in a way that doesn’t keep the page from loading quickly.
Finally, I don’t present all this with the goal of making your site as small as possible. An anorexic website has just as many problems. In fact, some of the smallest sites in the data were some of the… least modern, shall we say. They were small because they lacked common, newer elements or more visually engaging imagery. Instead, I would say we aim for a sweet spot somewhere between about 500KB and 800KB. Hit that mark, and you should feel pretty good about what you have going on for your users.