What makes web sites slow?
- The file size of the HTML document
- The file size of the dependencies in the document (scripts, images, multimedia elements)
- The complexity of the HTML (simpler pages are easier to render for the browser)
- The speed of the user’s connection
- The speed of third-party servers, as content may be pulled and included from them
- The response time of the DNS servers resolving the domains and pointing you to those other servers
- The responsiveness and speed of the visitor’s computer (how busy the machine is with other tasks, as that slows down the browser’s rendering)
- The responsiveness of the server
These are the technical parts of the equation. Then there is also the human factor: visitors don’t consider a page fully loaded until it has appeared completely, stopped “jumping around” and shows no half-loaded images.
Things to do to make web sites faster
There are some well-known general best practices you can follow to overcome some of these technical and human factors and ensure a quickly responding web site:
- Optimize the HTML and all its dependencies as much as you can without losing quality. This can include stripping the HTML documents of comments and superfluous line breaks, which should be part of the publication process; in order to keep sites maintainable you still need those in the source documents.
- Reduce dependencies by using as few file includes as possible (collate several scripts into one include, use CSS sprite techniques to load all images at once)
- Don’t include third-party content directly from other servers: set up a script that caches RSS feeds locally and use that instead. The benefit is not only that you don’t have to deal with the DNS delays of resolving the other domain, but also that you are independent of the other server should it go down.
- If possible, define dimensions for images and their container elements. This will ensure that the first rendering of the page will be correct and there won’t be any “jumping around” when the images are loading.
Best practices vs. special speed requirements
Unfortunately some of these tricks clash with what we consider best practices in web development. Cutting down on the number of included files, for example, impedes the maintainability of the product. In order to make it as easy as possible to maintain the look and feel of a site with different page types (home, articles, archive…) it does make sense to keep the different styles in their own includes and only add them to the pages that really use them. You could have one base CSS include, then one for the homepage, one for articles and so on.
Luckily there are technical solutions for most of these problems.
Using single includes for several style sheets or scripts
Mission almost possible: tackling the onload problem
One other really big issue is that unless you embed your scripts in the body of a document, you’ll have to start them when the document has finished loading. This results in a slight delay and can cause problems.
The delay is caused by the way browsers load, parse and render documents. If you call your scripts with the onload event on the window, all of the following steps have to be finished first:
- HTML is parsed
- External scripts/style sheets are loaded
- Scripts are executed as they are parsed in the document
- HTML DOM is fully constructed
- Images and external content are loaded
- The page is finished loading
For web applications, where premature activation of elements can result in a failure of the app, waiting for the full page load is absolutely vital. If your problem is only of a cosmetic nature, though, there might be a workaround.
Avoiding the on-load problem with on-demand pulling of content
The last idea stems from a time that may just have been before you even started developing for the web. Netscape, the ill-fated (but, IMHO, at that time better) competitor to Internet Explorer during the browser wars, had a custom HTML attribute for images called ‘lowsrc’. It let you define an image with a much smaller file size than the real one; this preview was loaded first and then covered by the real image while it was loading. This allowed you to give even users on ridiculously slow connections a preview of what was to come.
This trick can also be immensely effective when you include lots and lots of smaller images from several servers (gravatars, for example), as these are not likely to be cached. Simply use a placeholder graphic initially and replace them with the dynamically created images when the page has loaded.
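The placeholder swap could be sketched like this. Assumptions of mine, not part of the original lowsrc mechanism: each img starts out pointing at one shared placeholder graphic and carries its real location in a hypothetical data-src attribute, and `swapPlaceholders` is run after the page has loaded:

```javascript
// Sketch of the lowsrc-style trick: every image initially shows a shared
// placeholder; once the page has loaded we swap in the real source from
// a (hypothetical) data-src attribute, so the slow gravatar servers are
// only contacted after the page has rendered.
function swapPlaceholders(images) {
  images.forEach((img) => {
    const real = img.getAttribute('data-src');
    if (real) {
      img.setAttribute('src', real); // replace the placeholder
    }
  });
}
```

In a browser you would pass it `document.querySelectorAll('img[data-src]')` from a load handler; combined with fixed image dimensions the page renders once, correctly, and fills in the avatars afterwards.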