The image shows how items from a site are downloaded (credit: getfirebug.com). You can see that items are downloaded in sequence rather than in parallel; this sequential loading is one source of potential improvement.
Possibilities for improving the performance of your site fall roughly into two categories:
- Removing unnecessary files and optimizing the rest.
- Migrating files to a content delivery network for lower latency.
Any attempt to accelerate your site should start with a thorough analysis of the bottlenecks. Reducing file sizes won't help if your real problem is the number of server connections, each waiting a long time for a response. Conversely, if you have a lot of videos and 80 external files of 2 MB each, it is no wonder your site is slow. You should attack the problem at its root.
In this article I will explain how to analyze and remove loading bottlenecks and how to increase the loading speed of your site.
Analyzing the Problem

There are several analysis tools you can use. They vary in sophistication and, consequently, in usefulness, but none of them are difficult to use.
Some web-driven applications give you benchmarks of your site. You only have to type your URL and you'll get a first idea. You'll get feedback such as "too many files" or "this file is too big." You'll see estimates of how long your site will take to download at different bandwidths.
A short collection of sites that provide these online speed tests for free and without registration:
These online performance tools are fast and convenient, but useful only to an extent: they lack information about latency. Latency is often more important than file size, because it can take longer to wait for a server to respond than to actually download the file.
You can get this information with Firebug, a Firefox extension for debugging, editing, and profiling websites. There is a Firebug Lite plugin for Opera and Internet Explorer, but it lacks the profiling option, so for this analysis you have to load the site in Firefox. Firebug shows you how long each component takes to load, and you'll also see which files download in parallel. To get started, see the Firebug tutorial at IBM.
File Optimization: Reduce File Size, Remove Files

The first step toward a speedup: get rid of anything you don't need. Delete widgets you don't use, remove slow scripts, and drop superfluous large images and videos. This reduces clutter and can make your site clearer.
HTML Optimization

HTML code is everything inside the less-than (<) and greater-than (>) signs. The organization of the HTML can make a page appear to load faster: if you move the CSS (inside the <head>) to the top and the scripts to the bottom, the page can start rendering before everything has downloaded. Missing <width> and <height> attributes also delay the appearance of content; without them, the display has to wait until the image or table content has downloaded in order to determine its size. You should also avoid scaling images at load time. Scale images beforehand to the size at which you want them displayed. While we are at it: many <meta> tags are unnecessary, so clear them out.
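As a small sketch of the width/height point, giving every image explicit dimensions lets the browser reserve layout space before the file arrives (the file name here is a placeholder):

```html
<!-- The browser can lay out the page immediately, because the image's
     dimensions are known before logo.png finishes downloading. -->
<img src="logo.png" alt="Site logo" width="200" height="80">
```

Note that these attributes should state the image's actual size; using them to shrink a larger file is exactly the load-time scaling to avoid.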
HTML documents themselves typically are not cached, because their content is constantly changing. Moving CSS and scripts out into external files can therefore speed up repeat visits, since the browser can cache those files even when it cannot cache the page itself.
This leads to cache control. The browser loads some files into memory so that it doesn't have to download them again. If the browser caches aggressively, the site will load much faster after the first visit. There are HTML meta tags that give the browser caching hints, and server-side directives for external files (.htaccess). Setting file expiration many days into the future can help a lot. Read more about caching in the Caching Tutorial for Web Authors and Webmasters.
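As a minimal sketch, far-future expiration can be set on an Apache server via an .htaccess file, assuming the mod_expires module is enabled (the file types and durations below are illustrative choices, not recommendations from the original text):

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  # Static assets rarely change, so let browsers keep them for weeks
  ExpiresByType image/png  "access plus 30 days"
  ExpiresByType image/jpeg "access plus 30 days"
  ExpiresByType text/css   "access plus 7 days"
</IfModule>
```

The trade-off: the longer the expiration, the longer visitors may see a stale file after you change it, so versioned file names are often used alongside long expirations.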
Image Optimization and Compression

Images can often be reduced considerably. Do you really need a resolution of 1000x1000? Also, the meta information in the file (where it was created, and so on) is unimportant to the viewer of the page. Again: scaling images at load time via width and height attributes costs time, so resizing them to the correct size beforehand helps. The JPEG and PNG (Portable Network Graphics) formats usually result in the smallest files.
There are many free tools for optimizing file sizes and reducing resolutions. GIMP is a visual application similar to Photoshop. One of the best command line tools, which does everything from converting to editing, is ImageMagick. Pngcrush and pngtopnm are optimizers for PNG files; both perform lossless compression, meaning no quality is lost. Jpegoptim and jpegtran provide lossless and lossy methods for reducing the size of JPEGs.
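To make the tools above concrete, here is a sketch of typical invocations. The file names are hypothetical placeholders, and each tool is checked for before use, since none of them ship with a default system:

```shell
# Report which optimizers are available on this machine.
for tool in convert pngcrush jpegoptim; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: installed"
  else
    echo "$tool: not installed"
  fi
done > optimizer-report.txt

# With the tools installed, typical invocations look like:
#   convert photo.jpg -resize 800x photo-web.jpg    # pre-scale to display size (ImageMagick)
#   pngcrush -brute logo.png logo-min.png           # lossless PNG recompression
#   jpegoptim --max=85 --strip-all photo-web.jpg    # lossy quality cap, strip metadata

cat optimizer-report.txt
```

The jpegoptim call is the only lossy step here; pngcrush never degrades quality, so it is safe to run on everything.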
CSS Minification

CSS minification involves deleting all unnecessary characters, such as whitespace, and shortening names. There are online minifiers, like this CSS compressor tool. CSSTidy is a command line tool that optimizes CSS stylesheets.
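As a toy illustration of what minification does, the pipeline below strips newlines and surplus spaces from a small stylesheet. Real minifiers such as CSSTidy go much further (shortening color values, merging rules), so treat this only as a sketch:

```shell
# A tiny sample stylesheet.
printf 'body {\n  color: #ffffff;\n  margin: 0px;\n}\n' > style.css

# Naive minification: drop newlines, then squeeze spaces around
# punctuation. This preserves meaning for simple rules only.
tr -d '\n' < style.css \
  | sed 's/  */ /g; s/ *{ */{/g; s/ *} */}/g; s/ *: */:/g; s/ *; */;/g' \
  > style.min.css

cat style.min.css
```

Even on this four-line example the output shrinks to a single compact line; across a large stylesheet the savings add up.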
A general rule for CSS use is to avoid CSS expressions, which set CSS properties dynamically. These expressions may be evaluated many times during page load and can slow loading considerably.
Compressing files with gzip carries the risk that some browsers may not support gzip files or may not recognize the MIME type. File compression should therefore be done server-side, where the server can send compressed content only to browsers that accept it.
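On Apache, server-side compression is a sketch like the following .htaccess fragment, assuming the mod_deflate module is enabled; the server then negotiates with each browser and only sends compressed responses to those that advertise support:

```apache
<IfModule mod_deflate.c>
  # Compress text formats on the fly; images are already compressed.
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```

Binary formats such as JPEG and PNG are deliberately left out, since recompressing them wastes CPU for little or no gain.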
Content Delivery Network - Reducing Latency

If many files (scripts, images and other multimedia, stylesheets) have to be downloaded from servers, latency becomes a problem. It can help to concatenate files into fewer files, which means less waiting for server responses. However, there is more you can do.
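The concatenation step mentioned above can be as simple as the sketch below; the script names are hypothetical stand-ins for your real files:

```shell
# Two example scripts that would otherwise cost one request each.
printf 'var a = 1;\n' > widget.js
printf 'var b = 2;\n' > menu.js

# Merge them so the browser makes a single request.
cat widget.js menu.js > bundle.js

wc -l bundle.js
```

Order matters when scripts depend on each other, so concatenate them in the same order they appeared in the page.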
Which is better: visitors downloading your pages from a single server, or from many servers around the world, whichever is closest? Which is better: experienced engineers optimizing servers to handle even high demand, such as a Slashdot effect, or you having to do it yourself? I think the answer is obvious: a content delivery network. And the best part is that it is free (until you really hit traffic).
Content Delivery Networks (CDNs) are services that provide distributed file servers so as to maximize bandwidth. This means that visitors to a website can be served content from a server nearby. Big names in the CDN business are Google, Amazon, and Yahoo. Google keeps the locations of its data centers secret, but in the image below you can see the server locations of the Coral content delivery network, to give you an idea of what a CDN looks like.
Google App Engine's free service has an access quota, which can be extended for a fee. A very good tutorial and introduction is provided at 24ways: Using Google App Engine as Your Own CDN.
Another excellent resource is Yahoo's performance rules, which provide general tips on performance improvement; it was one of the best resources I found during my research. You can find some more resources in the post Best Blog Tools.