- Many HTTP requests (meaning more load on the web server)
- Less chance of caching any of those files
- If HTTP compression is enabled in IIS, many files mean more load and a worse compression ratio than a single merged file.
The solution for merging AJAX AXDs into a single file is described in these two great posts:
Script combining made easy [Overview of the AJAX Control Toolkit’s ToolkitScriptManager]
Script combining made better [Overview of improvements to the AJAX Control Toolkit’s ToolkitScriptManager]
To recap: when a friend of mine (Julian) and I tested this technique, it reduced the loading time by about 2 seconds on a fairly clean and well-built site. The better HTTP compression also reduced the total size a bit.
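A quick way to see why one merged file compresses better than several separately compressed ones is to compare gzip output sizes. A minimal Python sketch (the script snippets below are made up for illustration; real combined scripts are much larger, but the effect is the same):

```python
import gzip

# Hypothetical stand-ins for three small script files.
scripts = [
    b"function showPanel(id) { document.getElementById(id).style.display = 'block'; }",
    b"function hidePanel(id) { document.getElementById(id).style.display = 'none'; }",
    b"function togglePanel(id) { var e = document.getElementById(id); e.style.display = 'none'; }",
]

# Compressing each file separately: every gzip stream carries its own
# header/trailer overhead, and no dictionary is shared between files.
separate = sum(len(gzip.compress(s)) for s in scripts)

# Compressing one merged file: a single stream, where repeated
# fragments across the files compress against each other.
merged = len(gzip.compress(b"\n".join(scripts)))

print(separate, merged)  # merged comes out noticeably smaller
```

Each separate stream pays fixed gzip overhead and cannot reference text from the other files, which is exactly why the merged file wins.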
If you are using ComponentArt’s Web.UI version 2007.2 or later controls, there is a solution posted on their blog too:
Optimizing Web.UI Client Script Deployment
And a couple more pieces of general advice:
- Load CSS at the very top of your page
- Be smart :)
I was always keen on speed improvements, and something that might be worth sharing is a couple of techniques for improving caching efficiency and effectiveness.
Let’s review what happens when a user tries to load a web page:
1. The user’s browser sends a request for the HTML (I’m skipping the DNS lookups, three-way handshake, etc.)
2. The browser parses the HTML and finds references to external objects – images, scripts, stylesheets
3. When the browser requests one of those files (an image, for example), it first checks the browser cache, with two possibilities:
3.1 The file is not in the cache, so another HTTP GET request is sent to the server
3.2 The file is in the cache – here the tricky part starts.
OK, we have the file in the cache, but the browser needs to check whether there is a newer version of the file, so it sends a conditional request to the server asking if the content has changed. Luckily, sometimes the file has not changed (determined from the timestamp) and the server answers 304 (Not Modified) – so we have saved the bandwidth, but not the HTTP request.
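The revalidation exchange above can be sketched in Python (a simplified model of the server side, not a real server – the function and body are placeholders):

```python
from datetime import datetime, timezone
from email.utils import format_datetime, parsedate_to_datetime

def respond(if_modified_since, file_mtime):
    """Server-side conditional-request handling, simplified:
    return 304 with no body when the cached copy is still current,
    otherwise 200 with the full (placeholder) body."""
    if if_modified_since is not None:
        cached = parsedate_to_datetime(if_modified_since)
        if file_mtime <= cached:
            # Not Modified: the bandwidth is saved, but the
            # round trip to the server still happened.
            return 304, b""
    return 200, b"<file contents>"

mtime = datetime(2008, 1, 10, 12, 0, 0, tzinfo=timezone.utc)
# The browser revalidates with the timestamp it stored earlier:
status, body = respond(format_datetime(mtime, usegmt=True), mtime)
print(status)  # 304
```

Note that even the "good" case costs a full HTTP request/response cycle, which is the problem the next section addresses.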
The bad news is that nowadays bandwidth savings are usually less important than the request count!
To get out of this uncomfortable situation there are HTTP Expires headers – generally, they define how long a cached object stays fresh and can be served directly from the cache without any additional modification checks.
The tip for today: add HTTP Expires headers to all objects that you don’t change often – this way, besides bandwidth, you will also save a couple of otherwise expensive HTTP requests.
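As a sketch, here is one way to build such a header value in Python (the one-year horizon is a common choice for rarely changing static files, not a requirement):

```python
from datetime import datetime, timedelta, timezone
from email.utils import format_datetime

def expires_header(days=365):
    """Build an Expires header value some time in the future; until
    then the browser may serve the object straight from its cache
    without sending any conditional request."""
    expires = datetime.now(timezone.utc) + timedelta(days=days)
    return format_datetime(expires, usegmt=True)

# Typical response headers for a rarely changing static object:
headers = {
    "Expires": expires_header(365),
    # The Cache-Control equivalent, for completeness:
    "Cache-Control": "max-age=31536000",
}
print(headers["Expires"])
```

The catch, of course, is that once a far-future Expires header is sent, the only reliable way to force an update is to change the object’s URL.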
Update: Another interesting thing that I stumbled upon is Contextual Precaching – when you click the search input on Yahoo’s homepage, they start preloading images for the search results page. Pretty neat, huh?
For a work in progress in my spare time (a WordPress theme), I decided to implement a quite useful image-merging technique that has the following pros:
- reduces HTTP requests
- usually leads to better compression
- no flickering with onmouseover events
I suggest using it when you have many similar-looking graphical elements (buttons, icons, panels) that you set as non-repeating CSS backgrounds.
Having prepared your image (in this case, a couple of merged buttons):
use CSS definitions like the following (notice the vertical background positions: 0, -31px, -62px):
background:transparent url(image-optimization.gif) no-repeat 0 0;
background:transparent url(image-optimization.gif) no-repeat 0 -31px;
background:transparent url(image-optimization.gif) no-repeat 0 -62px;
And, as usual, everything is wrapped up in a simple example.
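The offsets are just multiples of the element height (31px here), so the rules are easy to generate. A small Python sketch (the selector names and function are hypothetical):

```python
def sprite_rules(image, height, names):
    """Generate CSS rules for elements stacked vertically in one
    sprite image, each `height` pixels tall: slot i is shown by
    shifting the background up by i * height pixels."""
    rules = []
    for i, name in enumerate(names):
        offset = "0" if i == 0 else f"-{i * height}px"
        rules.append(
            f".{name} {{ background: transparent "
            f"url({image}) no-repeat 0 {offset}; }}"
        )
    return rules

for rule in sprite_rules("image-optimization.gif", 31, ["btn-a", "btn-b", "btn-c"]):
    print(rule)
```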
While investigating a performance issue on a project I’m working on, I stumbled upon this very useful article by a Google systems engineer. It is basic but still covers most of the major speed issues on today’s rich Web 2.0 websites and is worth reading:
While working on optimizing page load times for a high-profile AJAX application, I had a chance to investigate how much I could reduce latency due to external objects. Specifically, I looked into how the HTTP client implementation in common browsers and characteristics of common Internet connections affect page load time for pages with many small objects…