Bandwidth-related performance issues

While it's true that the typical user's bandwidth has grown significantly in recent years, bandwidth nevertheless remains a constant concern for web developers. However, many developers don't realize that there are actually two main sources of bandwidth problems. The first is fairly obvious: forcing your users to download files that are too large. But there is also a second, less obvious source: forcing your users to download too many files at once.

Downloading excessively large files

Let's start with the obvious source. If the files that your users need to download are too large, it doesn't matter whether they are images, videos, or JavaScript code files: your site is going to load slowly. However, you can make a big difference in the size of any file you serve by enabling compression at the web server level. On an Apache web server, this can be done by using mod_deflate, and most other web servers have similar options. Doing so will make your server compress the files that it sends to your users in such a way that your users' browsers can easily decompress them … all without the user even knowing that any decompression is going on.
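As a rough sketch, enabling this on Apache can be as simple as the following configuration (the module path and the exact list of MIME types to compress are assumptions that vary by installation):

    # Load mod_deflate and compress common text-based responses.
    LoadModule deflate_module modules/mod_deflate.so
    AddOutputFilterByType DEFLATE text/html text/css application/javascript

Note that binary formats such as JPEG or MP4 are already compressed, so there is little point in including them in the filter.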

However, if turning on compression doesn't help enough, then your next steps depend on the file type. If your issues come from images or videos, then you simply have to find a way to use smaller files, for instance, by lowering their resolution. However, if JavaScript files are your main concern, there is another option: using a minification program.

Minification programs parse your code to create a new, optimized version of it that eliminates comments, removes extra whitespace, and renames local variables to shorter names. The only downside to using such a program is that it will make it harder for you to debug problems on your production servers, which shouldn't be an issue as long as you have a matching development environment where you don't minify your files. Further, if the minification truly becomes a problem, you can always temporarily switch your server back to the unminified files to do your debugging.
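For instance, a minifier might transform a function like this made-up example (the exact output depends on the tool):

    // Before minification: comments, whitespace, and readable names.
    function calculateTotal(prices, taxRate) {
        var total = 0;
        for (var i = 0; i < prices.length; i++) {
            total += prices[i];
        }
        return total * (1 + taxRate);
    }

    // After minification, one possible output (note that the global
    // function name is preserved, while local names are shortened):
    // function calculateTotal(r,n){var t=0;for(var a=0;a<r.length;a++)t+=r[a];return t*(1+n)}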

Together the two techniques of web server compression and minification can result in a major difference in file size. For example, the uncompressed jQuery library (version 1.11.0) is 276 KB, while the zipped version is only 82 KB and the zipped version of the minified jQuery code is only 33 KB. In other words, just by using these two techniques, it's possible to reduce jQuery's footprint by almost a factor of ten!

Downloading an excessive number of files

Unfortunately, even if you reduce the size of your code files and assets, there is another, more subtle bandwidth issue to worry about, one that has nothing to do with the number of bytes your user downloads. To understand this issue, you have to understand how browsers handle requests for data, both those that come from the DOM (such as the <link> and <script> tags) and AJAX requests.

When a browser opens up a connection to a particular remote computer, it keeps track of how many other connections are already open to that host, and if too many are already open, it pauses until one of the previous requests completes. The exact limit varies by browser: in Internet Explorer 7, it's only two, but in most modern browsers, it's six or eight. Because of this limit, and because each request, no matter how small, takes a certain minimum amount of time (also called latency), the actual amount of bandwidth used can be irrelevant. For example, with a six-connection limit and 50 ms of latency per request, 30 tiny requests will take at least five round trips, or 250 ms, no matter how few bytes each one transfers. There are two main approaches to solving this kind of bandwidth issue.

The first is, obviously, to make fewer requests. If your problem is too many Models being fetched at once, Collections can be very helpful in solving it; instead of fetching each Model individually, simply create an endpoint on your server that can return all of the Models at once, and then use a Collection class to fetch them. Even though you will be downloading the same amount of data, this change in API will result in significantly fewer requests. Similarly, if your problem is too many images, you can combine all of the images into a single sprite file and then use CSS to display only one image at a time.
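A minimal sketch of this in Backbone (the Todo model and the /todos endpoint are assumptions, just for illustration):

    // One request per model: N models cost N round trips.
    var Todo = Backbone.Model.extend({ urlRoot: '/todos' });
    new Todo({ id: 1 }).fetch();
    new Todo({ id: 2 }).fetch();
    new Todo({ id: 3 }).fetch();

    // One request for all models: the server returns a single JSON
    // array, and the collection builds a model from each element.
    var Todos = Backbone.Collection.extend({
        model: Todo,
        url: '/todos'
    });
    var todos = new Todos();
    todos.fetch(); // a single GET /todos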

The other option, if your application truly does require a large number of requests, is to use subdomains. When a browser counts how many connections it has outstanding, it counts them per host name, so foo.example.com is tracked separately from example.com. This means that you can fetch the maximum number of requests from http://example.com/ and then fetch that same number of requests from foo.example.com, bar.example.com, and so on. This trick is often used to serve CSS and images more quickly, but it can just as easily be used to make a large number of simultaneous fetches (as long as you update your url methods appropriately to fetch from the correct subdomain).
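A sketch of such a url method (the host names and the modulo scheme are assumptions; any deterministic mapping from model to subdomain will do):

    var SUBDOMAINS = ['foo', 'bar', 'baz'];
    var Asset = Backbone.Model.extend({
        url: function () {
            // Derive the subdomain from the model's id so that the same
            // model always resolves to the same host (cache friendly).
            var host = SUBDOMAINS[this.id % SUBDOMAINS.length];
            return 'http://' + host + '.example.com/assets/' + this.id;
        }
    });

Keep in mind that AJAX requests to a different subdomain are cross-origin, so the servers behind those subdomains will need to send the appropriate CORS headers.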

Finally, there is one last solution to bandwidth issues, which doesn't really solve these issues so much as make them more palatable to the user. If you know that you're going to be making a request that will take long enough for the user to notice, you can give the user a visual wait indicator, such as an animated spinner image, or change the CSS cursor property to wait. If you make these changes just before you start a fetch operation, you can then use that operation's success and failure callbacks (or, if you use the deferred style, a single always callback) to undo the changes. While this won't make your data download any faster, it will make a difference in your user's experience.
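A minimal sketch, assuming a #spinner element and the todos collection from earlier:

    // Show the indicators just before starting the fetch.
    $('#spinner').show();
    $('body').css('cursor', 'wait');

    // fetch returns a jQuery promise, so always runs on both
    // success and failure, undoing the indicators either way.
    todos.fetch().always(function () {
        $('#spinner').hide();
        $('body').css('cursor', 'auto');
    });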
