It’s crucial for a website like ours to be as responsive as possible. One of our goals over the last couple of months has been to improve the site’s performance. Not that page load time was particularly slow for a single-page web app, but we knew there was room for improvement. This post takes a look at some of the steps we took.

Before we even start you might be thinking: “Aren’t there libraries which solve all these problems for you? Why not just install one of those and be done with it?”

Normally I’d probably agree with you, and in this case we did start with a library: we used Django Compressor. But (for legacy reasons we won’t go into here) it drove us insane. Eventually we just bit the bullet and built our own. Sometimes it’s easier to roll your own library than to bend an existing solution to fit all your needs. What was probably most surprising about this process was just how many little changes were necessary to finally make a dent in the numbers. Here’s how we did it.

Measure Twice, Cut Once

We already knew about the Performance Golden Rule:

80-90% of the end-user response time is spent on the frontend.

Start there.

And two crucial tools: PageSpeed and YSlow. These tools will help you identify the problems that make your site slow.

Here’s what we did:

Enable GZIP

We had GZIP enabled for all our pages (through Django middleware), but we were serving static assets from an S3 bucket. There’s a workaround to this problem, but it has the side effect of breaking things for clients that don’t support GZIP. That’s not acceptable, and it’s an unnecessary problem.

Instead, serve your assets from your web server and put a reverse-proxy CDN in front to cache them. Both Amazon CloudFront and CloudFlare can handle this; we found that CloudFlare (not Amazon) was easier to use and performed no worse. GZIP will be handled properly (with the appropriate Vary headers added) and you’ll get two nice side benefits: (1) it works the same way locally and (2) there’s no additional step when you deploy.

Combine Scripts & Styles

We were already doing this, but it’s worth mentioning because it makes such a big impact. Something like this:

    import hashlib
    import os

    def get_package(typ, package):
        # root_dir, get_file, and process_javascript are defined elsewhere.
        contents = ""
        for file_name in package:
            file_path = os.path.join(root_dir, file_name)
            _, file_contents = get_file(file_path)
            contents += file_contents + "\n"
        if typ == "js":
            contents = process_javascript(contents)
        return (hashlib.sha1(contents.encode("utf-8")).hexdigest(), contents)

Don’t Combine All Scripts

This conflicts with the previous tip, but it turns out there’s a major downside to combining all your scripts: any time one of them changes, the whole bundle has to be re-downloaded. Depending on the size of your scripts, that can be a huge hit. So instead, consider breaking one large package into a couple of smaller ones: one to house relatively unchanging libraries (like jQuery) and another to house the code you’re changing every day.

The other key here is to use a library like head.js to load these scripts asynchronously (so they download in parallel).
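Concretely, the split can be just a pair of package definitions; the file names here are hypothetical:

```python
# Stable third-party libraries go in one bundle, which browsers can keep
# cached for months; day-to-day application code goes in another.
PACKAGES = {
    "libs": ["jquery.js", "underscore.js", "backbone.js"],
    "app": ["models.js", "views.js", "main.js"],
}

for name, files in sorted(PACKAGES.items()):
    print(name, "->", ", ".join(files))
```

A change to main.js then invalidates only the small app bundle, leaving the cached libs bundle untouched.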

Embed Images

Embed your images directly in your CSS files using data URIs. You can generate them fairly easily:

    import base64
    import os

    def get_data_url(fp):
        # mime maps extensions to content types, e.g. {".png": "image/png"}.
        _, ext = os.path.splitext(fp)
        with open(fp, "rb") as handle:
            encoded = base64.standard_b64encode(handle.read()).decode("ascii")
        return "data:" + mime[ext] + ";base64," + encoded

Embedded images like this work almost across the board; a simple fallback for browsers that can’t handle data URIs (IE < 8) is just to serve the images directly.
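One way to handle the fallback is to generate two stylesheet variants: embedded for modern browsers, plain URLs for old IE. A rough sketch (the MIME table and the /static/img/ path are made up for illustration):

```python
import base64
import os

MIME = {".png": "image/png", ".gif": "image/gif", ".jpg": "image/jpeg"}

def background_rule(fp, embed=True):
    # embed=True inlines the image as a data URI; embed=False falls back
    # to a plain URL for browsers without data-URI support (IE < 8).
    if not embed:
        return "background-image: url(/static/img/%s);" % os.path.basename(fp)
    _, ext = os.path.splitext(fp)
    with open(fp, "rb") as handle:
        encoded = base64.standard_b64encode(handle.read()).decode("ascii")
    return "background-image: url(data:%s;base64,%s);" % (MIME[ext], encoded)
```

Generate the stylesheet once with embed=True and once with embed=False, and serve the fallback version to old IE (e.g. via a conditional comment).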

If that doesn’t work out, at least sprite your images together; then only one request is needed.

Defer 3rd Party Libraries

For 3rd-party libraries that aren’t really needed immediately (Facebook, Twitter, Google+, …), load them after page load. This works for tracking libraries too.

Static Domain

Use a separate cookieless domain to serve assets. Since the browser sends no cookies with those requests, this cuts down on HTTP request overhead.


Use a URL scheme like this (paths illustrative):

    /static/a1b2c3d4e5/js/main.js

Not like this:

    /static/js/main.js?v=a1b2c3d4e5

because proxies don’t like the ? in the URL. Or like this:

    /static/js/a1b2c3d4e5.js

because that makes things really hard to debug. Also, use the contents of the file to generate the hash, not a timestamp: ever-so-slight variations between server timestamps are something you’d rather not have to think about.
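A minimal sketch of content-based versioning (the URL layout is illustrative):

```python
import hashlib

def asset_url(path, contents):
    # Hash the bytes of the asset itself, not a timestamp: every server
    # derives the same digest from the same contents, so the generated
    # URLs agree across machines and deploys.
    digest = hashlib.sha1(contents.encode("utf-8")).hexdigest()[:10]
    return "/static/%s/%s" % (digest, path)

print(asset_url("js/main.js", "alert('hi');"))
```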

Caching Headers

Use the appropriate caching headers: set Cache-Control to public and give it a really long max-age (a year, say). Because we use hashes in our asset URLs, we never have to worry about a stale cache.
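In header form, that amounts to something like this (values are a sketch, not a prescription):

```python
ONE_YEAR = 60 * 60 * 24 * 365  # seconds

def asset_headers():
    # Safe to cache "forever": changing a file changes its hash, which
    # changes its URL, so a stale copy can never be served.
    return {"Cache-Control": "public, max-age=%d" % ONE_YEAR}

print(asset_headers())
```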


Finally, set up your library so that you can add processing steps to the outputs of JS and CSS packages. This lets you minify your JavaScript (strip comments and whitespace), and it also opens the door to more sophisticated processing (for example, SASS or CoffeeScript compilation).
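Such a pipeline can be little more than a list of functions applied in order. A sketch, with a toy comment-stripping step standing in for a real minifier:

```python
def process(contents, steps):
    # Feed the combined package through each processing step in order,
    # e.g. a CoffeeScript compiler followed by a minifier.
    for step in steps:
        contents = step(contents)
    return contents

def strip_comments(js):
    # Toy minification step: drop whole-line // comments and blank lines.
    lines = js.splitlines()
    kept = [ln for ln in lines if ln.strip() and not ln.strip().startswith("//")]
    return "\n".join(kept)

print(process("// header comment\nalert(1);\n\nalert(2);\n", [strip_comments]))
```

Adding SASS or CoffeeScript support is then just a matter of prepending another step to the list for the relevant package type.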