Improving your website’s performance

Using Cache

One of the best ways to improve your website’s performance is caching: it saves you lots of database requests and server processing time.

Using Cache for static data

Let’s start by caching all those reference tables with user roles, categories, error messages, cities, countries, etc. This normally uses very little cache memory and saves thousands of database requests, making it probably the best improvement available.
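As a sketch of the idea (the table names and the loader function here are hypothetical), a lazy-loaded in-memory dictionary is often enough for reference data:

```python
# Minimal in-memory cache for reference tables.
# The data is fetched from the database once and reused afterwards.

_reference_cache = {}

def load_from_db(table):
    # Stand-in for a real database query; assumed for illustration.
    return {"roles": ["admin", "editor", "viewer"],
            "countries": ["UK", "France", "Spain"]}[table]

def get_reference(table):
    # Only the first call for each table hits the database.
    if table not in _reference_cache:
        _reference_cache[table] = load_from_db(table)
    return _reference_cache[table]
```

Since reference data rarely changes, a process restart (or an explicit cache clear on edit) is usually all the invalidation you need.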

Caching often requested dynamic data

Imagine you have a real estate website for the UK with thousands of possible requests (as there are thousands of cities, towns, combinations of filters, etc.). But you notice that, due to population peaks, requests for London, Manchester, Liverpool and a few other cities account for around 30% of all requests. Of course there are filters, but the first request every user makes fetches exactly the same data as everyone else’s… so why not cache the standard “houses to rent in London” dashboard? You would be able to serve it without even touching the database.

Something similar can happen with the user object: if a user starts a session on our website, it is worth caching that user for, say, 10 minutes since their last access, so we don’t need to query the database for this user on every request.

Warning: make sure you invalidate the cache when sensitive data changes (like the user object), though you can consider keeping slightly stale data for a while when that’s not a big problem (like the dashboard).

Caching static pages

We may have pages that always look the same no matter who requests them and therefore don’t need to be generated dynamically. We can cache those pages and serve them directly, without the server having to generate anything.

Examples of such pages are the “Terms and conditions” page, the Contact page or even the Login page, since there is no custom user data to show before the user logs in.
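A minimal sketch of page caching, assuming a hypothetical `render` function that stands in for your templating engine: the rendered HTML is stored once per path and served as-is afterwards.

```python
# Caching whole rendered pages (sketch; the renderer is a stand-in).
_page_cache = {}

def cached_page(render):
    def wrapper(path):
        # Render each static page once, then serve the cached HTML.
        if path not in _page_cache:
            _page_cache[path] = render(path)
        return _page_cache[path]
    return wrapper

@cached_page
def render_page(path):
    # Stand-in for expensive template rendering.
    return "<html><body>Rendered %s</body></html>" % path
```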

Caching pieces of a page

Depending on the technology you are using, you may even cache parts of a page. Imagine you want a header with custom user links/data, which rules out caching the page as fully static. You could still cache the static parts as “cached pieces” and generate only the part that’s dynamic. Or take it to the limit and cache the header too, with a short lifetime: caching every current user’s dynamic header for 10 minutes shouldn’t use much memory and will save processing time.
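As a sketch of fragment caching (fragment names and markup are made up for illustration): the static body and footer are cached once, and only the per-user header is generated on each request.

```python
# Fragment caching: reuse the static pieces, build only the dynamic header.
_fragment_cache = {}

def cached_fragment(key, build):
    # Build each static fragment once, then reuse it for every request.
    if key not in _fragment_cache:
        _fragment_cache[key] = build()
    return _fragment_cache[key]

def render_page(user_name):
    body = cached_fragment("body", lambda: "<main>Site content</main>")
    footer = cached_fragment("footer", lambda: "<footer>(c) Example</footer>")
    header = "<header>Hello %s</header>" % user_name  # dynamic, per user
    return header + body + footer
```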

Caching the Home page

Again on the limit, but think about it: the Home page is the page your users request most often. Even if their landing page is an internal one, they like to land, go inside, come back to the Home page, and so on. Caching the full Home page can be an incredibly wise move… unless the headers/sidebars use custom user data (even then you can try caching most of it).

Consider cleaning up the Home page so it is identical for all users (instead of “Hello [username]”, just show impersonal “My profile” and “Logout” links, for example). That lets you cache the page in your server configuration rather than in your server-side technology (.Net/Java/PHP), so requests don’t even reach your application logic: IIS/Apache gets the request and sends the answer without ever calling your .Net/Java/PHP code.
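As a sketch of that server-level approach, a minimal Apache `mod_cache` configuration could look like the following (assuming the modules are available; paths and expiry times are illustrative, not a drop-in setup):

```apache
# Enable disk caching at the server level, so cached responses are served
# before the request ever reaches the application code.
LoadModule cache_module modules/mod_cache.so
LoadModule cache_disk_module modules/mod_cache_disk.so

# Answer from the cache as early as possible in request processing.
CacheQuickHandler on
CacheEnable disk "/"
CacheRoot "/var/cache/apache2/mod_cache_disk"
# Cache responses without explicit expiry for 10 minutes by default.
CacheDefaultExpire 600
```

IIS offers an equivalent through its output caching settings; the principle is the same either way.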

Database

Reduce the amount of requests

We can do this by caching and also by grouping requests together. Let’s imagine I want to create a post that has labels, categories, a title, etc.

I could do this step by step: “check if there is any new label -> create the label in the db”, “check if there is any new category -> create the category in the db”, “-> create the post in the db”, “-> insert details in the audit table”. Or, alternatively, I could write a stored procedure (or use an SQL statement builder like Entity Framework) to make sure I perform all those operations in just one request to the database.
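A minimal sketch of the grouping idea, using Python’s built-in sqlite3 as a stand-in for a real database server (the schema is invented for the example): all four writes go out in a single transaction instead of four separate round trips.

```python
import sqlite3

# Illustrative schema: labels, categories, posts and an audit table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE labels (name TEXT PRIMARY KEY);
    CREATE TABLE categories (name TEXT PRIMARY KEY);
    CREATE TABLE posts (title TEXT, label TEXT, category TEXT);
    CREATE TABLE audit (action TEXT);
""")

def create_post(title, label, category):
    # One transaction groups the label, category, post and audit writes.
    with conn:
        conn.execute("INSERT OR IGNORE INTO labels VALUES (?)", (label,))
        conn.execute("INSERT OR IGNORE INTO categories VALUES (?)", (category,))
        conn.execute("INSERT INTO posts VALUES (?, ?, ?)",
                     (title, label, category))
        conn.execute("INSERT INTO audit VALUES (?)", ("created post " + title,))
```

On a real client/server database, a stored procedure pushes this even further: one network request carries the whole operation.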

Reduce the amount of transmitted data

I know this sounds crazy, but it can happen that someone decides to fetch all the rows from a table and filter them in application code. Which is fine… as long as the table doesn’t have thousands of rows… or hundreds of thousands.

If you are going to filter, paginate or even sort, let the database do its job. First of all, the database is a data engine: it is specialized in exactly these operations and will almost certainly filter and order better than your code. Secondly, transmitting a huge amount of data from your database server to your web server consumes incredible amounts of resources, even when both run on the same machine and nothing crosses the network, because you are still burning RAM for it!
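Here is a sketch of pushing the work down to the database (sqlite3 again as a stand-in, with a made-up `houses` table): the `WHERE`, `ORDER BY`, `LIMIT` and `OFFSET` clauses mean only one page of rows ever leaves the engine.

```python
import sqlite3

# Illustrative data set: 1000 property listings across two cities.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE houses (city TEXT, price INTEGER)")
conn.executemany("INSERT INTO houses VALUES (?, ?)",
                 [("London", 1000 + i) for i in range(500)] +
                 [("Leeds", 800 + i) for i in range(500)])

def london_page(page, per_page=20):
    # Filtering, sorting and pagination all happen inside the database,
    # so only one page of rows is transmitted to the application.
    return conn.execute(
        "SELECT city, price FROM houses WHERE city = ? "
        "ORDER BY price LIMIT ? OFFSET ?",
        ("London", per_page, page * per_page)).fetchall()
```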

Prepare data calculations beforehand

This is similar to caching, but instead of storing the data in RAM we actually store it in the database, because retrieving it requires expensive operations. This is normally done for aggregate values like “number of users”, “number of posts”, “top 5 posters”, etc.

Some of those operations may take so long that you prefer to generate the fields overnight with a 24-hour refresh, or go for an “every hour / every 15 minutes” schedule. In other cases the operations take far fewer resources, and instead of creating a job you may consider updating a summary table from time to time when the data is requested, or using a view. For example, if you just need to know how many active users you have, it doesn’t take that much to calculate, but you still relieve the database by saving the value and refreshing it periodically.
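The “active users” example can be sketched like this (sqlite3 as a stand-in; the `users`/`stats` schema is invented): a periodic job writes the precomputed count into a stats table, and page requests read the cheap stored value instead of re-running the aggregate.

```python
import sqlite3

# Illustrative schema: raw users plus a table of precomputed stats.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users (name TEXT, active INTEGER);
    CREATE TABLE stats (key TEXT PRIMARY KEY, value INTEGER);
""")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [("u%d" % i, i % 2) for i in range(100)])

def refresh_active_users():
    # Run this from a scheduled job (hourly, nightly...), not per request.
    count = conn.execute(
        "SELECT COUNT(*) FROM users WHERE active = 1").fetchone()[0]
    conn.execute("INSERT OR REPLACE INTO stats VALUES ('active_users', ?)",
                 (count,))

def get_active_users():
    # Cheap lookup against the precomputed value.
    row = conn.execute(
        "SELECT value FROM stats WHERE key = 'active_users'").fetchone()
    return row[0] if row else 0
```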

Server Requests and transfer

Using data compression

Most browsers support data compression; you can configure your server to check for it and, when the browser is compatible, return compressed responses. This option consumes more processor resources, as the server needs to compress each response, but it reduces bandwidth consumption and speeds up the transfer, improving your users’ experience.
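In practice you would enable this in the server configuration (mod_deflate in Apache, dynamic compression in IIS), but the negotiation itself can be sketched in a few lines: compress only when the client’s `Accept-Encoding` header advertises gzip support.

```python
import gzip

def compress_response(body, accept_encoding):
    # Compress the body only when the browser declares gzip support.
    if "gzip" in accept_encoding:
        return gzip.compress(body), {"Content-Encoding": "gzip"}
    return body, {}

# Repetitive HTML compresses extremely well.
html = b"<html>" + b"<p>repetitive content</p>" * 500 + b"</html>"
compressed, headers = compress_response(html, "gzip, deflate")
```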

Reducing file requests

One of the things that may overload your server is the number of requests a user makes just to access a page, since the browser has to download many files: CSS, JavaScript, images, etc. You can reduce the number of these requests by bundling files together: all the styles in a single .css file, the same for scripts, and so on.
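The bundling step is usually part of your build process; as a toy sketch (the file names and contents are made up), it amounts to concatenating the sources into one file the browser fetches with a single request:

```python
# Sketch of a build step that bundles several stylesheets into one file,
# so the browser makes one request instead of one per stylesheet.
styles = {
    "reset.css":  "* { margin: 0; }",
    "layout.css": "body { display: flex; }",
    "theme.css":  "body { color: #333; }",
}

def bundle(files):
    # Real build tools would read these from disk and minify them too.
    return "\n".join(files[name] for name in sorted(files))

site_css = bundle(styles)
```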

As for the scripts, you can alternatively use Google’s CDN or some other public server to get common libraries like jQuery. Your users fetch the file from another server (saving that request to yours) and, since they may have downloaded the same script before while visiting another site, the page loads quicker because it’s already in their cache. Gravatar does something similar with your users’ avatars.

Using image sprites

Continuing with request reduction, you can use image sprites to place more than one image in the same file, so the user requests a single sprite that contains several images. Then, with some CSS tricks, you can show just the needed piece of that sprite in the browser. This trick works perfectly for smilies: you don’t want 50 requests hitting your server every time a user loads your “post page” for the first time.
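The CSS trick is a fixed-size element whose background is shifted to the right slice of the sprite. A minimal sketch (the file name, icon size and class names are invented for illustration):

```css
/* One sprite image holds several 16x16 icons side by side; each class
   shows a different slice by shifting the background position. */
.icon {
  width: 16px;
  height: 16px;
  background-image: url("smilies-sprite.png"); /* hypothetical file */
}
.icon-smile { background-position: 0 0; }
.icon-wink  { background-position: -16px 0; }
.icon-laugh { background-position: -32px 0; }
```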
