Optimize Your Site for High Traffic

Imagine your webpage going viral on one of the most popular social news websites. You could get thousands of views, and with them conversions, followers, and subscribers – but instead your website goes down. You miss out on all that traffic because your site won’t respond. This happens to many people, but it doesn’t have to happen to you.

The time to protect yourself is now. You can optimize your website to handle all those visitors instead of having your site lock up.

Sites lock up because the server isn’t optimized to handle so many requests at such a high rate. When a site hits the front page of a social media website, it gets a burst of clicks. This is where server efficiency comes into play. Follow the instructions below to optimize your site for such bursts.

MaxClients and KeepAliveTimeout

Let’s start from the root. If your website is hosted on a VPS or a dedicated server, you have root access and can tune Apache itself (assuming, of course, that you use Apache). The two settings you’ll look at are MaxClients and KeepAliveTimeout. Open /etc/httpd/conf/httpd.conf (the path can differ by distribution) and find the line which starts with ServerLimit. Right below it you’ll see MaxClients. This is how many clients Apache will serve at one time; the rest are queued up. Your server is probably powerful enough to handle more (you can find other guides to help you pick a number), so go ahead and raise the value if your hardware allows it. Keep ServerLimit and MaxClients identical, and don’t raise them too high or your server may run out of memory and lock up.
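Assuming the prefork MPM, the relevant section of httpd.conf might look like this. The numbers below are purely illustrative; size them to your server’s RAM:

```apache
# /etc/httpd/conf/httpd.conf (prefork MPM) – example values only
<IfModule prefork.c>
    ServerLimit        150
    MaxClients         150
    StartServers         8
    MinSpareServers      5
    MaxSpareServers     20
</IfModule>
```

Each Apache child process consumes memory, so a rough upper bound is (free RAM) divided by (memory per Apache process).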

The other setting we’ll be looking at in httpd.conf is KeepAliveTimeout. This specifies how many seconds Apache will wait for another request before closing a KeepAlive connection; Apache’s default is 15 seconds. If your KeepAliveTimeout is 15 and your MaxClients is 25, then you see the obvious problem: in the worst case, 25 idle connections can tie up every slot for 15 seconds at a time. Since KeepAliveTimeout only counts the seconds since the last request, it’s safe to change it to 1 or 2. Your visitors won’t notice a difference in loading time, but your server will be able to serve far more requests. Save the file and restart your Apache server to apply the changes.
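A quick sketch of the KeepAlive lines in httpd.conf (2 seconds is a suggestion, not a rule):

```apache
# /etc/httpd/conf/httpd.conf
KeepAlive          On
KeepAliveTimeout   2
```

Then restart Apache, e.g. with `service httpd restart`, or use `apachectl graceful` for a gentler reload that doesn’t drop in-flight requests.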

Cache MySQL Queries

MySQL has a great query cache which can help your site, especially if you have large tables. It caches query results and serves the cached version only when an identical query comes in again (don’t worry if that sounds abstract). The cache is typically enabled by default, but its memory limit is usually too low. You can change this in /etc/my.cnf: find query_cache_size and set it to a higher value (16M or 20M is a reasonable starting point). Save the file and restart your MySQL server to apply the change.
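A hedged example of the relevant my.cnf lines (the exact values are illustrative; tune them to your workload):

```ini
# /etc/my.cnf
[mysqld]
query_cache_type = 1      # enable the query cache
query_cache_size = 16M    # memory reserved for cached results
```

After restarting MySQL you can run `SHOW VARIABLES LIKE 'query_cache%';` to confirm the new size took effect.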

Reduce PHP and Database Queries

If you run a CMS like WordPress or Joomla, take a look at your theme’s code and see where you could remove some PHP. Replacing PHP with static HTML will make your pages faster because the code doesn’t have to be executed, and it reduces CPU load. Here is an example of wasteful PHP usage in a WordPress theme:

<a href="<?php bloginfo('url'); ?>"><?php bloginfo('name'); ?></a>

You could save two queries by changing the PHP into static text. After all, why should the blog’s URL or name be looked up on every page view? If you ever do change either of them, you can simply come back to the code and edit the static text.

<a href="http://w3techie.com">TechGeeks Blog</a>

Both lines look exactly the same to search engines and users. The only difference is that the unoptimized version queries the database twice and increases server load (and page load time).

Optimize MySQL Databases

When a script deletes a row in a MySQL database, the space where the data used to be is called overhead. You should optimize your databases every once in a while; optimization reclaims all the overhead. Less overhead means that some database operations will be faster (such as a SELECT * query). You can optimize tables through phpMyAdmin, and plugins for popular CMS platforms can also optimize your tables for you.
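If you prefer the command line, the same cleanup is a single SQL statement. The table names below are hypothetical examples from a WordPress install; substitute your own:

```sql
-- Check how much overhead (the Data_free column) each table carries
SHOW TABLE STATUS;

-- Reclaim the overhead on specific tables
OPTIMIZE TABLE wp_posts, wp_comments;
```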

Client Side Caching

How often do you change your website’s CSS, images, or JavaScript? Most of these files rarely change, but if they don’t get cached they have to be downloaded again on every single page of your website. If your visitors decide to check out other pages of your site, you can greatly decrease the server load they generate by caching static objects on the client side. The easiest way to do this site-wide is through your .htaccess; it will help you survive your reddit front page appearance.
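As a sketch, here is what such an .htaccess rule could look like using mod_expires. The lifetimes are illustrative, and the snippet assumes the module is enabled on your host:

```apache
# .htaccess – let browsers cache static files
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/png              "access plus 1 month"
    ExpiresByType image/jpeg             "access plus 1 month"
    ExpiresByType text/css               "access plus 1 week"
    ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```

Returning visitors will then load those files from their browser cache instead of hitting your server again.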

Server Side Caching

This is easy if you use any popular CMS. We use WordPress, and we use WP-Super-Cache for our server side caching. The plugin generates static HTML versions of our pages so no PHP execution has to be done for each visitor. This server side caching is so powerful that just this alone could let you survive on the front page of major social media sites.

Compression

Gzip is one of the easiest compression technologies you can enable on Apache. You can do it at the server level, through your .htaccess, or on a page level with a single line of code. Even though gzip will increase CPU usage slightly, it can reduce your page size by around 70%, and a faster-loading page is important.
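For the .htaccess route, a hedged sketch using mod_deflate (the content-type list is an assumption; older hosts may offer mod_gzip instead):

```apache
# .htaccess – compress text-based responses
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```

At the page level, the “single line of code” in PHP is typically `<?php ob_start('ob_gzhandler'); ?>` at the very top of the script.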

We also recommend that you compress your images using Yahoo’s SmushIt tool. The tool removes metadata and unneeded bytes from images, decreasing image size without sacrificing quality. If your images are hosted on your own server, it’s essential to keep them as small as possible to cope with traffic spikes.

Offload Scripts

Google has a public AJAX Libraries API which you can use to serve some large scripts (like jQuery). The scripts will be served from Google’s CDN, which has three advantages. First, you offload the scripts so your visitors won’t have to download them from your site. Second, browsers can keep downloading your own assets in parallel while the script loads from another domain (browsers limit concurrent downloads per domain). Third, there is a good chance your visitors have already downloaded and cached these scripts from another site that uses Google’s CDN, so they might not have to download them at all.
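A hedged example of loading jQuery from Google’s CDN instead of your own server (the version number is illustrative; pin whichever release your site actually uses):

```html
<!-- Served from Google's CDN instead of your own server -->
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
```

Drop this into your page’s head in place of a locally hosted copy of the script.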

Conclusion

Combine all these techniques and your website will be far better prepared to handle a traffic spike. You’ll also have a faster-loading website, which can decrease your visitor bounce rates and increase visitor satisfaction. We also have a post about optimizing PHP on Apache which can help you decrease server load and speed up your pages.
