My £4 a month server can handle 4.2 million requests a day

Probably.

This website is hosted on a very cheap (£4 a month) VPS with 1 CPU and 2GB of RAM. Even the slowest part of this website can theoretically handle 4.2 million requests a day. I got these results by benchmarking my websites.

I will be benchmarking two websites that are hosted on this server: this very website and peepopoll.com.

Peepopoll.com is basically a single web page with some JavaScript. Everything is served straight from disk and nothing hits a database of any sort.

My website, however, uses a mixture of different techniques.

They are both hosted behind an Apache web server. My personal website uses Django, so it also has all the associated bits needed for that (WSGI and so on).
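For reference, the glue between Apache and Django is mod_wsgi importing an application object from the project's wsgi.py. The sketch below is the stock file Django generates, with a placeholder settings module name, not necessarily this site's exact setup:

# wsgi.py - the standard entry point Django generates. Apache's mod_wsgi
# imports the `application` object from a file like this.
import os

from django.core.wsgi import get_wsgi_application

# 'mysite.settings' is the stock placeholder module name, not this site's.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'mysite.settings')

application = get_wsgi_application()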

Let's benchmark my website's home page first. This is a Django "flatpage", which is basically a web page stored in a database.
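If you haven't used flatpages before, the stock django.contrib.flatpages app gives the picture. A minimal sketch (the post doesn't show this site's actual configuration, so treat the details as assumptions):

# urls.py - a minimal sketch using the stock django.contrib.flatpages app
# (it also needs 'django.contrib.sites' and 'django.contrib.flatpages' in
# INSTALLED_APPS, plus SITE_ID, in settings.py).
from django.urls import include, path

urlpatterns = [
    # Unprefixed URLs are looked up as flatpages: the URL and the HTML
    # body both live in a database row, so every uncached hit to '/'
    # costs at least one database query.
    path('', include('django.contrib.flatpages.urls')),
]

That database round trip on every request is what we're really measuring here.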

I am using ApacheBench to perform this benchmarking via the following command:

ab -n 1000 -c 100 <website>

This sends 1,000 requests to the site (-n 1000), keeping up to 100 of them running concurrently (-c 100).

My home page

ab -n 1000 -c 100 'https://mark.mcnally.je/'

Time taken for tests:   18.413 seconds
Complete requests:      1000
Failed requests:        0
Requests per second:    54.31 [#/sec] (mean)

This shows that my website can handle that level of traffic: even though it slows down considerably under load, the slowdown wouldn't really be noticeable to individual users. I don't get 54 unique visitors a month, but it's good to know that if I got that many a second my website would hold up.

A blog post

Parts of each blog post are cached with memcached for 10 minutes after they are first loaded. It will be interesting to see how quick this is compared to reading directly from the database (which is what the home page does).
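Django's cache API makes this pattern short. Below is a minimal sketch of "cache the expensive bit for 10 minutes"; the helper and key names are made up for illustration, since the post doesn't show which fragments are actually cached:

# A sketch of caching a rendered fragment for 10 minutes using Django's
# cache API (backed by memcached via the CACHES setting).
from django.core.cache import cache


def render_post_body(post_pk: int) -> str:
    # Placeholder for the real work: database reads plus template rendering.
    return f"<article>post {post_pk}</article>"


def cached_post_body(post_pk: int) -> str:
    key = f"post-body-{post_pk}"           # hypothetical cache key
    body = cache.get(key)
    if body is None:                       # cache miss: do the expensive work
        body = render_post_body(post_pk)
        cache.set(key, body, timeout=600)  # 600 seconds = 10 minutes
    return body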

ab -n 1000 -c 100 'https://mark.mcnally.je/blog/post/My%20Most%20useful%20Note%20taking%20tool/'

Time taken for tests:   15.749 seconds
Complete requests:      1000
Failed requests:        0
Requests per second:    63.50 [#/sec] (mean)

So with caching it looks like we can serve about nine more requests a second (63.50 vs 54.31), roughly a 17% improvement.

A standard webpage

As mentioned earlier, PeepoPoll.com is just an index.html file that also loads some JavaScript. This all comes straight from the filesystem. There are no frameworks in place; it's just raw HTML, CSS, and JS sitting behind a standard Apache server.

ab -n 1000 -c 100 'https://peepopoll.com/'

Time taken for tests:   5.539 seconds
Complete requests:      1000
Failed requests:        0
Requests per second:    180.54 [#/sec] (mean)

Now that's pretty good! Nearly triple the throughput of the cached page (180.54 vs 63.50 requests per second).

What this tells us

These benchmarks show that a very cheap server can easily handle 50 requests a second to a "full stack" website. Caching provides a noticeable increase in the number of requests we can handle, and serving a purely static site nearly triples it again.

If we can handle 50 requests a second, and there are 86,400 seconds in a day, that works out to 50 × 86,400 = 4,320,000 requests a day, comfortably over 4.2 million. So £4 is all you need to handle that amount of traffic! Probably.1


  1. Not taking into account any CPU, RAM, or disk I/O problems that may occur under sustained traffic, or bandwidth limits.


Last modified on Sept. 8, 2021, 12:21 p.m.

Published on Sept. 7, 2021, 9 p.m.