We’ve now put our latest addition to the Derpibooru server family into production service, and this morning I spent an hour or two going over our HTTP cache configuration. We’re now seeing massively improved hit rates and overall pipelining performance.
What’s that mean for you? Well, stuff’s gonna go faster. We’ve also moved from a teensy tiny little 6 gigabyte HTTP cache to a 60 gigabyte cache - 24 gigs in RAM and another 36 on a solid state disk. This, coupled with smarter decisions inside our cache server about what we cache and how long we keep it, is already yielding a massive speed boost.
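Purely as an illustration of the RAM/SSD split described above: the post doesn’t say which cache server we run, but if it were something Varnish-like, a 24 GB memory store plus a 36 GB file store on the SSD could be expressed with two storage backends along these lines (names and paths here are made up):

```shell
# Hypothetical varnishd invocation - one malloc (RAM) storage backend
# and one file storage backend on the SSD, mirroring the 24G + 36G
# figures above. The listen address and file path are placeholders.
varnishd -a :80 \
  -s memcache=malloc,24G \
  -s ssdcache=file,/mnt/ssd/varnish_storage.bin,36G
```

The idea is simply that hot objects live in RAM while the long tail spills onto the SSD, which is still far faster than regenerating content on the backend.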
We’ve also now configured our cache to provide a grace period to the backend. This is fancy talk for having the cache ask the backend once - and only once - for any given image or thumbnail. Previously our cache server forwarded every request to the backend, meaning we ended up with a lot of duplicated effort on our server generating thumbnails and the like. The grace period means we only need to generate any image or other resource once, and the cache will handle it for everyone asking for it (at which point it’s crazy fast). This means the application servers are much calmer, have more resources, and will answer requests for pages faster.
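The ask-once behavior described above is often called request coalescing. Here’s a toy sketch of the idea in Python (not our actual cache server’s code) - a per-key lock ensures that when many requests arrive for the same uncached thumbnail, only the first one does the expensive generation work and everyone else waits for that result:

```python
import threading

class SingleFlightCache:
    """Toy request-coalescing cache: the first request for a key runs the
    expensive backend work; concurrent requests for the same key block on
    a per-key lock and reuse that one result instead of duplicating it."""

    def __init__(self):
        self._cache = {}
        self._locks = {}
        self._guard = threading.Lock()
        self.backend_calls = 0  # counts how often the backend actually ran

    def _lock_for(self, key):
        # One lock per key, created lazily under a global guard.
        with self._guard:
            return self._locks.setdefault(key, threading.Lock())

    def get(self, key, generate):
        if key in self._cache:            # fast path: already cached
            return self._cache[key]
        with self._lock_for(key):         # at most one generator per key
            if key not in self._cache:    # re-check after acquiring the lock
                self.backend_calls += 1
                self._cache[key] = generate(key)
            return self._cache[key]

# Ten concurrent requests for the same thumbnail -> one backend call.
cache = SingleFlightCache()
results = []
def worker():
    results.append(cache.get("thumb-123", lambda k: f"rendered:{k}"))
threads = [threading.Thread(target=worker) for _ in range(10)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(cache.backend_calls)  # 1
```

Real cache servers do this (plus serving stale content while refreshing) natively; the sketch just shows why the application servers see so much less work.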
As for Spitfire itself, it’s an Intel Xeon W3520 with 4 cores (8 threads), 24 gigabytes of DDR3 RAM, 80GB of solid state disks and 2 terabytes of RAID1 storage, which will be coming online as a mirror to our GlusterFS store for images in the coming week, taking some load off Tank and, longer term, replacing Tank altogether. With 1Gbps of connectivity it’s much better suited to serving up images than Tank was.
If you’re interested, you can inspect the HTTP headers of images to see whether they were served from cache (X-Cache), how many times that object has been served from cache (X-Cache-Hits) and so on.
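If you want to poke at those headers programmatically, here’s a small sketch of interpreting them. It assumes the common convention of X-Cache reporting `HIT` or `MISS` and X-Cache-Hits being a plain counter - the exact values depend on our cache configuration, so treat this as illustrative:

```python
def cache_status(headers):
    """Given a mapping of HTTP response headers, report whether the
    response came from the cache and how many times that cached object
    has been served. Assumes X-Cache is "HIT" or "MISS" and
    X-Cache-Hits is a decimal counter (common, but configuration-dependent)."""
    status = headers.get("X-Cache", "MISS")
    hits = int(headers.get("X-Cache-Hits", 0))
    return status == "HIT", hits

# e.g. headers copied out of your browser's network inspector:
print(cache_status({"X-Cache": "HIT", "X-Cache-Hits": "42"}))  # (True, 42)
print(cache_status({"X-Cache": "MISS"}))                       # (False, 0)
```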
tl;dr we’ve got a new box and it is speeding up all of your image deliveries and taking massive load off the main server, enjoy.
As ever, Derpibooru is paid for without advertising - please consider donating over at this link to support our hosting costs.