Caching best practices

A cache is a place where you can store temporary data in order to increase the performance of your applications. Here are some small tips to help you get the most out of your cache.

Performance impact

Adding a cache layer to your application always has a performance impact that you need to measure, no matter where in the application you add it. You need to measure that impact to know whether the new cache layer is a good choice. First, gather some metrics without the cache layer; once you have those numbers, enable the cache layer and compare the results. Sometimes you will find that the benefit of a cache layer is outweighed by the effort of keeping it running. You can use some of the monitoring services we talked about in the previous chapters to measure the performance impact.
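As a rough illustration, the following is a minimal sketch of such a comparison, assuming hypothetical fetchFromDatabase() and fetchFromCache() helpers that perform the same lookup with and without the cache layer:

    // Hypothetical helpers: fetchFromDatabase() bypasses the cache,
    // fetchFromCache() goes through the cache layer.
    $start = microtime(true);
    $result = fetchFromDatabase($id);
    $withoutCache = microtime(true) - $start;

    $start = microtime(true);
    $result = fetchFromCache($id);
    $withCache = microtime(true) - $start;

    printf("Without cache: %.4f s, with cache: %.4f s\n", $withoutCache, $withCache);

In a real application you would aggregate many of these measurements with your monitoring tools instead of relying on a single request.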

Handle cache misses

A cache miss occurs when a requested item is not stored in your cache and the application needs to get the data from your service/application. Ensure that your code can handle cache misses and the subsequent cache updates. To keep track of your cache miss rate, you can use any monitoring software or even a logging system.
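As a reference, a typical cache-aside lookup with phpredis could look like the following sketch, where getUserFromDatabase() is a hypothetical helper that fetches the data from the original source:

    // Try the cache first; phpredis returns false when the key does not exist.
    $cached = $redis->get('user:' . $id);
    if ($cached === false) {
        // Cache miss: fall back to the database and update the cache entry.
        $user = getUserFromDatabase($id); // hypothetical helper
        $redis->setex('user:' . $id, 3600, serialize($user));
    } else {
        $user = unserialize($cached);
    }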

Group requests

Whenever possible, try to group your cache requests. Imagine that your frontend needs five different elements from your cache server to render a page. Instead of making five separate calls, you can group them into a single request, saving you some round trips.

Imagine that you are using Redis as your cache layer and want to store some values under the foo and bar keys. Take a look at the following code:

    // Two separate commands mean two round trips to the Redis server.
    $redis->set('foo', 'my_value');
    /** Some code **/
    $redis->set('bar', 'another_value');

Instead of doing that, you can perform both sets with a single command:

    // A single MSET command stores both values in one round trip.
    $redis->mSet(['foo' => 'my_value', 'bar' => 'another_value']);

The preceding example performs both sets in a single round trip to the server, saving you some time and improving the performance of your application.
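The same idea applies to reads. For example, with phpredis you can fetch both keys in a single call instead of issuing two separate get commands:

    // One round trip returns both values, in the same order as the keys.
    $values = $redis->mGet(['foo', 'bar']);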

Size of elements to be stored in cache

It is more efficient to store a few large items in your cache than to store many small ones. If you start caching loads of small items, the overall performance will decrease, because the per-item overhead (serialization time, the time spent committing each item to the cache, and the capacity used) grows with the number of items.
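For example, instead of storing every field of a user profile under its own key, you can serialize the whole structure and store it as a single item. The following is a small sketch, assuming a connected phpredis client:

    // One larger item instead of many small keys.
    $profile = ['name' => 'John', 'email' => 'john@example.com', 'plan' => 'pro'];
    $redis->set('profile:42', json_encode($profile));

    // Later, a single read recovers the whole structure.
    $profile = json_decode($redis->get('profile:42'), true);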

Monitor your cache

If you have decided to add a cache layer, at least keep it monitored. Keeping some stats about your cache will tell you how well it is doing (the cache hit ratio) and whether it is reaching its capacity limit. Most cache software is stable and robust, but that does not mean you will not run into problems if you leave it unmanaged.
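With Redis, for instance, you can compute the cache hit ratio from the server statistics. The following is a small sketch, assuming a connected phpredis client:

    // keyspace_hits and keyspace_misses come from the INFO stats section.
    $stats = $redis->info('stats');
    $hits = $stats['keyspace_hits'];
    $misses = $stats['keyspace_misses'];
    $hitRatio = $hits / max(1, $hits + $misses);
    printf("Cache hit ratio: %.2f%%\n", $hitRatio * 100);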

Choose your cache algorithm carefully

Most cache engines support different algorithms, for example, eviction policies such as LRU or LFU. Each algorithm has its benefits and its own problems. Our recommendation is to analyze your requirements carefully and not rely on the default mode of your chosen cache engine until you are sure that it is the correct algorithm for your use case.
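In Redis, for example, the eviction behaviour is controlled by the maxmemory-policy setting, which you can inspect or change at runtime. The following is a small sketch, assuming a connected phpredis client:

    // Check the current eviction policy (Redis defaults to noeviction).
    $current = $redis->config('GET', 'maxmemory-policy');

    // Switch to LRU eviction across all keys, if that matches your use case.
    $redis->config('SET', 'maxmemory-policy', 'allkeys-lru');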
