Redis Caching vs Cloudflare Page Rules — NodeJS

Compare Redis Caching and Cloudflare Page Rules in NodeJS applications. Learn how Redis, an in-memory data store, and Cloudflare's optimization features impact site performance, speed, and server load. Discover the best practices for caching in NodeJS.

TL;DR on caching

Caching is basically a way to take the heavy load off of your main database, or off of your file server in the case of static media like images and videos, and hand it over to another, preferably cheaper and faster, platform such as Cloudflare, Redis, or Memcached.

Why cache?

Suppose we have a blog that is being accessed a million times every minute; that would mean roughly 16,667 read operations every second. That is a very unrealistic situation, but just to put things into scale: any free or small self-hosted or managed database would struggle, if not crash, under a load like that, and we want to keep it from crashing without burning a hole in our pocket. This is where caching comes in.
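
To put the idea into code, here is a minimal cache-aside sketch; the Map stands in for Redis or Cloudflare, and fetchPostFromDatabase stands in for the expensive database read you are trying to protect:

// Cache-aside in a nutshell: try the cache first, fall back to the database,
// then store the result so the next million reads never touch the database.
const cache = new Map(); // placeholder for Redis / the CDN edge

async function fetchPostFromDatabase(slug) {
    // placeholder for the expensive read you are trying to protect
    return { slug, title: "Hello world", views: 0 };
}

async function getPost(slug) {
    if (cache.has(slug)) return cache.get(slug); // cache hit: the database never sees this read

    const post = await fetchPostFromDatabase(slug); // cache miss: one real database read
    cache.set(slug, post);                          // store it so the next requests are cheap
    return post;
}

getPost("my-first-post").then(console.log);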

This article will cover the Redis implementation in NodeJS specifically; for Cloudflare, you can follow along with the dashboard steps later in the article.

In my career working with NodeJS, I have come across two main ways to cache data, both cheap and simple:

Redis and Cloudflare Page Rules.

1. Redis

Redis has multiple use cases: an in-memory database for caching, and a persistent general-purpose database (contrary to popular belief, Redis does support data persistence). For the purpose of this article, we will concentrate on the in-memory caching part.

Q. Why is Redis's in-memory database faster than other mainstream databases like Postgres and MongoDB?
Ans. An in-memory database (IMDB) relies on the main memory of the computer (RAM) for data storage, while the others primarily use disk storage, which is quite a bit slower than RAM.

Q. When should you use Redis?
Ans. One would want to use Redis when they want to do caching server-side and want full control over the cache, i.e. perform CRUD operations on it.
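
For a feel of what that control looks like, here is a small sketch using the node-redis v3 callback API (the same style used in the snippets below) against a local Redis server; the key and values are just examples:

const redis = require("redis");
const client = redis.createClient(); // node-redis v3, defaults to localhost:6379

// Create / Update: store a value with a TTL of one hour (in seconds)
client.setex("greeting", 3600, JSON.stringify({ hello: "world" }));

// Read
client.get("greeting", (err, value) => {
    if (err) throw err;
    console.log(JSON.parse(value)); // { hello: 'world' }
});

// Delete
client.del("greeting", (err, removed) => {
    if (err) throw err;
    console.log(`removed ${removed} key(s)`); // 1 if the key existed
});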

A few numbers:

Here is a Postman request to my Node + Express API, which reads data from Firebase Firestore.

Without Redis

Request to Direct IP without Redis Cache (464ms response time)

With Redis

Request to Direct IP *with* Redis Cache (157ms response time)

We can observe a drastic reduction in response time with Redis. Here is how it's implemented.
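
The route below assumes an Express app, a connected node-redis v3 client, and a small firebaseService wrapper around Firestore. A minimal version of that setup might look like this; the firebaseService path is a placeholder for whatever module returns your fresh data:

const express = require("express");
const redis = require("redis");

const app = express();
const client = redis.createClient(); // node-redis v3 (callback API), localhost:6379 by default

// placeholder: any module that fetches fresh data from your persistent store
const firebaseService = require("./services/firebaseService");

client.on("error", (err) => console.error("Redis error:", err));

app.listen(3000, () => console.log("API listening on :3000"));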

app.get("/sileo-depiction/:package", async (req, res) => {
    const dep_package = req.params.package; // package name
    try {
        const redisDepKey = `${dep_package}:depiction`; // redis key

        return client.get(redisDepKey, async (err, cachedData) => { // check if cached
            if (cachedData) { // if cached
                const parsedData = JSON.parse(depiction); // parse cached data
                res.status(200).send({ // send cached data
                    httpStatus: 200,
                    success: true,
                    success_binary: 1,
                    data: parsedData,
                    message: "Successfully retrieved data from cache",
                    meta_data: {
                        time_epoch = Math.floor(Date.now() / 1000),
                        time_iso = new Date().toISOString(),
                    }
                });
            } else { // if not cached
                const firestoreQueryNonce = await firebaseService.getRelease(dep_package); // get fresh data from firestore or other persistent database
                client.setex(redisDepKey, 259200, JSON.stringify(firestoreQueryNonce)); // cache data for 3 days || redis always store data in string format
                res.status(200).send({ // send fresh data
                    httpStatus: 200,
                    success: true,
                    success_binary: 1,
                    data: firestoreQueryNonce,
                    message: "Successfully retrieved data from database",
                    meta_data: {
                        time_epoch = Math.floor(Date.now() / 1000),
                        time_iso = new Date().toISOString(),
                    }
                });
            }
        });
    } catch (error) { // if error
        console.log(error);
        res.status(500).send({ // send error acknowledgement
            httpStatus: 500,
            success: false,
            success_binary: 0,
            data: null,
            message: "could not get data",
            meta_data: {
                time_epoch = Math.floor(Date.now() / 1000),
                time_iso = new Date().toISOString(),
            }
        })
    }
});

Revalidation

A good caching system always accounts for the validity of the data it holds, i.e. whether it needs to be updated/revalidated. There are two things to take care of: stale data and invalid data. Stale data simply means the data in our cache is old, and it would be prudent to delete it and cache it again in case something has changed. Invalid data means the data in our cache no longer matches the data in our main/persistent database and needs to be updated. To put it simply, with stale data we are unaware of the state of the data; with invalid data we know it has changed in one way or another.
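
In Redis terms, staleness is usually handled with a TTL (the key quietly expires on its own), while invalidation is handled explicitly by deleting or overwriting the key the moment you know the source of truth changed. Here is a rough sketch reusing this article's client and key naming; the helper names are mine:

// Staleness: give the key a TTL so it silently expires after 3 days (in seconds).
function cacheDepiction(dep_package, freshData) {
    client.setex(`${dep_package}:depiction`, 259200, JSON.stringify(freshData));
}

// Invalidation: we *know* the source of truth changed, so drop the key right away.
// The next read will miss the cache and re-populate it with the new value.
function invalidateDepiction(dep_package) {
    client.del(`${dep_package}:depiction`, (err, removed) => {
        if (err) return console.error(err);
        console.log(`invalidated ${removed} cached depiction(s) for ${dep_package}`);
    });
}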

Here is how one would delete data from Redis.

app.get("/sileo-depiction-revalidate/:package", async (req, res) => {
    const dep_package = req.params.package; // package name
    try {
        const redisDepKey = `${dep_package}:depiction`; // redis key

        return client.del(redisDepKey, async (err, success) => { // check if cached
            if (success == 1) { // if success
                res.status(200).send({ // send success acknowledgement
                    httpStatus: 200,
                    success: true,
                    success_binary: 1,
                    message: "Successfully deleted cached depiction",
                    meta_data: {
                        time_epoch = Math.floor(Date.now() / 1000),
                        time_iso = new Date().toISOString(),
                    }
                });
            } else { // if failure
                res.status(200).send({ // send failure acknowledgement
                    httpStatus: 200,
                    success: true,
                    success_binary: 1,
                    data: firestoreQueryNonce,
                    message: "could not delete cached depiction",
                    meta_data: {
                        time_epoch = Math.floor(Date.now() / 1000),
                        time_iso = new Date().toISOString(),
                    }
                });
            }
        });
    } catch (error) { // if error
        console.log(error);
        res.status(500).send({ // send error acknowledgement
            httpStatus: 500,
            success: false,
            success_binary: 0,
            data: null,
            message: "could not delete data",
            meta_data: {
                time_epoch = Math.floor(Date.now() / 1000),
                time_iso = new Date().toISOString(),
            }
        })
    }
});

Deleting a key from Redis, which is a key-value store (NodeJS code)

After you delete the cached data, you would ideally wait until someone requests it again before re-caching it: that saves the memory needed to store data nobody is asking for, and it also means the cache holds the latest possible data at the time of the request.
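
If you would rather not call a separate revalidation route by hand, the same delete can be hooked into whatever endpoint updates the underlying record, so the cache is invalidated the moment the data changes. Here is a sketch under the same assumptions as above; firebaseService.updateRelease is a hypothetical helper, not something from the earlier code:

app.post("/sileo-depiction/:package", async (req, res) => {
    const dep_package = req.params.package;
    try {
        // 1. write the new depiction to the persistent database
        //    (hypothetical helper; assumes app.use(express.json()) so req.body is parsed)
        await firebaseService.updateRelease(dep_package, req.body);

        // 2. drop the cached copy; the next GET will re-cache the fresh version lazily
        client.del(`${dep_package}:depiction`);

        res.status(200).send({ success: true, message: "updated and cache invalidated" });
    } catch (error) {
        console.log(error);
        res.status(500).send({ success: false, message: "could not update data" });
    }
});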

2. Cloudflare Page Rules

Cloudflare is a popular web infrastructure and website security company that provides a content delivery network (CDN) and DDoS mitigation services.

We are gonna focus on the content delivery part.

Q. Why is Cloudflare a better option than Redis?
Ans. When using Cloudflare, the stress of writing code, managing resources, and handling dependencies is taken away from you, and it is usually much faster than Redis caching because it is a distributed network of servers that cache and deliver the data close to your users, all but eliminating network lag.

Q. When should you use Cloudflare?
Ans. One would want to use Cloudflare when they want to cache static media, like images and videos, or data that doesn't change for a long period of time.
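
Cloudflare also generally honours standard Cache-Control headers from your origin for content it considers cacheable, so it is worth sending sensible hints alongside any Page Rules. Here is a small illustrative Express sketch; the route, folder, and TTL values are arbitrary:

// s-maxage applies to shared caches (like Cloudflare's edge), max-age to the browser.
app.get("/icons/:file", (req, res) => {
    res.set("Cache-Control", "public, max-age=1800, s-maxage=259200"); // 30 min browser, 3 days edge
    res.sendFile(req.params.file, { root: "./public/icons" }); // example static folder
});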

A few numbers:

Without Cloudflare Caching

Standard Request to API without Cloudflare Cache

With Cloudflare Caching

Standard Request to API *with* Cloudflare Cache

We can see there is a ~150ms difference, and it stays quite consistent across your user base, since with Cloudflare caching the network lag (relative to the distance between client and server) is next to none.

Here is how to do it.

I’m gonna assume here that you are already using Cloudflare for your web app.

1. Log into your Cloudflare account and select the domain you want to enable caching on.

2. Go into the Rules tab.

3. Press the Create Page Rule button.

4. Fill in the form.

TTL stands for Time To Live.

You can set the Edge Cache TTL and Browser Cache TTL as per your preference.

My recommendation is to keep the Browser Cache TTL at the minimum, because you cannot control the client-side cache.

5. Save and Deploy.

And there you go, that's how you can cache your routes with Cloudflare Page Rules. They also work great with static content like images and videos.
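
If you would rather script this than click through the dashboard, Cloudflare also exposes Page Rules through its REST API. Here is a rough NodeJS sketch using the global fetch available in Node 18+; the zone ID, API token, URL pattern, and TTL values are placeholders, so double-check the payload against Cloudflare's current API docs before relying on it:

async function createCachePageRule() {
    const zoneId = process.env.CLOUDFLARE_ZONE_ID;  // placeholder: your zone ID
    const token = process.env.CLOUDFLARE_API_TOKEN; // placeholder: token with Page Rules edit permission

    const response = await fetch(`https://api.cloudflare.com/client/v4/zones/${zoneId}/pagerules`, {
        method: "POST",
        headers: {
            "Authorization": `Bearer ${token}`,
            "Content-Type": "application/json",
        },
        body: JSON.stringify({
            status: "active",
            targets: [
                { target: "url", constraint: { operator: "matches", value: "repo.plutorepo.com/sileo-depiction/*" } },
            ],
            actions: [
                { id: "cache_level", value: "cache_everything" }, // cache even non-static responses
                { id: "edge_cache_ttl", value: 259200 },          // 3 days at the edge
                { id: "browser_cache_ttl", value: 1800 },         // keep the browser TTL short
            ],
        }),
    });

    console.log(await response.json());
}

createCachePageRule();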

Revalidation

Now if you want to revalidate, i.e. delete the cached data, that's quite simple as well.

1. Log into your Cloudflare account and select the domain for which you want to clear the cache.

2. Go into the sidebar, hover over the Cache button, and select Configuration from the drop-down.

3. Click the Custom Purge button.

4. Since we cached https://repo.plutorepo.com/sileo-depiction/* where the ‘*’ loosely means everything, fill in the same link with the ‘*’ to delete the whole cache, or with a query parameter to delete a specific cached response.

5. Press the Purge Button.

After you delete the cached data, it will automatically get cached again the next time someone requests it.
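
The same purge can be scripted against Cloudflare's API. Again, this is a sketch with placeholder credentials; wildcard/prefix purges may not be available on every plan, so this example purges an exact URL:

async function purgeDepictionCache(dep_package) {
    const zoneId = process.env.CLOUDFLARE_ZONE_ID;  // placeholder: your zone ID
    const token = process.env.CLOUDFLARE_API_TOKEN; // placeholder: token with cache purge permission

    const response = await fetch(`https://api.cloudflare.com/client/v4/zones/${zoneId}/purge_cache`, {
        method: "POST",
        headers: {
            "Authorization": `Bearer ${token}`,
            "Content-Type": "application/json",
        },
        body: JSON.stringify({
            files: [`https://repo.plutorepo.com/sileo-depiction/${dep_package}`], // exact URL to purge
        }),
    });

    console.log(await response.json());
}

purgeDepictionCache("com.example.package"); // hypothetical package name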

Conclusion

Which one should you use?

Well, it's hard to give a one-size-fits-all answer, but here are the things you should keep in mind when deciding between the two.

Q. Are you gonna store static media like images and videos?
Ans. If your answer is “yes”, then Cloudflare page rules are a perfect match.

Q. Are you only gonna store JSON data or other data that can be stored in a manageable string format?
Ans. If your answer is “yes” then either Redis or Cloudflare will get your job done.

Q. Do you want full control over how the cache is created, read, updated, deleted (CRUD)?
Ans. If your answer is “yes”, you would want to use Redis.

Ideally, any API at scale will probably use both in tandem: you would update your cached data in Redis, and eventually Cloudflare will revalidate from your cache, reducing thousands of server calls to a few hundred. And since almost every web app has to deliver some sort of static media files, be it icons, images, or videos, Cloudflare Page Rules would be needed to take care of those.

Thank you for reading.

If you want to see more articles from me, consider following me.

Here is my Twitter in case you want to get in contact with me.