The write-behind pattern
A comprehensive overview of the write-behind pattern with Redis.
Welcome back! This is part 5 of the "caching patterns with Redis" series. Click here to get to the fourth part.
Assume our posts application has reached massive scale and is handling a lot of traffic. Our database is struggling to keep up with the sheer volume of comment writes against our posts. How can we improve application performance without scaling our hardware?
The write-behind pattern becomes our friend in this scenario. It works well for write-heavy workloads that can tolerate eventual consistency. Write-behind caching updates the cache first, and the database is updated later, after a set period.
To see how this pattern works, consider the flow below:
1. Write the data into the cache.
2. Return a success acknowledgement.
3. After some time, save the data in the DB.
This approach has the following advantages:
It frees the database from the heavy write load on each request, especially during peak hours.
It can reduce write requests to the db by consolidating updates, a process called conflation. For example, if a value is changed from 1 to 2 in the cache, and later from 2 to 3, the database is only updated once to change the value from 1 to 3 (see the sketch below).
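A Redis list (which we use for comments below) keeps every write, but conflation comes naturally if you buffer updates in a Redis hash keyed by the record's id: later writes to the same field simply overwrite earlier ones, so the flush job only ever sees the latest value. Here is a minimal sketch, where the "post_likes" key and "post:42" field are made up for illustration and are not part of the posts application:

// Conflation sketch – the "post_likes" key and "post:42" field are illustrative.
await redis.hset("post_likes", "post:42", "1");
await redis.hset("post_likes", "post:42", "2");
await redis.hset("post_likes", "post:42", "3"); // overwrites the earlier values

// When the write-behind job runs, it only sees the latest value per field,
// so the database gets a single update from 1 to 3.
const latest = await redis.hgetall("post_likes"); // { "post:42": "3" }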
However, a disadvantage of this approach is that if the cache fails, any data that has not yet been saved to the DB is lost.
Let's see how we can use the write-behind pattern in our posts application. For incoming comments, we temporarily save them in a list in the cache, and then, every minute, we bulk-write all of them to the db.
// Router
import { Router } from "express";
import { createComment } from "./comments.service"; // import path is illustrative

const router = Router();

// The service only writes to Redis, so the request returns as soon as the
// comment has been queued in the cache.
router.post("/", async (req, res, next) => {
  try {
    const newComment = await createComment(req.body, req.user.id);
    res.json(newComment);
  } catch (error) {
    next(error);
  }
});
// save the comment in cache
import { v4 as uuidv4 } from "uuid";
import { redis } from "./redis"; // shared Redis client; import path is illustrative

export const createComment = async (
  { comment, postId }: ICreateComment,
  userId: number
): Promise<IComment> => {
  // Give every comment a unique uuid so it can be tracked before it has a database id
  const newComment: IComment = {
    uuid: uuidv4(),
    comment,
    postId,
    authorId: userId,
  };
  // Add the comment to the tail of the Redis list that acts as our write buffer
  await redis.rpush("comments", JSON.stringify(newComment));
  return newComment;
};
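The ICreateComment and IComment interfaces referenced above are not shown in this post; a minimal sketch of what they might look like for this example follows, though the exact shapes in your project may differ.

// A minimal sketch of the comment types used above – adjust to your schema.
export interface ICreateComment {
  comment: string;
  postId: number;
}

export interface IComment extends ICreateComment {
  uuid: string;
  authorId: number;
}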
// Cron to run the bulk-write job every minute
import cron from "node-cron";
import { writeBehindComments } from "./writeBehindComments"; // import path is illustrative

export const bulkWriteComments = cron.schedule(
  "* * * * *", // every minute
  async () => {
    await writeBehindComments();
  },
  {
    name: "bulkWriteComments",
    scheduled: true,
    runOnInit: true,
  }
);
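Because the task is created with scheduled: true, it starts as soon as this module is loaded, so registering the job can be as simple as importing the file from the application's entry point. The file names below are an assumption:

// index.ts (app entry point) – file names are illustrative.
// Importing the module creates and starts the cron task.
import "./bulkWriteComments";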
// Write in bulk from Cache
// TODO: Perform error handling (one possible approach is sketched below).
export const writeBehindComments = async () => {
  // Get the length of the comments list
  const length = await redis.llen("comments");
  if (length === 0) {
    return;
  }
  // Read every item counted in the length above; comments pushed after this
  // point are left in the list for the next run.
  const elemStrings = await redis.lrange("comments", 0, length - 1);
  const elements: IComment[] = elemStrings.map(
    (item) => JSON.parse(item) as IComment
  );
  // Save the elements in the db in a single bulk insert.
  await db.comments.createMany({ data: elements });
  // Remove the processed elements from Redis only after they are safely in
  // the db, keeping anything that arrived while we were writing.
  await redis.ltrim("comments", length, -1);
};
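The TODO above is left open here. One possible approach, assuming a failed bulk insert should simply leave the comments in Redis so the next cron run can retry them, is to wrap the database write in a try/catch and only trim the list on success:

// Error-handling sketch – retries are left to the next cron run.
export const writeBehindComments = async () => {
  const length = await redis.llen("comments");
  if (length === 0) {
    return;
  }
  const elemStrings = await redis.lrange("comments", 0, length - 1);
  const elements = elemStrings.map((item) => JSON.parse(item) as IComment);

  try {
    await db.comments.createMany({ data: elements });
  } catch (error) {
    // The comments stay in the Redis list, so the next run retries them.
    console.error("write-behind flush failed, will retry on the next run", error);
    return;
  }

  // Only drop the processed entries once the bulk insert has succeeded.
  await redis.ltrim("comments", length, -1);
};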
Thank you for reading this far! I hope you have enjoyed the series and "up'd" your skills🚀. Feel free to comment with your thoughts and improvements😊.