Description
I am trying to retrieve posts from my MongoDB database, but it was taking a long time, so I implemented Redis caching in my API. Now I am facing a problem: the API returns the same posts from the cache every time I hit it. How can I get new and different posts from the Redis cache on each request? Instagram uses Redis, yet it shows me new and different posts every time I scroll, even when I scroll quickly. How do I balance caching and freshness in my case?
What would be a better approach to solving this in the API? (One rough idea is sketched after the code below.)
router.get('/posts', async (req, res) => {
  try {
    const page = Number(req.query.page) || 1;
    const limit = Number(req.query.limit) || 50;
    const skip = (page - 1) * limit;

    // Include both page and limit in the key so requests with different page sizes don't collide.
    const cacheKey = `posts:${page}:${limit}`;
    const cacheTTL = 60; // seconds

    const cachedData = await redisClient.json.get(cacheKey);
    if (cachedData) {
      console.log('Data fetched from Redis cache');
      return res.json(cachedData);
    }

    const result = await User.aggregate([
      { $project: { posts: 1 } },
      { $unwind: '$posts' },
      { $project: { postImage: '$posts.post', date: '$posts.date' } },
      { $sort: { date: -1 } },
      { $skip: skip },
      { $limit: limit },
    ]);

    // RedisJSON's JSON.SET has no EX option, so set the TTL separately with EXPIRE.
    await redisClient.json.set(cacheKey, '$', result);
    await redisClient.expire(cacheKey, cacheTTL);

    console.log('Data fetched from MongoDB and cached in Redis');
    res.json(result);
  } catch (err) {
    console.error(err);
    res.status(500).json({ message: 'Internal server error' });
  }
});
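For illustration, here is a rough sketch of one possible direction, not a definitive fix: cache a larger pool of recent posts under a single key, then return a random slice from that pool on each request, so repeat hits within the TTL still see different posts. It assumes the same redisClient (node-redis with RedisJSON) and User model as above; the route path, poolKey, poolSize, and poolTTL names are made up for the example.

// Sketch: serve a random slice from a cached pool of recent posts.
router.get('/posts/random', async (req, res) => {
  try {
    const limit = Number(req.query.limit) || 50;
    const poolKey = 'posts:pool'; // hypothetical key for the shared pool
    const poolSize = 500;         // how many recent posts to keep cached
    const poolTTL = 300;          // refresh the pool every 5 minutes

    let pool = await redisClient.json.get(poolKey);
    if (!pool) {
      pool = await User.aggregate([
        { $project: { posts: 1 } },
        { $unwind: '$posts' },
        { $project: { postImage: '$posts.post', date: '$posts.date' } },
        { $sort: { date: -1 } },
        { $limit: poolSize },
      ]);
      await redisClient.json.set(poolKey, '$', pool);
      await redisClient.expire(poolKey, poolTTL);
    }

    // Fisher-Yates shuffle a copy of the pool, then take the first `limit` items.
    const shuffled = [...pool];
    for (let i = shuffled.length - 1; i > 0; i--) {
      const j = Math.floor(Math.random() * (i + 1));
      [shuffled[i], shuffled[j]] = [shuffled[j], shuffled[i]];
    }
    res.json(shuffled.slice(0, limit));
  } catch (err) {
    console.error(err);
    res.status(500).json({ message: 'Internal server error' });
  }
});

The trade-off is that random sampling breaks stable pagination, and feed systems like Instagram's typically precompute a ranked, per-user feed rather than sampling at request time, so this is only a starting point.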