Design a Caching Pattern to support multiple Caching implementations
- Simple Key Value implementation
- VIEW-GET design
- Scenario 1
- Request R1 received for Input I1
- Service checks cache
- Cache HIT then respond to R1 with output O1
- Cache MISS then
- Forward request to responsible Service S1
- Retrieve output O1 from S1 and store it in the cache
- Respond to R1 with output O1
- Request R2 received for Input I1
- Cache hit occurs
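
A minimal sketch of the Scenario 1 flow, assuming an in-memory key-value store and a stand-in `fetch_from_service` callable for the responsible Service S1 (all names here are hypothetical, not part of the design above):

```python
# Hypothetical in-memory key-value cache used to illustrate the flow above.
class KeyValueCache:
    def __init__(self):
        self._store = {}

    def get(self, key):
        return self._store.get(key)        # None signals a cache MISS

    def put(self, key, value):
        self._store[key] = value


def handle_request(cache, fetch_from_service, input_i1):
    """Serve a request for input_i1: try the cache first, fall back to the service."""
    output = cache.get(input_i1)
    if output is not None:                 # Cache HIT: respond with the cached output O1
        return output
    output = fetch_from_service(input_i1)  # Cache MISS: forward to responsible Service S1
    cache.put(input_i1, output)            # Store O1 so the next request for I1 hits
    return output
```

With O1 now cached, a second request R2 for Input I1 is answered straight from the cache without touching S1.
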
- Scenario 2
- Request R1 received for Input I1
- Service checks cache
- Cache HIT then respond to R1 with output O1
- Cache MISS then
- Forward request to responsible Service S1
- Request R2 received for Input I1
- Retrieve output O1 from S1 and store it in the cache
- Respond to R1 with output O1
- Request R3 received for Input I1
- Cache hit occurs
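
Scenario 2 leaves open how R2 is served while R1's fetch from S1 is still in flight. One possible handling, purely an assumption on top of the outline, is a per-key lock so that overlapping requests for the same input share a single fetch; a sketch in Python:

```python
import threading

class CoalescingCache:
    """Key-value cache where concurrent misses for the same key share one fetch."""

    def __init__(self, fetch_from_service):
        self._store = {}
        self._locks = {}                    # per-key locks
        self._guard = threading.Lock()      # protects the _locks dict itself
        self._fetch = fetch_from_service

    def _lock_for(self, key):
        with self._guard:
            return self._locks.setdefault(key, threading.Lock())

    def get_or_fetch(self, key):
        value = self._store.get(key)
        if value is not None:               # Cache HIT (e.g. request R3)
            return value
        with self._lock_for(key):           # R2 blocks here while R1's fetch is running
            value = self._store.get(key)    # Re-check: R1 may have stored O1 meanwhile
            if value is None:
                value = self._fetch(key)    # Forward to responsible Service S1
                self._store[key] = value    # Store O1 in the cache
            return value
```

Under this assumption R2 either waits for R1's fetch or, once it completes, hits the cache; R3 then sees the plain cache hit described above.
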
- Scenario 1 for VIEW: serve the request from the cache only; the client is aware this data might be stale
- Request R1 received for Input I1
- REQUEST HANDLER forwards to Router R1
- Router R1 manages routing for VIEW vs GET requests
- Router R1 forwards to GET CACHE
- GET CACHE performs a "get from cache" and forwards output O1 to Router R2
- Router R2 manages routing for cache HIT vs cache MISS
- Cache HIT then
- forward O1 to RESPONSE HANDLER
- get cache details OC1, such as how long O1 has been cached
- Respond to R1 with O1 and OC1
- Cache MISS then split the message into M1 and M2
- For M1
- forward an empty O1 to RESPONSE HANDLER
- add a message indicating that details are being fetched from the source
- Respond to R1 with O1 and OC1
- For M2
- forward request R1 to PUT CACHE
- PUT CACHE executes R1 and puts output O2 in the cache
- forward to GET CACHE and continue from step 5
- For M1, a cache HIT now occurs and the refreshed output is returned as in the HIT branch above
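
A sketch of the VIEW path in Python, with hypothetical names standing in for REQUEST HANDLER, Router R1, GET CACHE, Router R2, PUT CACHE and EVICT CACHE; the M1/M2 split on a miss is modelled as an immediate reply plus a background refresh, which is an assumption about how the two messages are processed. The GET branch anticipates the next scenario.

```python
import threading
import time

class ViewGetCache:
    """Sketch of the VIEW-GET design: routers plus GET/PUT/EVICT cache stages."""

    def __init__(self, fetch_from_source):
        self._store = {}                    # key -> (output, cached_at timestamp)
        self._fetch = fetch_from_source

    # Router R1: route VIEW vs GET requests.
    def handle(self, request_type, key):
        if request_type == "VIEW":
            return self.view(key)
        if request_type == "GET":
            return self.get(key)
        raise ValueError(f"unknown request type: {request_type}")

    # VIEW: answer from the cache only; the client accepts possibly stale data.
    def view(self, key):
        entry = self._store.get(key)                       # GET CACHE
        if entry is not None:                              # Router R2: cache HIT
            output, cached_at = entry
            oc1 = {"cached_seconds_ago": round(time.time() - cached_at, 1)}
            return {"output": output, "cache_details": oc1}
        # Router R2: cache MISS -> split into M1 and M2
        threading.Thread(target=self._refresh, args=(key,)).start()  # M2: background refresh
        return {"output": None,                                      # M1: empty O1 for now
                "cache_details": {"message": "fetching details from source"}}

    # GET: the client wants fresh data and accepts the extra latency (next scenario).
    def get(self, key):
        self._store.pop(key, None)          # EVICT CACHE removes the entry for this input
        return self.view(key)               # Convert the GET to a VIEW and re-route

    # PUT CACHE: execute the request against the source and store its output.
    def _refresh(self, key):
        self._store[key] = (self._fetch(key), time.time())
```

Once the refresh has completed, a later VIEW for the same input takes the HIT branch and returns the fresh output together with its cache age (OC1).
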
- Scenario 1 for GET: serve the request from the source only; the client is aware this is time-consuming
- Request R1 received for Input I1
- REQUEST HANDLER forwards to Router R1
- Router R1 manages routing for VIEW vs GET requests
- Router R1 forwards to EVICT CACHE
- EVICT CACHE removes the I1 entry from the cache
- Converts the GET to a VIEW and forwards it to Router R1
- Continue processing from step 3 in Scenario 1 for VIEW
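
Continuing the hypothetical `ViewGetCache` sketch from the VIEW scenario, a short driver for the GET path: the entry for I1 is evicted, the request is converted into a VIEW, and it therefore takes the MISS branch before a later VIEW hits the refreshed entry. The short sleeps only give the background refresh time to finish and are not part of the design.

```python
import time

def fetch_from_source(key):
    time.sleep(0.2)                       # stand-in for the slow responsible Service S1
    return f"fresh-output-for-{key}"

cache = ViewGetCache(fetch_from_source)   # class from the VIEW scenario sketch above

print(cache.handle("VIEW", "I1"))         # MISS: empty output plus "fetching details from source"
time.sleep(0.5)                           # let the background refresh (M2 / PUT CACHE) finish
print(cache.handle("VIEW", "I1"))         # HIT: output plus OC1 (how long it has been cached)

print(cache.handle("GET", "I1"))          # GET: evicts I1, converts to VIEW, so it misses again
time.sleep(0.5)
print(cache.handle("VIEW", "I1"))         # a subsequent VIEW hits the refreshed entry
```
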