Types of Cache

In general, there are four types of cache: application server cache, distributed cache, global cache, and Content Delivery Network (CDN).

5.1. Application Server Cache:

In the “How does Cache work?” section we discussed how an application server cache can be added to a web application.

  • A cache can be added in memory alongside the application server.
  • The response to a user’s request is stored in this cache, and whenever the same request comes again, it is returned from the cache.
  • For a new request, the data is fetched from disk and then returned.
  • Once the data has been fetched from disk, it is also stored in the cache so the next identical request can be served from memory (a minimal sketch follows this list).
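
To make the flow above concrete, here is a small sketch of an in-memory application server cache. The dictionary-based cache and the fetch_from_disk helper are illustrative assumptions, not part of any specific framework.

```python
# Minimal sketch of an application-server in-memory cache (illustrative only).
# fetch_from_disk is a hypothetical stand-in for the slower backing store.

cache = {}  # lives in the application server's memory


def fetch_from_disk(key):
    # Placeholder for a real disk or database read.
    return f"value-for-{key}"


def handle_request(key):
    if key in cache:                  # cache hit: serve straight from memory
        return cache[key]
    value = fetch_from_disk(key)      # cache miss: go to the slower store
    cache[key] = value                # keep it for the next identical request
    return value


print(handle_request("user:42"))  # miss: fetched from "disk", then cached
print(handle_request("user:42"))  # hit: served from the in-memory cache
```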

Note: When you place your cache in memory, the cache uses up part of the server’s memory. If the amount of data you are working with is really small, then you can keep the cache in memory.

Drawbacks of Application Server Cache:

  • The problem arises when you need to scale your system. You add multiple servers to your web application (because one node cannot handle a large volume of requests) and you have a load balancer that sends requests to any node.
  • In this scenario, you’ll end up with a lot of cache misses because each node is unaware of requests already cached on other nodes.
  • To overcome this problem we have two choices: distributed cache and global cache. Let’s discuss them.

5.2. Distributed Cache:

In a distributed cache, each node holds a part of the whole cache space, and a consistent hashing function is used to route each request to the node where the cached data can be found. Suppose we have 10 nodes in a distributed system and a load balancer routes the requests. Then:

  • Each node holds a small part of the cached data.
  • The cache space is divided up using a consistent hashing function, so each request can be routed to the node where the cached data lives. If a requesting node is looking for a certain piece of data, it can quickly determine where to look within the distributed cache to check if the data is available (see the sketch after this list).
  • We can easily increase the cache memory by simply adding a new node to the request pool.
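
Below is a rough sketch of how consistent hashing can map cache keys to the 10 nodes. The node names, the number of virtual nodes, and the use of MD5 are illustrative choices; real systems vary.

```python
import hashlib
from bisect import bisect_right

# Rough sketch of consistent hashing for a 10-node distributed cache.
# Node names, virtual-node count, and MD5 are illustrative assumptions.


def _hash(value: str) -> int:
    return int(hashlib.md5(value.encode()).hexdigest(), 16)


class ConsistentHashRing:
    def __init__(self, nodes, vnodes=100):
        # Place several virtual points per node on the ring for better balance.
        self.ring = sorted(
            (_hash(f"{node}#{i}"), node) for node in nodes for i in range(vnodes)
        )
        self.points = [h for h, _ in self.ring]

    def node_for(self, key: str) -> str:
        # First point on the ring at or after the key's hash, wrapping around.
        idx = bisect_right(self.points, _hash(key)) % len(self.ring)
        return self.ring[idx][1]


ring = ConsistentHashRing([f"node-{i}" for i in range(10)])
print(ring.node_for("user:42"))  # the node that should hold this key's cached data
```

Because only the keys near the affected ring positions move when the node set changes, nodes can be added or removed without invalidating most of the cache.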

5.3. Global Cache:

As the name suggests, you will have a single cache space and all the nodes use this single space. Every request goes to this single cache space. There are two kinds of global cache:

  • First, when a request is not found in the global cache, it is the responsibility of the cache itself to fetch the missing piece of data from the underlying store (database, disk, etc.).
  • Second, if the cache does not find the data, the requesting node communicates directly with the database or server to fetch the requested data. Both variants are sketched below.
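
A toy sketch of the two behaviours, assuming a shared global_cache dictionary and a hypothetical read_from_db helper (these roughly correspond to what are often called read-through and cache-aside patterns):

```python
# Toy contrast between the two global-cache behaviours described above.
# global_cache and read_from_db are hypothetical placeholders.

global_cache = {}


def read_from_db(key):
    # Placeholder for the underlying store (database, disk, etc.).
    return f"db-value-for-{key}"


# Variant 1: the cache layer itself fills in missing data on a miss.
def read_through(key):
    if key not in global_cache:
        global_cache[key] = read_from_db(key)  # cache fetches from the store
    return global_cache[key]


# Variant 2: the requesting node handles the miss and populates the cache.
def cache_aside(key):
    value = global_cache.get(key)
    if value is None:
        value = read_from_db(key)    # node talks to the DB directly
        global_cache[key] = value
    return value
```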

5.4. Content Delivery Network (CDN):

A CDN is essentially a group of servers that are strategically placed across the globe with the purpose of accelerating the delivery of web content. A CDN:

  • Manages servers that are geographically distributed over different locations.
  • Stores the web content in its servers.
  • Attempts to direct each user to a server that is part of the CDN so as to deliver content quickly.

A CDN is used when a website serves a large amount of static content: HTML files, CSS files, JavaScript files, pictures, videos, etc. First, the request asks the CDN for the data; if the CDN has it, the data is returned. If not, the CDN queries the backend servers, returns the data, and caches it locally for the next request.
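
The lookup flow is the same cache-hit/cache-miss idea, just spread across geographic edge locations. A small sketch, with made-up region names and a placeholder origin fetch:

```python
# Sketch of the CDN flow above: each edge location keeps its own cache of static
# assets and falls back to the origin servers on a miss. Names are illustrative.

edge_caches = {"us-east": {}, "eu-west": {}, "ap-south": {}}


def fetch_from_origin(path):
    # Placeholder for the website's backend (origin) servers.
    return f"<contents of {path}>"


def serve_from_cdn(user_region, path):
    cache = edge_caches[user_region]           # user is directed to a nearby edge server
    if path not in cache:
        cache[path] = fetch_from_origin(path)  # miss: pull from origin, cache locally
    return cache[path]


print(serve_from_cdn("eu-west", "/static/app.css"))  # first request hits the origin
print(serve_from_cdn("eu-west", "/static/app.css"))  # second request is served from the edge
```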

Caching – System Design Concept

Caching is a system design concept that involves storing frequently accessed data in a location that is easily and quickly accessible. The purpose of caching is to improve the performance and efficiency of a system by reducing the amount of time it takes to access frequently accessed data.

Important Topics for Caching in System Design

  • What is Caching
  • How Does Cache Work?
  • Where Cache can be added?
  • Key points to understand Caching
  • Types of Cache
  • Applications of Caching
  • What are the Advantages of using Caching?
  • What are the Disadvantages of using Caching?
  • Cache Invalidation Strategies
  • Eviction Policies of Caching

1. What is Caching

...

2. How Does Cache Work?

...

3. Where Cache Can be Added?

Caching is used in almost every layer of computing....

4. Key points to understand Caching

Caching can be used in a variety of different systems, including web applications, databases, and operating systems. In each case, caching works by storing data that is frequently accessed in a location that is closer to the user or application. This can include storing data in memory or on a local hard drive....

5. Types of Cache

In general, there are four types of cache....

6. Applications of Caching

Facebook, Instagram, Amazon, Flipkart… these are favorite applications for a lot of people, and most probably they are among the most frequently visited websites on your list....

7. What are the Advantages of using Caching?

Caching optimizes resource usage, reduces server loads, and enhances overall scalability, making it a valuable tool in software development....

8. What are the Disadvantages of using Caching?

Despite its advantages, caching also comes with drawbacks; some of them are:...

9. Cache Invalidation Strategies

Cache invalidation is crucial in systems that use caching to enhance performance. When data is cached, it’s stored temporarily for quicker access. However, if the original data changes, the cached version becomes outdated. Cache invalidation mechanisms ensure that outdated entries are refreshed or removed, guaranteeing that users receive up-to-date information....
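
As one illustration, a time-to-live (TTL) scheme plus explicit invalidation on writes might look like the sketch below; the 60-second TTL and the function names are arbitrary choices for the example.

```python
import time

# Toy sketch of two common invalidation ideas: TTL expiry and explicit
# invalidation when the source data changes. Values here are arbitrary.

TTL_SECONDS = 60
cache = {}  # key -> (value, expires_at)


def put(key, value):
    cache[key] = (value, time.time() + TTL_SECONDS)


def get(key):
    entry = cache.get(key)
    if entry is None or time.time() > entry[1]:
        cache.pop(key, None)   # stale or missing: drop it so callers re-fetch
        return None
    return entry[0]


def invalidate(key):
    # Call this when the original data is updated so readers never see the old value.
    cache.pop(key, None)
```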

10. Eviction Policies of Caching

Eviction policies are crucial in caching systems to manage limited cache space efficiently. When the cache is full and a new item needs to be stored, an eviction policy determines which existing item to remove....
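
For example, a least-recently-used (LRU) policy evicts the entry that has gone unused the longest. A minimal sketch using Python's OrderedDict (the capacity of 3 is just for illustration):

```python
from collections import OrderedDict

# Minimal LRU cache sketch: when full, evict the least recently used entry.


class LRUCache:
    def __init__(self, capacity=3):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)         # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict the least recently used entry
```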

11. Conclusion

Caching is becoming more common nowadays because it helps make things faster and saves resources. The internet is witnessing an exponential growth in content, including web pages, images, videos, and more. Caching helps reduce the load on servers by storing frequently accessed content closer to the users, leading to faster load times. Real-time applications, such as online gaming, video streaming, and collaborative tools, demand low-latency interactions. Caching helps in delivering content quickly by storing and serving frequently accessed data without the need to fetch it from the original source every time....