Introduction
Most applications depend on data, whether it comes from a database or an API. Fetching data from an API sends a network request to the API server, which returns the data in the response. These round trips take time and can increase your application's response time to users. Furthermore, most APIs limit the number of requests they will serve an application within a specific time frame, a practice known as rate limiting. To get around these problems, you can cache your data so that the application makes a single request to an API, and all subsequent data requests retrieve the data from the cache. Redis, an in-memory database that stores data in the server's memory, is a popular tool for caching data.
What is Caching?
A cache is a memory buffer used to temporarily store frequently accessed data. It improves performance because the data does not have to be retrieved again from its original source. Caching is a concept that has been applied in various areas of the computer and networking industry for quite some time, so there are different ways of implementing a cache depending on the use case. For example, a very common cache, used on almost all PCs, is the web browser cache, which stores requested objects so that the same data doesn't need to be retrieved multiple times.
Redis
Redis is an open source, in-memory data structure store used as a database, cache, message broker, and streaming engine. Redis provides data structures such as strings, hashes, lists, sets, sorted sets with range queries, bitmaps, HyperLogLogs, geospatial indexes, and streams. Redis has built-in replication, Lua scripting, LRU eviction, transactions, and different levels of on-disk persistence, and it provides high availability via Redis Sentinel and automatic partitioning with Redis Cluster.
Redis setup
To run the Redis server locally, we can use docker-compose with the following configuration:
version: '3.3'
services:
  cache:
    image: redis:6.2-alpine
    restart: always
    ports:
      - '6380:6379'
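With this file saved as docker-compose.yml, the container can be started and verified from the shell (assuming Docker and redis-cli are installed locally):

```shell
# start the redis container in the background
docker-compose up -d

# verify the server answers on the mapped host port 6380
redis-cli -p 6380 ping
# PONG
```

Note that the compose file maps container port 6379 (the Redis default) to host port 6380, which is why the client code below connects to localhost:6380.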
To use Redis in our Go code, we need to initialise the Redis client in the following way:
package providers

import (
	"context"
	"log"
	"time"

	"github.com/go-redis/redis/v8"
)

// CacheClient wraps the go-redis client together with the context
// used for all cache operations.
type CacheClient struct {
	Client *redis.Client
	Ctx    context.Context
}

// NewCacheClient creates a Redis client, verifies the connection,
// and returns it behind the CacheProvider interface.
func NewCacheClient() CacheProvider {
	client := redis.NewClient(&redis.Options{
		Addr: "localhost:6380",
	})
	err := client.Ping(context.Background()).Err()
	if err != nil {
		log.Fatalf("unable to connect redis server %v", err)
	}
	return &CacheClient{
		Client: client,
		Ctx:    context.Background(),
	}
}

// Get returns the value stored for key, or "" if the key is missing.
func (c *CacheClient) Get(key string) string {
	return c.Client.Get(c.Ctx, key).Val()
}

// Set stores value under key with the given expiry duration.
func (c *CacheClient) Set(key string, value interface{}, expiryTime time.Duration) error {
	return c.Client.Set(c.Ctx, key, value, expiryTime).Err()
}

// Delete removes key and returns the number of keys removed.
func (c *CacheClient) Delete(key string) int64 {
	return c.Client.Del(c.Ctx, key).Val()
}
In this example we create a function, NewCacheClient, to initialise the Redis client. We pass only the Addr parameter; redis.Options has many more fields we could configure, but for our use they are not required right now. After creating the Redis client, we check whether it is connected using the Ping() function.
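For reference, here is a sketch of some of the other redis.Options fields that could be configured. The timeout and pool values below are illustrative assumptions, not requirements of this setup:

```go
package main

import (
	"context"
	"log"
	"time"

	"github.com/go-redis/redis/v8"
)

func main() {
	// Illustrative values only; tune these for your own workload.
	client := redis.NewClient(&redis.Options{
		Addr:         "localhost:6380",
		Password:     "",              // no password set in the compose file above
		DB:           0,               // default logical database
		DialTimeout:  5 * time.Second, // fail fast if the server is unreachable
		ReadTimeout:  3 * time.Second,
		WriteTimeout: 3 * time.Second,
		PoolSize:     10, // maximum number of socket connections
	})
	if err := client.Ping(context.Background()).Err(); err != nil {
		log.Fatalf("unable to connect redis server %v", err)
	}
}
```

This snippet is not runnable without a Redis server listening on localhost:6380; it only shows where such settings would go.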
A cache has some predefined functionality, namely setting, getting, and deleting data, for which we have created an interface that our cache struct implements.
package providers

import "time"

type CacheProvider interface {
	Delete(key string) int64
	Set(key string, value interface{}, expiryTime time.Duration) error
	Get(key string) string
}
Startup Code
package main

import (
	"fmt"
	"time"

	"redisSetup/providers"
)

func main() {
	cacheObject := providers.NewCacheClient()

	// setting value in the cache with a one-hour expiry
	err := cacheObject.Set("Name", "Shivam", time.Hour)
	if err != nil {
		return
	}

	// getting value from the cache
	value := cacheObject.Get("Name")
	fmt.Printf("value : %v\n", value)

	// deleting value from cache
	_ = cacheObject.Delete("Name")
	value = cacheObject.Get("Name")
	fmt.Printf("value : %v", value)
}

Note that Set expects a time.Duration for the expiry, not a timestamp: passing time.Duration(time.Now().Add(time.Minute*60).Unix()) would be interpreted as nanoseconds and expire the key almost immediately, so we pass time.Hour instead.
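Putting the pieces together, a typical cache-aside flow checks the cache first and falls back to the data source on a miss. A minimal sketch, assuming the providers package above and a hypothetical fetchUserFromAPI helper standing in for a slow API call:

```go
package main

import (
	"fmt"
	"time"

	"redisSetup/providers"
)

// fetchUserFromAPI is a hypothetical stand-in for a slow API call.
func fetchUserFromAPI(id string) string {
	time.Sleep(200 * time.Millisecond) // simulate network latency
	return "user-" + id
}

// getUser implements the cache-aside pattern: read from the cache,
// and on a miss fetch from the source and populate the cache.
func getUser(cache providers.CacheProvider, id string) string {
	if v := cache.Get(id); v != "" {
		return v // cache hit: no round trip to the API
	}
	v := fetchUserFromAPI(id)
	_ = cache.Set(id, v, 10*time.Minute) // cache miss: store with a TTL
	return v
}

func main() {
	cache := providers.NewCacheClient()
	fmt.Println(getUser(cache, "42")) // first call hits the API
	fmt.Println(getUser(cache, "42")) // second call is served from Redis
}
```

This is the pattern described in the introduction: only the first request pays the network round trip, and the TTL bounds how stale the cached copy can get.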
Advantages of Cache Memory
• Cache is a smaller, faster memory that stores copies of data from frequently used main memory locations.
• A mapping function is used to map main memory blocks to locations in the cache memory.
• The primary (level 1) cache has an access time comparable to processor registers and is always placed on the processor chip.
• The secondary cache, known as the level 2 cache, is also typically placed on the processor chip, and sits between the primary cache and the rest of the memory.
• Cache memory is faster than main memory for two main reasons: it uses Static Random Access Memory (SRAM), whereas main memory uses Dynamic Random Access Memory (DRAM), and it is located on or near the processor chip.
• Cache memory stores the instructions the processor may require next, which makes retrieving that information faster than going to main memory (RAM).
• It has a lower access time than main memory.
• It stores programs that can be executed within a short period of time.
• It stores data, instructions, and information for a limited period.
• Cache memory is a high-speed semiconductor memory that speeds up the CPU.
• It makes the computer system faster overall.