
I have used Redis for caching in .NET through the IMemoryCache-style implementations, for both Redis and SQL Server backends. That was straightforward and caused no problems.

I've been trying to understand IDistributedCache with Redis, and it also looks extremely "simple" according to all the tutorials and information online.
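
For context, the setup every tutorial shows is roughly this (a minimal sketch, assuming the Microsoft.Extensions.Caching.StackExchangeRedis package; the connection string and instance name are placeholders):

```csharp
var builder = WebApplication.CreateBuilder(args);

// Register IDistributedCache backed by Redis.
builder.Services.AddStackExchangeRedisCache(options =>
{
    // "localhost:6379" is a placeholder; in the cloud this would be
    // the endpoint of whatever Redis instance you point the app at.
    options.Configuration = "localhost:6379";
    options.InstanceName = "SampleApp:"; // optional key prefix
});

var app = builder.Build();
app.Run();
```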

However, I cannot get my head around how this is actually supposed to work in the real world. In my understanding, the idea of a distributed cache is that it exists on every server where your distributed app is running. In Redis you can configure a cluster if you know the endpoints of the servers where your app is running.
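
For example, with StackExchange.Redis directly you can list the known node endpoints up front (a sketch; the host names are made up for illustration):

```csharp
using StackExchange.Redis;

// Sketch: connecting to a Redis cluster whose node endpoints are
// known ahead of time. The host names below are invented.
var config = new ConfigurationOptions();
config.EndPoints.Add("redis-node-1.example.com", 6379);
config.EndPoints.Add("redis-node-2.example.com", 6379);
config.EndPoints.Add("redis-node-3.example.com", 6379);

// The multiplexer discovers the rest of the cluster topology from
// whichever of these nodes it can reach.
var connection = await ConnectionMultiplexer.ConnectAsync(config);
var db = connection.GetDatabase();
```

This is exactly the part I don't understand in an autoscaling world: the endpoints have to be known in advance.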

So how does this work when my app is deployed to a cloud platform that automatically scales server instances up and down? How do I dynamically tell Redis to create or remove cluster nodes as the cloud service creates and destroys those instances?

Am I just overthinking this, and does the IDistributedCache implementation from StackExchange.Redis handle this automatically with no configuration? I can't seem to find any information about this online.

I tried looking on Google and Stack Exchange for how to configure Redis clusters, specifically via .NET's IDistributedCache, to automatically create new nodes when a cloud service scales an app up or down.

I found only basic tutorials on how to add StackExchange.Redis to the builder services and then access the cache (the pattern sketched below). Nothing about scaling with a cloud service.
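
To be concrete, the usage pattern those tutorials cover is essentially this (a sketch; WeatherService and GetForecastAsync are hypothetical names I made up for illustration):

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Distributed;

// Sketch of the basic tutorial pattern: inject IDistributedCache
// and read/write string values with an expiration.
public class WeatherService
{
    private readonly IDistributedCache _cache;

    public WeatherService(IDistributedCache cache) => _cache = cache;

    public async Task<string> GetForecastAsync(string city)
    {
        // Return the cached value if one exists.
        var cached = await _cache.GetStringAsync(city);
        if (cached is not null)
            return cached;

        var forecast = $"Sunny in {city}"; // stand-in for a real lookup
        await _cache.SetStringAsync(city, forecast,
            new DistributedCacheEntryOptions
            {
                AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5)
            });
        return forecast;
    }
}
```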
