We've been using the Enterprise Library for quite some time now, and until recently we hadn't had a single problem with it.
Without getting into too much detail: we have a storage medium containing roughly 10-20 million blobs, ranging in size from 1 KB to 500 MB. Since these blobs are frequently requested, we intended to cache each one, as it is requested, on the requesting server for a predetermined period of time. We originally started with the IsolatedStorage backing store, but because the files were being placed on a limited partition on the server, we ended up writing our own backing store that lets the blobs be stored in a location of our choosing. Everything worked beautifully until we noticed an out-of-memory error being logged on the servers with increasing frequency.
Keep in mind we haven't had any problems with the backing store itself; the problem appears to come from the underlying Cache class storing the cached value in memory in addition to the backing store. I originally assumed the Cache class would simply delegate to the backing store to retrieve whatever it holds, wherever that happens to be: in memory if you're using the null backing store, in the cloud, or wherever a custom backing store puts the data. After looking at the code, however, it seems to keep the values in-memory inside itself as well as in the backing store.
My question: Is there an easy way to change the Cache class so that it either doesn't store values in memory or uses a custom implementation, without having to rewrite most of the Caching Block?
If accomplishing this means rewriting most of the Caching Block, we'll probably just remove the dependency and write a custom caching provider. Has anyone on the entlib team considered changing that part of the Cache class so that it doesn't store the data itself and instead requests it from the backing store? Since in-memory caching already requires specifying the null backing store type, simply moving the in-memory caching logic into the NullBackingStore class would remove the Cache class's dependency on memory. Granted, there would be a performance hit when reading the data from the file system, but that's still faster than retrieving the blob across the network again.
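To make the idea concrete, here is a minimal sketch of the pattern we're after, written in Python purely for illustration (it's not EntLib code, and the class name DiskOnlyCache is hypothetical): the in-memory side holds only metadata (key, file path, expiry), while the blob bytes live solely in the backing store on disk and are re-read on every hit.

```python
import os
import tempfile
import time


class DiskOnlyCache:
    """Illustrative sketch: keep only metadata in memory; defer all
    value storage to the backing store (here, files on disk)."""

    def __init__(self, directory, ttl_seconds):
        self.directory = directory
        self.ttl = ttl_seconds
        self.index = {}  # key -> (path, expiry); no blob bytes held here

    def _path(self, key):
        # Hypothetical naming scheme; a real store would sanitize keys
        # properly, since hash() can collide.
        return os.path.join(self.directory, f"{hash(key) & 0xFFFFFFFF:08x}.blob")

    def add(self, key, data):
        path = self._path(key)
        with open(path, "wb") as f:
            f.write(data)  # the value goes straight to the backing store
        self.index[key] = (path, time.monotonic() + self.ttl)

    def get(self, key):
        entry = self.index.get(key)
        if entry is None:
            return None
        path, expiry = entry
        if time.monotonic() > expiry:  # expired: evict and report a miss
            self.remove(key)
            return None
        with open(path, "rb") as f:
            return f.read()  # read from the backing store on every hit

    def remove(self, key):
        path, _ = self.index.pop(key, (None, None))
        if path and os.path.exists(path):
            os.remove(path)
```

The trade-off is exactly the one mentioned above: every hit pays a file-system read, but the server's memory footprint stays proportional to the number of keys, not the size of the blobs.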
Thanks for all your hard work; it's definitely appreciated!