Ahhhhhh! - Enterprise Database Caching Stores Cache in Memory

Topics: Caching Application Block
Dec 12, 2007 at 9:58 PM
Hello:

I have encountered some behavior with Caching that I did not expect. Call me naive, but I expected that Enterprise Library Caching would go to the database each time to retrieve objects from the cache, but I have found that in my case this is not true.

The other day, one of our developers discovered that the w3wp.exe process on our DEV web server was running at about 650MB. He determined that the cause was a web service I had created that uses Enterprise Library Caching to store objects in SQL Server. After running a couple of tests, and after noticing that the memory on the server would decrease and increase at certain intervals, I came to the conclusion that the cached objects are actually kept in memory.

First of all, I ran a little test program that called my web service, which added items to the cache. Then, of course, I made sure the cached objects were there by running another request that used the key to retrieve the same object. Then (and this is the important part) I deleted the record from the database and did another request using the same key... and got the same object back. When I closed my test program, I noticed the record was back in the database.
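
The sequence boils down to something like this (calling the cache directly here rather than through the web service; the key and payload are placeholders, and a cache manager configured with the SQL Server backing store is assumed):

using System;
using Microsoft.Practices.EnterpriseLibrary.Caching;

class CacheRepro
{
    static void Main()
    {
        // Resolves the default cache manager from the application configuration.
        var cache = CacheFactory.GetCacheManager();

        // 1. Add an item: the block writes it to the backing store (SQL Server)
        //    and also keeps a copy in its in-memory store.
        cache.Add("testKey", "test payload");

        // 2. Retrieve it: this call is served from memory, not from the database.
        Console.WriteLine(cache.GetData("testKey"));

        // 3. Delete the corresponding row directly in the Caching database, then
        //    retrieve again: the in-memory copy is still returned.
        Console.WriteLine(cache.GetData("testKey"));
    }
}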

Obviously this is a part of the design of Enterprise Library Caching that I was not aware of. However, does anyone know a way to force the Caching Block to look in the database every time it needs to retrieve an object? I will probably take a performance hit in terms of time here, but any suggestions would be appreciated.

Thanks!
Sincerely,
Kevin Parkinson
Dec 14, 2007 at 12:58 PM
Hi Kevin,

The cache keeps an in-memory version of the stored elements and uses the database as a backing store; there is no built-in way to change this behavior. You could change the implementation, but it would amount to a reimplementation of the caching logic. Keep in mind that the cache is not only about storing and accessing the items; the cache also does expirations and scavenging, and that's where the in-memory representation is very useful.
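
To illustrate, here is a simplified conceptual sketch (not the block's actual code; the store interface below only loosely mirrors the block's IBackingStore) of why the in-memory dictionary is the primary store:

using System.Collections.Generic;

interface ISimpleBackingStore
{
    void Add(string key, object value);
    void Remove(string key);
    IDictionary<string, object> LoadAll();   // called once, when the cache is created
}

class InMemoryFirstCache
{
    private readonly IDictionary<string, object> items;
    private readonly ISimpleBackingStore store;

    public InMemoryFirstCache(ISimpleBackingStore store)
    {
        this.store = store;
        this.items = store.LoadAll();      // everything is hydrated into memory up front
    }

    public void Add(string key, object value)
    {
        store.Add(key, value);             // persisted so it survives AppDomain recycles
        items[key] = value;                // but memory remains the primary copy
    }

    public object GetData(string key)
    {
        object value;
        items.TryGetValue(key, out value); // never goes back to the database;
        return value;                      // expiration and scavenging policies also
                                           // run against this in-memory set
    }
}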

Fernando

(I lost my previous answer, please excuse my terseness)
Dec 17, 2007 at 7:44 PM
Hi, Kevin,

This was a design decision when the block was created. It was a way to optimize for access time. It would be a big deal to take this out of the block, as it uses the in-memory representation as its primary store and the persistence mechanisms as a way to persist across AppDomain recycles.

bab (original block author)
Aug 27, 2011 at 11:31 AM
fsimonazzi wrote:
You could change the implementation, but it would amount to a reimplementation of the caching logic.

We're using EntLib v5. Is there still no other way to achieve this than reimplementing the whole logic?

We would need a behavior like this:

  • client app instantiates the CacheManager (CM)
  • the CM doesn't deserialize all stored items from the backing store, but only a list of cached items
  • if I want to get item "myItem" from the backing store, I'd call CM.GetData("myItem") and the CM would then deserialize only this single object

Is there a way to do something like this? Otherwise a client application that makes use of the CAB will have serious memory problems after a while, as the cache may grow bigger and bigger and all those items are deserialized, which isn't necessary at all :/
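
Roughly, what we're after would behave like the sketch below: only the keys are loaded up front, and an item is fetched and deserialized only when it is requested. This is a plain read-through lookup against the cache table rather than anything the block supports; the table/column names and the BinaryFormatter serialization are placeholders, and expirations/scavenging are ignored here.

using System.Collections.Generic;
using System.Data.SqlClient;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

class LazyDatabaseCache
{
    private readonly string connectionString;

    public LazyDatabaseCache(string connectionString)
    {
        this.connectionString = connectionString;
    }

    // Only the keys are read up front; no item payloads are deserialized here.
    public IList<string> GetKeys()
    {
        var keys = new List<string>();
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("SELECT ItemKey FROM CacheItems", conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    keys.Add(reader.GetString(0));
                }
            }
        }
        return keys;
    }

    // A single item is fetched and deserialized only when it is requested.
    public object GetData(string key)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "SELECT ItemValue FROM CacheItems WHERE ItemKey = @key", conn))
        {
            cmd.Parameters.AddWithValue("@key", key);
            conn.Open();
            var bytes = cmd.ExecuteScalar() as byte[];
            if (bytes == null)
            {
                return null;
            }
            using (var stream = new MemoryStream(bytes))
            {
                return new BinaryFormatter().Deserialize(stream);
            }
        }
    }
}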

Sep 4, 2011 at 5:44 PM

Nobody has any ideas, hm?

Sep 5, 2011 at 2:41 AM

Hi,

I'm afraid there is no other way than to change the out-of-the-box implementation of caching. As Bab said, it was a design decision made to optimize access time. Regarding the memory issues, Fernando already mentioned that caching is not just about storing and accessing cached items; it also does expirations and scavenging.

 

Noel Angelo Bolasoc
Avanade Software
Avanade, Inc.
entlib.support@avanade.com