Cache Size option

Coordinator
Dec 16, 2007 at 2:18 PM
hi folks,

as requested, I have implemented in the upcoming release the option to set a maximum cache size. The following code has been tested, and it turns out to be very slow:

DateTime startTime = DateTime.Now;
// the configured limit (cacheAmountOfObjects * 1 MB) is compared against the current cache size in bytes;
// note that LocalCache.Size() is called up to three times in this block
if ((this.cacheAmountOfObjects * (1024 * 1024)) <= LocalCache.Size())
{
    Console.WriteLine(@"Current Size of Cache: {0}", LocalCache.Size() == 0 ? 0 : LocalCache.Size() / (1024 * 1024));
}
DateTime stopTime = DateTime.Now;
TimeSpan duration = stopTime - startTime;
Console.WriteLine(duration);
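
(side note: DateTime.Now only has a resolution of roughly 15 ms on Windows, so System.Diagnostics.Stopwatch gives more precise numbers for measurements in this range. A minimal sketch of the same check timed with Stopwatch, assuming the LocalCache API from above:)

// same size check as above, but timed with Stopwatch instead of DateTime.Now
System.Diagnostics.Stopwatch watch = System.Diagnostics.Stopwatch.StartNew();
if ((this.cacheAmountOfObjects * (1024 * 1024)) <= LocalCache.Size())
{
    Console.WriteLine(@"Current Size of Cache: {0}", LocalCache.Size() / (1024 * 1024));
}
watch.Stop();
Console.WriteLine(watch.Elapsed);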


Time overview:

with 40 XPath objects: 00:00:00.0625000
with 60 XPath objects: 00:00:00.1406250
with 80 XPath objects: 00:00:00.1562500
with 100 XPath objects: 00:00:00.3281250
with 105 - 110 XPath objects it reached the maximum: 00:00:00.5312500

so I need a better way than calling LocalCache.Size() every time.


/// <summary>
/// Calculates the actual size of this instance by serializing every cached entry.
/// </summary>
/// <returns>The size in bytes as a <see cref="long"/>.</returns>
public long Size()
{
    Hashtable hd = null;
    lock (typeof(Cache))
    {
        // copy the dictionary so the lock is not held during serialization
        hd = new Hashtable(dict);
        // hd = (HybridDictionary)dict;
    }
    long size = 0;

    // serializing every entry on each call is what makes this method expensive
    foreach (DictionaryEntry de in hd)
    {
        size += Formatters.Serialization.BinarySerialize(de).Length;
    }
    return size;
}

I then changed the three calls to a single call:

testCount = LocalCache.Size();
if ((this.cacheAmountOfObjects * (1024 * 1024)) <= testCount) .... etc ..

with 40 XPath objects: 00:00:00.0781250
with 60 XPath objects: 00:00:00.1718750
with 80 XPath objects: 00:00:00.3593750
with 100 XPath objects: 00:00:00.3437500
with 105 - 110 XPath objects it reached the maximum: 00:00:00.6250000

as you can see, even with a single call it is on average sometimes slower than with three calls to LocalCache.Size();

my suggestion is the following: we use the clean-up interval to validate the size, and if the size exceeds the limit we set a simple flag.
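
A rough sketch of that flag idea (OnCleanUpInterval, sizeLimitExceeded and CacheIsFull are made-up names here; the method would be hooked into the existing clean-up timer):

// re-check the size only on the clean-up interval and remember the result in a flag
private volatile bool sizeLimitExceeded = false;

private void OnCleanUpInterval(object state)
{
    long maxBytes = this.cacheAmountOfObjects * (1024 * 1024);
    // one expensive LocalCache.Size() call per interval instead of one per request
    this.sizeLimitExceeded = LocalCache.Size() >= maxBytes;
}

// the hot path only reads the flag, no serialization involved
private bool CacheIsFull()
{
    return this.sizeLimitExceeded;
}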

any suggestions / ideas are welcome :-)

Coordinator
Dec 16, 2007 at 2:26 PM
I forgot to mention one more option; we could also do the following:

testCount += COM.Formatters.Serialization.BinarySerialize(msg.Data).Length;

this would mean we calculate it on every arrival at the server, and when we delete items we first have to subtract that amount from the counter... each individual calculation is fast, but it happens very often :-(
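
a rough sketch of how such a running counter could look (the class name CacheSizeCounter is made up; the caller passes in the already serialized data, e.g. the result of BinarySerialize(msg.Data)):

using System.Threading;

// keeps a running byte count instead of re-serializing the whole cache on every check
public class CacheSizeCounter
{
    private long currentSize = 0;

    // called when an item arrives on the server; returns the size that was added
    public long Add(byte[] serializedData)
    {
        Interlocked.Add(ref this.currentSize, serializedData.Length);
        return serializedData.Length;
    }

    // called when an item is removed; storedSize is the value returned by Add()
    public void Remove(long storedSize)
    {
        Interlocked.Add(ref this.currentSize, -storedSize);
    }

    // the limit check is now just a comparison
    public bool LimitReached(long maxBytes)
    {
        return Interlocked.Read(ref this.currentSize) >= maxBytes;
    }
}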

Developer
Dec 16, 2007 at 3:53 PM
I suggest changing the format in which the information is stored.

The best way of having a max cap on memory would be to store everything on the server side as a byte array (so that the server is agnostic to what it contains, similar to memcached).

If you do that, all you need is a counter that adds or subtracts the length of the byte array whenever data is added to or removed from the cache.
To get the size, simply multiply sizeof(byte) by the number of bytes you hold (stored in that counter).

This gives you a precise picture of how much memory the cache holds and lets you apply a better eviction strategy when you reach the cap.
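
A minimal sketch of that approach (ByteStore and its members are illustrative names, not from the project; the eviction is a plain FIFO just to show where it would hook in):

using System.Collections.Generic;

// stores only byte[] payloads, keeps a running total and evicts when the cap would be exceeded
public class ByteStore
{
    private readonly Dictionary<string, byte[]> items = new Dictionary<string, byte[]>();
    private readonly Queue<string> insertionOrder = new Queue<string>();
    private readonly long maxBytes;
    private long currentBytes = 0;

    public ByteStore(long maxBytes)
    {
        this.maxBytes = maxBytes;
    }

    // assumes each key is added only once; a real implementation would handle replacement
    public void Put(string key, byte[] data)
    {
        // evict the oldest entries until the new item fits (simple FIFO eviction)
        while (this.currentBytes + data.Length > this.maxBytes && this.insertionOrder.Count > 0)
        {
            string oldest = this.insertionOrder.Dequeue();
            byte[] removed;
            if (this.items.TryGetValue(oldest, out removed))
            {
                this.items.Remove(oldest);
                this.currentBytes -= removed.Length;
            }
        }

        this.items[key] = data;
        this.insertionOrder.Enqueue(key);
        this.currentBytes += data.Length;
    }

    // sizeof(byte) is 1, so the byte count equals the memory used by the payloads
    public long CurrentSize()
    {
        return this.currentBytes;
    }
}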

The idea of serializing the data every time you want to get the real length of data stored is very wasteful.
Coordinator
Dec 18, 2007 at 8:24 AM
Edited Dec 18, 2007 at 11:59 AM
// post has been deleted (ronischuetz) ... everything required has been implemented and will be available with the next release.