InfiniTec - Henning Krauses Blog

Don't adjust your mind - it's reality that is malfunctioning


Posted under .NET Tools | Comments (0)


The .NET Runtime has four different heaps where it stores objects. There are many good articles on how the runtime manages objects on these heaps, so I won't go into great detail about the first three. Here are two good links to blogs with more information on this topic:

The Large Object Heap

Most of the time, the GC does a good job managing the memory of your programs.
But there are situations where it performs sub-optimally. One of these situations involves so-called large objects: all objects with a size greater than 85,000 bytes fall into this category. These objects are stored on a separate heap, called the Large Object Heap. This heap is never compacted, nor is it collected often, as the GC processes it along with the Generation-2 heap.
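You can observe this directly: the runtime reports freshly allocated large objects as generation 2, while ordinary allocations start in generation 0. A minimal sketch (the 85,000-byte threshold applies to the total object size, so a byte array of 85,000 elements is already over it):

```csharp
using System;

class LohDemo
{
    static void Main()
    {
        byte[] small = new byte[80 * 1000]; // well below the threshold: normal Gen-0 allocation
        byte[] large = new byte[85 * 1000]; // over 85,000 bytes including overhead: Large Object Heap

        // Objects on the Large Object Heap are reported as generation 2
        // immediately after allocation, without any collection having run.
        Console.WriteLine(GC.GetGeneration(small)); // 0
        Console.WriteLine(GC.GetGeneration(large)); // 2
    }
}
```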
To reproduce the problem, try to read a large file (say, 70 MB or so) from a FileStream and store it in a MemoryStream. You will see your memory usage climb to about 250 MB. Now discard your MemoryStream and read the file again into another MemoryStream. Depending on memory pressure, the GC will either do a Gen-2 collection (remember: this is expensive) and reclaim that memory, or you end up with about 500 MB of consumed memory.
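The growth comes from the MemoryStream doubling its internal buffer: each doubling allocates a brand-new array on the Large Object Heap and abandons the old one, and those dead buffers add up to far more than the file size. A sketch that makes the doubling visible without needing an actual 70 MB file:

```csharp
using System;
using System.IO;

class MemoryStreamGrowth
{
    static void Main()
    {
        // Simulate reading a ~70 MB file in chunks into a MemoryStream
        // whose capacity was not preset.
        const int chunkSize = 64 * 1024;
        const long totalBytes = 70L * 1024 * 1024;
        byte[] chunk = new byte[chunkSize];

        using (MemoryStream stream = new MemoryStream())
        {
            long written = 0;
            int lastCapacity = 0;
            while (written < totalBytes)
            {
                stream.Write(chunk, 0, chunk.Length);
                written += chunk.Length;

                // Each capacity jump is a fresh buffer allocation; once the
                // buffer exceeds 85,000 bytes, every jump lands on the LOH
                // and leaves the previous buffer behind as garbage.
                if (stream.Capacity != lastCapacity)
                {
                    Console.WriteLine("Capacity grew to {0:N0} bytes", stream.Capacity);
                    lastCapacity = stream.Capacity;
                }
            }
        }
    }
}
```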
Now you will probably argue that 250 MB is way too much, because you could preallocate one large 70 MB buffer for the file via MemoryStream.SetLength() and end up with lower memory consumption. But there are two problems with this argument:
  1. On the second read of the file, your memory usage will increase anyway, albeit not to 500 MB but perhaps to about 150 MB.
  2. You might not always know in advance how much memory you will need. Think of a server application that reads data over a TCP connection: there might be no hint of how large the received data will be, so you end up growing your buffer every now and then.
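For completeness, the preallocation approach mentioned above could look like this (a sketch only; the class and method names are illustrative):

```csharp
using System.IO;

static class PreallocatedReader
{
    // Read an entire file into a MemoryStream whose buffer is sized
    // once up front via SetLength, avoiding the repeated doubling
    // (and the LOH garbage it produces) described above.
    public static MemoryStream ReadAll(string path)
    {
        using (FileStream file = File.OpenRead(path))
        {
            MemoryStream stream = new MemoryStream();
            stream.SetLength(file.Length); // single buffer allocation

            byte[] buffer = new byte[64 * 1024];
            int read;
            while ((read = file.Read(buffer, 0, buffer.Length)) > 0)
                stream.Write(buffer, 0, read);

            stream.Position = 0;
            return stream;
        }
    }
}
```

This helps on the first read, but as the two points above show, it neither prevents the allocation on a second read nor works when the final size is unknown.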

The Solution

The classes you can download below give you the ability to reuse previously allocated chunks of memory. Once you are done with a chunk, you return it to the pool.
Note that I used weak references within the pool, so the GC can reclaim the pooled buffers if memory pressure builds up.
Additionally, I have developed a new MemoryStream that uses this object-pooling technique.
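The actual classes are in the download; purely as an illustration of the idea (the names here are hypothetical, not the downloadable API), a pool that holds returned buffers only through WeakReference, so the GC remains free to reclaim them, might look like this:

```csharp
using System;
using System.Collections.Generic;

// Illustrative sketch of a weak-reference buffer pool; the classes
// in the download may differ in naming and detail.
class BufferPool
{
    private readonly List<WeakReference> _buffers = new List<WeakReference>();
    private readonly object _lock = new object();

    // Hand out a buffer of at least the requested size, reusing a
    // pooled one when the GC has not collected it yet.
    public byte[] Acquire(int minimumSize)
    {
        lock (_lock)
        {
            for (int i = _buffers.Count - 1; i >= 0; i--)
            {
                byte[] buffer = (byte[])_buffers[i].Target;
                if (buffer == null)
                {
                    // The GC reclaimed this buffer under memory
                    // pressure: exactly the behavior we want.
                    _buffers.RemoveAt(i);
                }
                else if (buffer.Length >= minimumSize)
                {
                    _buffers.RemoveAt(i);
                    return buffer;
                }
            }
        }
        return new byte[minimumSize];
    }

    // Return a buffer via a weak reference, so holding it in the
    // pool does not keep it alive when memory gets tight.
    public void Release(byte[] buffer)
    {
        lock (_lock)
        {
            _buffers.Add(new WeakReference(buffer));
        }
    }
}
```

A pooled stream can then Acquire its buffers from such a pool instead of allocating fresh arrays, and Release them when it is disposed.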


Posted by Henning Krause on Saturday, February 12, 2005 12:26 PM, last modified on Saturday, February 12, 2005 1:25 PM