SmartThreadPool – What happens when you take on more than you can chew

I’ve covered the topic of using SmartThreadPool and the framework thread pool in more detail here and here; this post will instead focus on a more specific scenario: what happens when the rate at which new work items are being queued outstrips the pool’s ability to process those items.

First, let’s try to quantify the work items being queued when you do something like this:

```csharp
var threadPool = new SmartThreadPool();
var result = threadPool.QueueWorkItem(....);
```

The work item being queued is a delegate of some sort, basically some piece of code that needs to be run. Until a thread in the pool becomes available and processes the work item, it’ll simply stay in memory as a bunch of 1’s and 0’s just like everything else.

Now, if new work items are queued at a faster rate than the threads in the pool are able to process them, it’s easy to imagine that the amount of memory required to keep the delegates around will follow an upward trend until you eventually run out of available memory and an OutOfMemoryException gets thrown.
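To see why the memory adds up, remember that a queued delegate keeps alive everything it captures: until the work item runs, any object referenced by the closure cannot be garbage collected. Here’s a minimal sketch of that behaviour (the list and the 10MB buffer are just illustrations, they’re not part of SmartThreadPool):

```csharp
using System;
using System.Collections.Generic;

// a list standing in for the pool's internal work item queue
var pending = new List<Action>();

// a 10MB buffer captured by the closure (an arbitrary example size)
var buffer = new byte[10 * 1024 * 1024];
pending.Add(() => Console.WriteLine(buffer.Length));

// as long as the delegate sits in 'pending', the 10MB buffer stays
// reachable through the closure and cannot be garbage collected
```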

Does that sound like a reasonable assumption? Let’s find out what actually happens!

Test 1 – Simple delegate

To simulate a scenario where the thread pool gets overrun by work items, I’m going to instantiate a new smart thread pool and make sure there’s only one thread in the pool at all times. Then I repeatedly queue up an action which puts the thread (the one in the pool) to sleep for a long time, so that there are no threads left to process subsequent work items:

```csharp
// instantiate a basic SmartThreadPool with only one thread in the pool
var threadpool = new SmartThreadPool(new STPStartInfo
                                         {
                                             MaxWorkerThreads = 1,
                                             MinWorkerThreads = 1,
                                         });

var queuedItemCount = 0;
try
{
    // keep queuing new items, each of which puts the one and only thread
    // in the threadpool to sleep for a very long time
    while (true)
    {
        // put the thread to sleep for a long, long time so it can't handle
        // any more queued work items
        threadpool.QueueWorkItem(() => Thread.Sleep(10000000));
        queuedItemCount++;
    }
}
catch (OutOfMemoryException)
{
    Console.WriteLine("OutOfMemoryException caught after queuing {0} work items", queuedItemCount);
}
```

The result? As expected, the memory used by the process went on a pretty steep climb, and within a minute it bombed out after eating up just over 1.8GB of RAM.


All the while we managed to queue up 7,205,254 instances of the simple delegate used in this test. Keep this number in mind as we look at what happens when the closure also requires some expensive piece of data to be kept around in memory too.
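A quick back-of-the-envelope calculation (my own estimate, not a measurement from the test): if we assume roughly all of that 1.8GB went to the queued work items, each one cost somewhere in the region of 270 bytes, which seems plausible for a parameterless delegate plus the pool’s per-work-item bookkeeping:

```csharp
using System;

// assumption: virtually all of the ~1.8GB was consumed by queued work items
var bytesUsed = 1.8 * 1024 * 1024 * 1024;
var itemsQueued = 7205254;
var bytesPerItem = bytesUsed / itemsQueued;
Console.WriteLine("~{0:F0} bytes per queued work item", bytesPerItem); // ~268
```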

Test 2 – Delegate with very long string

For this test, I’m going to include a 1000-character-long string in the closures being queued, so that the string objects need to be kept around in memory for as long as the closures are still around. Now let’s see what happens!

```csharp
// instantiate a basic SmartThreadPool with only one thread in the pool
var threadpool = new SmartThreadPool(new STPStartInfo
                                         {
                                             MaxWorkerThreads = 1,
                                             MinWorkerThreads = 1,
                                         });

var queuedItemCount = 0;
try
{
    // keep queuing new items, each of which puts the one and only thread
    // in the threadpool to sleep for a very long time
    while (true)
    {
        // generate a 1000 character long string; .NET strings are UTF-16,
        // so that's ~2000 bytes of character data plus object overhead
        var veryLongText = new string(Enumerable.Range(1, 1000).Select(i => 'E').ToArray());

        // include the very long string in the closure here
        threadpool.QueueWorkItem(() =>
                                     {
                                         Thread.Sleep(10000000);
                                         Console.WriteLine(veryLongText);
                                     });
        queuedItemCount++;
    }
}
catch (OutOfMemoryException)
{
    Console.WriteLine("OutOfMemoryException caught after queuing {0} work items", queuedItemCount);
}
```

Unsurprisingly, the memory was eaten up even faster this time around, and in the end we were only able to queue 782,232 work items before running out of memory, which is significantly lower than in the previous test.

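Assuming the process hit a similar ~1.8GB ceiling as in the first test (an assumption on my part; the exact figure for this run wasn’t captured), the per-item cost works out to roughly 2.5KB, and the difference from test 1 lines up neatly with the captured string: 1000 UTF-16 characters are ~2000 bytes of character data, plus object overhead:

```csharp
using System;

// assumption: both tests ran out of memory at roughly the same ~1.8GB mark
var bytesAvailable = 1.8 * 1024 * 1024 * 1024;
var test1PerItem = bytesAvailable / 7205254; // ~268 bytes per work item
var test2PerItem = bytesAvailable / 782232;  // ~2471 bytes per work item

// the ~2.2KB gap is roughly the cost of the captured 1000-char string
Console.WriteLine("difference: ~{0:F0} bytes per item", test2PerItem - test1PerItem);
```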

Parting thoughts…

Besides being a fun little experiment to try out, there is a story here, one that tells of a worst-case scenario (albeit one that’s highly unlikely, but not impossible) which is worth keeping in the back of your mind when utilising thread pools to deal with highly frequent, data-intensive, blocking calls.
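If you do find yourself anywhere near this scenario, one defence is to bound the number of outstanding work items and apply backpressure once the bound is reached. Here’s a minimal sketch of the idea; the BoundedQueue wrapper is my own illustration, not something that ships with SmartThreadPool:

```csharp
using System;
using System.Threading;

// A hypothetical wrapper that caps how many work items can be outstanding;
// producers block (backpressure) instead of growing the queue without limit
// until an OutOfMemoryException.
public class BoundedQueue
{
    private readonly SemaphoreSlim _slots;
    private readonly Action<Action> _queue;

    // 'queue' is whatever actually schedules the work, e.g.
    // work => threadpool.QueueWorkItem(() => work())
    public BoundedQueue(int maxQueued, Action<Action> queue)
    {
        _slots = new SemaphoreSlim(maxQueued, maxQueued);
        _queue = queue;
    }

    public void Enqueue(Action work)
    {
        _slots.Wait(); // blocks the producer while maxQueued items are outstanding
        _queue(() =>
        {
            try { work(); }
            finally { _slots.Release(); } // free the slot once the item completes
        });
    }
}
```

Depending on the workload, blocking the producer may not be acceptable, in which case shedding load (dropping or sampling work items) is the usual alternative.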