Memory leak in ADO.NET DataSet

Yan Cui



Over the last couple of years there have been many discussions and debates on DataSet vs Collections, and there was a very good article in MSDN Magazine on just that:

http://msdn.microsoft.com/en-gb/magazine/cc163751.aspx#S7

To add to the dark sides of DataSet, there is a little-known feature (or bug, or annoyance, depending on your point of view) in the DataTable.Select() method: every time you call Select(), it implicitly creates a new index that you have no control over, and that index is not released until you call DataTable.AcceptChanges().
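Here is a minimal sketch of the pattern in question (the table name, schema and row counts are made up for illustration); per the behaviour described above, each Select() call below leaves another implicit index behind until AcceptChanges() is called:

using System;
using System.Data;

class SelectIndexDemo
{
    static void Main()
    {
        // A large DataTable with a made-up schema.
        var table = new DataTable("Orders");
        table.Columns.Add("Id", typeof(int));
        table.Columns.Add("CustomerId", typeof(int));

        for (int i = 0; i < 1000000; i++)
        {
            table.Rows.Add(i, i % 1000);
        }

        // Each Select() call builds another implicit index on the (large) table,
        // and without a call to table.AcceptChanges() those indexes accumulate.
        for (int customerId = 0; customerId < 1000; customerId++)
        {
            DataRow[] rows = table.Select("CustomerId = " + customerId);
            Console.WriteLine("Customer {0}: {1} rows", customerId, rows.Length);
        }
    }
}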

If your application has to deal with a large amount of data and has to call the Select() method repeatedly without calling AcceptChanges(), then you might have a problem! Why? Consider these two factors:

1. The bigger the DataTable, the bigger each index, and if an index object is larger than 85 KB it is allocated on the Large Object Heap, which is only collected as part of full (generation 2) garbage collections, so large objects take much longer to reclaim than small ones.

2. On a 32-bit Windows system there is a 2 GB virtual address space limit per process, and in practice you will usually get an OutOfMemoryException once your process has used around 1.2 GB – 1.5 GB of RAM.

Combine the two and it's not hard to imagine a scenario where your process runs out of memory and crashes before it completes its task! (Believe me, it was a hard-learned lesson from personal experience!)

Solutions:

1. Unless you actually need some of the features DataSet offers, such as the ability to keep multiple versions of the same row (Original, Current, etc.), you might be better off using POCOs (plain old CLR objects) instead: they are simple and lightweight, and you can use LINQ to Objects with i4o to get some impressive performance improvements (see the first sketch after this list). After I implemented this change, my application went from crashing with an OutOfMemoryException to peaking at around 70 MB throughout its lifetime, and it finished in about 15% of the time it would have taken with DataSet.

2. If getting rid of DataSet altogether takes more time and effort than you can afford, there is a quick workaround: use a DataView and dynamically change its RowFilter string every time you would otherwise have called Select() (see the second sketch after this list).
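To make solution 1 concrete, here is a minimal sketch using a hypothetical Order POCO and plain LINQ to Objects in place of DataTable.Select(); i4o would let you declare an index on CustomerId for repeated lookups, but a plain ILookup gives a similar effect and keeps the example self-contained:

using System;
using System.Collections.Generic;
using System.Linq;

class Order
{
    public int Id { get; set; }
    public int CustomerId { get; set; }
}

class PocoDemo
{
    static void Main()
    {
        // The same made-up data as the earlier sketch, held as POCOs.
        var orders = new List<Order>();
        for (int i = 0; i < 1000000; i++)
        {
            orders.Add(new Order { Id = i, CustomerId = i % 1000 });
        }

        // Build a lookup once; repeated queries are then cheap and leave
        // no hidden per-query indexes behind.
        ILookup<int, Order> byCustomer = orders.ToLookup(o => o.CustomerId);

        for (int customerId = 0; customerId < 1000; customerId++)
        {
            int count = byCustomer[customerId].Count();
            Console.WriteLine("Customer {0}: {1} orders", customerId, count);
        }
    }
}

And here is a minimal sketch of the solution-2 workaround, reusing a single DataView over the same hypothetical Orders table from the first sketch and changing its RowFilter instead of calling Select():

using System;
using System.Data;

class DataViewDemo
{
    static void Query(DataTable table)
    {
        var view = new DataView(table);

        for (int customerId = 0; customerId < 1000; customerId++)
        {
            // Changing RowFilter re-filters the existing view rather than
            // leaving a new implicit index behind on every query.
            view.RowFilter = "CustomerId = " + customerId;
            Console.WriteLine("Customer {0}: {1} rows", customerId, view.Count);
        }
    }
}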

If you wish to learn more about garbage collection in general, you should read Maoni's WebLog, which covers all things CLR garbage collector! She also wrote a nice article on the Large Object Heap back in June 2008 which is well worth a read:

http://msdn.microsoft.com/en-us/magazine/cc534993.aspx

