Extension methods for compressing/decompressing strings


Serialization Overhead

When it comes to serializing/deserializing objects for transport over the wire, you will most likely incur some overhead in the serialized message, though the amount of overhead varies depending on the data interchange format used – XML is overly verbose whereas JSON is much more lightweight:

public class MyClass
{
    public int MyProperty { get; set; }
}

var myClass = new MyClass { MyProperty = 10 };

XML representation:

<MyClass>
    <MyProperty>10</MyProperty>
</MyClass>

JSON representation:

{"MyProperty":10}

As you can see, a simple 4-byte object (MyProperty is a 32-bit integer) can take up over 11 times the space after it’s serialized into XML format:

Binary: 4 bytes
XML:    46 bytes
JSON:   17 bytes
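
If you want to reproduce these numbers yourself, here's a rough sketch (reusing the MyClass type from above, and assuming a recent .NET runtime where System.Text.Json is available; the exact byte counts will vary with the serializer and its settings):

using System;
using System.IO;
using System.Text;
using System.Text.Json;
using System.Xml;
using System.Xml.Serialization;

public static class SerializedSizeCheck
{
    public static void Main()
    {
        var myClass = new MyClass { MyProperty = 10 };

        // JSON representation, e.g. {"MyProperty":10}
        var json = JsonSerializer.Serialize(myClass);

        // XML representation, with the XML declaration and namespaces stripped
        // so it resembles the minimal <MyClass>...</MyClass> form shown above
        var xmlSerializer = new XmlSerializer(typeof(MyClass));
        var noNamespaces = new XmlSerializerNamespaces();
        noNamespaces.Add("", "");
        var settings = new XmlWriterSettings { OmitXmlDeclaration = true };

        string xml;
        using (var stringWriter = new StringWriter())
        {
            using (var xmlWriter = XmlWriter.Create(stringWriter, settings))
                xmlSerializer.Serialize(xmlWriter, myClass, noNamespaces);

            xml = stringWriter.ToString();
        }

        Console.WriteLine($"Binary: {sizeof(int)} bytes");
        Console.WriteLine($"XML:    {Encoding.UTF8.GetByteCount(xml)} bytes");
        Console.WriteLine($"JSON:   {Encoding.UTF8.GetByteCount(json)} bytes");
    }
}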

This overhead translates to cost in terms of both bandwidth and performance, and if persistence is involved then there's also the storage cost. For example, if you need to persist objects to an Amazon S3 bucket, then not only would you be paying for the wastage introduced by the serialization process (the extra space needed for storage) but also for the additional bandwidth needed to get the serialized data in and out of S3, not to mention the performance penalty for transferring more data.

Using Compression

An easy way to cut down on this cost is to introduce compression into the equation. Considering that the serialized XML/JSON message is text, which can easily be compressed to 10-15% of its original size, there's a compelling case for doing it!

There are a number of 3rd party compression libraries out there to help you do this, for instance:

  • SharpZipLib – a widely used library with support for Zip, GZip, Tar and BZip2 formats.
  • SevenZipSharp – a CodePlex project which wraps the native 7-Zip library to provide data (self-)extraction and compression in all 7-Zip formats.
  • UnRAR.dll – native library from the developer of WinRAR to help you work with the RAR format.

The .NET framework also provides two classes for you to use – DeflateStream and GZipStream – which both use the Deflate algorithm to provide lossless compression and decompression (GZipStream produces the same Deflate-compressed data wrapped in the GZip format, which adds a header and a checksum). Please note that you can't use these classes to compress files larger than 4GB though.
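
The extension methods below use DeflateStream, but if you need the GZip format specifically (say, so the output can be read by standard gzip tools) then GZipStream can be swapped in with the same API shape. Here's a minimal sketch of compressing a byte array with GZipStream:

using System.IO;
using System.IO.Compression;

public static class GZipExample
{
    /// <summary>
    /// Compresses a byte array into the GZip format
    /// </summary>
    public static byte[] GZipCompress(this byte[] data)
    {
        using (var outMemStream = new MemoryStream())
        {
            // leave the output stream open so its contents can be read after the GZipStream is disposed
            using (var gzipStream = new GZipStream(outMemStream, CompressionMode.Compress, true))
                gzipStream.Write(data, 0, data.Length);

            return outMemStream.ToArray();
        }
    }
}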

Here are a few extension methods to help you compress/decompress a string using the framework's DeflateStream class:

using System;
using System.IO;
using System.IO.Compression;
using System.Text;

public static class CompressionExtensions
{
    /// <summary>
    /// Returns the byte array of a compressed string
    /// </summary>
    public static byte[] ToCompressedByteArray(this string source)
    {
        // convert the source string into a memory stream (UTF-8 so non-ASCII characters survive the round trip)
        using (
            MemoryStream inMemStream = new MemoryStream(Encoding.UTF8.GetBytes(source)),
            outMemStream = new MemoryStream())
        {
            // create a compression stream with the output stream
            using (var zipStream = new DeflateStream(outMemStream, CompressionMode.Compress, true))
                // copy the source string into the compression stream
                inMemStream.WriteTo(zipStream);

            // return the compressed bytes in the output stream
            return outMemStream.ToArray();
        }
    }

    /// <summary>
    /// Returns the base64 encoded string for the compressed byte array of the source string
    /// </summary>
    public static string ToCompressedBase64String(this string source)
    {
        return Convert.ToBase64String(source.ToCompressedByteArray());
    }

    /// <summary>
    /// Returns the original string for a compressed base64 encoded string
    /// </summary>
    public static string ToUncompressedString(this string source)
    {
        // get the byte array representation for the compressed string
        var compressedBytes = Convert.FromBase64String(source);

        // load the byte array into a memory stream
        using (var inMemStream = new MemoryStream(compressedBytes))
            // and decompress the memory stream back into the original string (UTF-8 to match the compression side)
            using (var decompressionStream = new DeflateStream(inMemStream, CompressionMode.Decompress))
                using (var streamReader = new StreamReader(decompressionStream, Encoding.UTF8))
                    return streamReader.ReadToEnd();
    }
}
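
Here's how these extension methods might be used in practice. This is a quick sketch, assuming the CompressionExtensions class above is in scope along with the System and System.Linq namespaces:

// a fairly repetitive string, similar in nature to a serialized object graph
var original = string.Concat(Enumerable.Repeat("{\"MyProperty\":10},", 1000));

// compress the string into a base64 encoded string, then round-trip it back
var compressed = original.ToCompressedBase64String();
var roundTripped = compressed.ToUncompressedString();

Console.WriteLine(original.Length);          // 18000
Console.WriteLine(compressed.Length);        // a small fraction of the original length
Console.WriteLine(roundTripped == original); // True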

Please NOTE that the compressed string can be longer than the uncompressed string when the uncompressed string is very short. As always, you should make a judgement based on your situation as to whether compression is worthwhile, given that it also requires additional CPU cycles for the compression/decompression steps.
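
To see this effect for yourself, try compressing a very short string and comparing the lengths (again, just a sketch using the extension methods above):

var shortString = "hi";

Console.WriteLine(shortString.Length);                             // 2
Console.WriteLine(shortString.ToCompressedBase64String().Length);  // longer than 2, as the Deflate and base64 overhead outweighs any saving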

The good news is that serialized messages tend to blow up fairly quickly (especially when arrays are involved), so in almost all cases you should see a significant saving on the size of the serialized message, and therefore on storage and bandwidth costs as well!
