No covariance for value types


For a while now I’ve been wondering why C#’s support for covariance does not cover value types, both in normal array covariance and in the generic type parameter covariance introduced in C# 4:

// Enumerable.Empty and Count() come from System.Linq
void Main()
{
    int i = 0;
    string str = "hello world";

    TestMethod(i);       // legal
    TestMethod(str);     // legal
    TestMethod2(Enumerable.Empty<int>());           // illegal
    TestMethod2(Enumerable.Empty<string>());        // legal

    Console.WriteLine(i is object);                 // true
    Console.WriteLine(new int[0] is object[]);      // false
    Console.WriteLine(new string[0] is object[]);   // true
    Console.WriteLine(new uint[0] is int[]);        // false
}

public void TestMethod(object obj)
{
    Console.WriteLine(obj);
}

public void TestMethod2(IEnumerable<object> objs)
{
    Console.WriteLine(objs.Count());
}
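The reason the int case is illegal (while the string case is fine) is that variance in the CLR only applies to conversions that preserve representation: converting a string reference to an object reference is a no-op, but converting an int to an object requires boxing, so an IEnumerable<int> cannot simply be reinterpreted as an IEnumerable<object>. If you do need to pass a value-typed sequence to a method like TestMethod2, one workaround is to box the elements explicitly with LINQ’s Cast<object>():

// boxes each element, producing a real IEnumerable<object>
TestMethod2(Enumerable.Empty<int>().Cast<object>());   // legal
TestMethod2(new[] { 1, 2, 3 }.Cast<object>());         // legal, prints 3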

That was until I stumbled upon this old post by Eric Lippert on the topic of array covariance, which essentially points to a disagreement between the C# and CLI specifications on the rule for array covariance:

CLI: "if X is assignment compatible with Y then X[] is assignment compatible with Y[]"

C#: "if X is a reference type implicitly convertible to reference type Y then X[] is implicitly convertible to Y[]"

Whilst this doesn’t directly address the generics case with IEnumerable<out T>, one would expect the two to follow the same rule; otherwise you’d end up with different rules for int[] and IEnumerable<int>, even though (new int[0] is IEnumerable<int>) == true. Now that would be weird!
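The disagreement is easy to observe at runtime. The C# compiler evaluates new uint[0] is int[] using its own rule, so it decides the answer is false at compile time (and may warn that the expression is never of the provided type); but once the same array is held in an object variable, the check is deferred to the CLR, which follows the CLI rule and considers uint[] assignment compatible with int[]. A quick way to see both answers:

uint[] unsignedArray = new uint[0];
object asObject = unsignedArray;

Console.WriteLine(unsignedArray is int[]);   // false - C# rule, decided by the compiler
Console.WriteLine(asObject is int[]);        // true  - CLI rule, decided by the CLR at runtime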

References:

Eric Lippert – Why is covariance of value-typed arrays inconsistent?

Question on StackOverflow – Why does my C# array lose type sign information when cast to object?
