Some people have peeked (and poked) under the covers and been surprised at how SQL Server twiddles their UDT bits during serialization.  Let's peek together.

Consider the following fragment of a UDT:

[Microsoft.SqlServer.Server.SqlUserDefinedType(Format.Native)]
public struct Simple : INullable
{
  private bool isNull;
  public int val;

  public static Simple FromInt(SqlInt32 i)
  {
    if (i.IsNull)
      return Null;
    Simple u = new Simple();
    u.val = i.Value;
    return u;
  }

  ...
}

Notice that we're using native serialization in this UDT.

Now consider the following use of the UDT from T-SQL:

declare @s Simple
set @s = Simple::FromInt(3)
print convert(varbinary(100), @s)
set @s = Simple::FromInt(-3)
print convert(varbinary(100), @s)

What do we expect?  In other words, how do we expect SQL Server to serialize this UDT?  Here's the result:

0x0080000003
0x007FFFFFFD

How do we explain this?  The leading "00" in each result looks like (and is) the stored form of the Boolean null flag, but the next four bytes are a bit strange at first glance: they look like big-endian two's-complement representations of the numbers, but with the high (sign) bit flipped.

In fact, that's exactly right.  SQL Server ensures that the bytes are serialized in big-endian order and then flips the sign (most significant) bit.
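We can reproduce the format by hand.  Here's a small Python sketch (the function name is mine, not anything SQL Server exposes) of what native serialization does to our struct: one byte for the Boolean, then the int in big-endian order with the sign bit flipped.

```python
import struct

def serialize_simple(val: int) -> bytes:
    """Sketch of Format.Native serialization for the Simple UDT:
    a one-byte null flag (0x00 = not null), then the int32 stored
    big endian with its sign bit flipped."""
    null_flag = b"\x00"                       # isNull == false
    be = struct.pack(">i", val)               # big-endian two's complement
    flipped = bytes([be[0] ^ 0x80]) + be[1:]  # flip the sign bit
    return null_flag + flipped

print(serialize_simple(3).hex())   # 0080000003
print(serialize_simple(-3).hex())  # 007ffffffd
```

The output matches the varbinary values we printed from T-SQL above.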

Why?  Sorting.  If the numbers are stored in this form, then the order of the binary representation agrees with the normal ordering of the integers, so comparisons can be done by comparing the raw bytes.  Flipping the sign bit is what makes this work: in plain two's complement, negative numbers have a high bit of 1 and would compare greater than positive numbers in an unsigned byte-by-byte comparison; flipping that bit moves the negatives below the positives.
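We can check that claim directly.  This Python sketch (again, my own helper, just illustrating the encoding) verifies that lexicographic byte order agrees with numeric order:

```python
# Encode an int32 the way SQL Server serializes it: big endian,
# sign bit flipped.  Under this encoding, comparing the bytes
# lexicographically gives the same order as comparing the integers.
def encode(val: int) -> bytes:
    be = val.to_bytes(4, "big", signed=True)
    return bytes([be[0] ^ 0x80]) + be[1:]

nums = [7, -1, 0, -128, 42, -7]
assert sorted(nums) == sorted(nums, key=encode)
assert encode(-1) < encode(0) < encode(1)
```

Without the bit flip, `encode(-1)` would start with 0xFF and sort after every non-negative value.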

Why do we care about that?  It means that if we declare the UDT byte-ordered (IsByteOrdered = true in the SqlUserDefinedType attribute), we (a) aren't lying, and (b) can perform equality and range queries over it.

Cheers,
-Isaac K. Kunen, Microsoft SQL Server