Given a class:
[DataContract]
public sealed class ChangedField
{
    [DataMember(Name = "I")] public ushort FieldId { get; set; }
    [DataMember(Name = "V")] public object Value { get; set; }
}
Wireshark shows that, when an instance is sent over a WCF TCP binding, the message is encoded in binary (only the printable characters are shown here, but you get the idea):
ChangedFielda.I..a.V....e:double..e http://www.w3.org/2001/XMLSchema.s.....a.
But if I serialise an instance of this type like so…
var ser = new DataContractSerializer(typeof(ChangedField));
var stream = new MemoryStream();
ser.WriteObject(stream, new ChangedField { FieldId = 1, Value = 1.23d });
…then the stream contains SOAP XML resembling this:
<ChangedField>
<I>1</I>
<V i:type="a:double" xmlns:a="http://www.w3.org/2001/XMLSchema">1.23</V>
</ChangedField>
So my question is: how can I control the DataContractSerializer to produce this binary representation in my own code?
As an aside:
As you can see, the message is bloated by the fact that the object property must have its type encoded (hence the URI). I'm going to change this to use a custom binary encoding, because in my scenario the field ID determines the type (in this case, double).
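A minimal sketch of what such a hand-rolled encoding could look like (hypothetical layout, not from the question; it assumes the reader of the stream knows that field ID 1 carries a double):

```csharp
using System.IO;

// Compact encoding: the field ID implies the value type,
// so no type name or namespace URI is written at all.
using var stream = new MemoryStream();
using (var writer = new BinaryWriter(stream))
{
    writer.Write((ushort)1);  // FieldId (2 bytes)
    writer.Write(1.23d);      // Value (8 bytes); type inferred from FieldId
}
// stream.Length == 10, versus the much larger XML/binary-XML forms above.
```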
The TCP binding uses the binary message encoder by default, whereas in your second example you're just serializing the data contract into XML. The binary message encoder essentially provides the data contract serializer with a custom XmlWriter implementation that generates a proprietary binary format instead of XML.
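You can hand the serializer such a writer yourself: `XmlDictionaryWriter.CreateBinaryWriter` produces the .NET Binary XML format that the binary message encoder is built on. A sketch (reusing the `ChangedField` type from the question):

```csharp
using System.IO;
using System.Runtime.Serialization;
using System.Xml;

var ser = new DataContractSerializer(typeof(ChangedField));
using var stream = new MemoryStream();
using (var writer = XmlDictionaryWriter.CreateBinaryWriter(stream))
{
    // WriteObject has an overload accepting an XmlDictionaryWriter,
    // so the serializer emits binary XML instead of text XML.
    ser.WriteObject(writer, new ChangedField { FieldId = 1, Value = 1.23d });
}
// stream now holds the binary representation rather than angle-bracket XML.
```

Note that the bytes won't match the wire capture exactly, since WCF also supplies a shared string dictionary (`XmlBinaryWriterSession`) that further shrinks repeated names, but the format is the same.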
If you want to use this with a different transport (say, HTTP), then you need to create a custom binding and add a BinaryMessageEncodingBindingElement in place of the usual TextMessageEncodingBindingElement.
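For example, a custom binding that pairs binary encoding with HTTP transport might look like this (a sketch; configure the elements as your service requires):

```csharp
using System.ServiceModel.Channels;

// Binding elements are applied in order: message encoding first,
// then the transport at the bottom of the stack.
var binding = new CustomBinding(
    new BinaryMessageEncodingBindingElement(),  // instead of TextMessageEncodingBindingElement
    new HttpTransportBindingElement());
```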