I am making heavy use of byte arrays to transfer objects and primitive data over the network and back. I adapted Java's approach by having a type implement ISerializable, an interface with two methods: ReadObjectData and WriteObjectData. Any class implementing this interface writes its data into the byte array. Something like this:
```csharp
class SerializationType : ISerializable
{
    void ReadObjectData(/* Type that manages the writes/reads into the byte array */) { }
    void WriteObjectData(/* Type that manages the writes/reads into the byte array */) { }
}
```
After the write is complete for all objects, I send the array over the network.
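To make the approach above concrete, here is a minimal sketch of what the interface and the send-side buffer fill might look like. This is an illustration only: the field names, the use of BinaryWriter/BinaryReader as the "type that manages the writes/reads", and the Serialize helper are all assumptions, not part of the original code.

```csharp
using System.IO;

// Hypothetical custom interface as described above (distinct from
// System.Runtime.Serialization.ISerializable).
interface ISerializable
{
    void ReadObjectData(BinaryReader reader);
    void WriteObjectData(BinaryWriter writer);
}

class SerializationType : ISerializable
{
    public int Id;
    public string Name;

    public void WriteObjectData(BinaryWriter writer)
    {
        writer.Write(Id);
        writer.Write(Name);
    }

    public void ReadObjectData(BinaryReader reader)
    {
        Id = reader.ReadInt32();
        Name = reader.ReadString();
    }
}

static class Network
{
    // Produce the byte array that would then be sent over the network.
    public static byte[] Serialize(ISerializable obj)
    {
        using (var stream = new MemoryStream())
        using (var writer = new BinaryWriter(stream))
        {
            obj.WriteObjectData(writer);
            return stream.ToArray();
        }
    }
}
```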
This is actually a two-fold question. Is this the right way to send data over the network for maximum efficiency (in terms of speed and size)?
Would you use this approach to write objects to a file, as opposed to using the typical XML serialization?
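For reference, the "typical XML serialization" alternative would look roughly like this. A minimal sketch only: the Person type and the person.xml file name are made up for illustration.

```csharp
using System.IO;
using System.Xml.Serialization;

// Hypothetical type; XmlSerializer requires public members and a
// parameterless constructor.
public class Person
{
    public string Name;
    public int Age;
}

class Program
{
    static void Main()
    {
        var serializer = new XmlSerializer(typeof(Person));
        using (var writer = new StreamWriter("person.xml"))
        {
            serializer.Serialize(writer, new Person { Name = "Alice", Age = 30 });
        }
    }
}
```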
Edit #1
Joel Coehoorn mentioned BinaryFormatter. I have never used this class. Would you elaborate and provide a good example, references, recommendations, and current practices, in addition to what I currently see on MSDN?
This should be fine, but you're doing work that is already done for you. Look at the System.Runtime.Serialization.Formatters.Binary.BinaryFormatter class. Rather than needing to implement your own ReadObjectData/WriteObjectData methods for each specific type, you can use this class, which can already handle almost any object. It essentially takes an exact copy of the memory representation of almost any .NET object and writes it to or reads it from a stream:
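For example, something like the following round-trips an object through a byte array. The Person type here is a hypothetical example; the only requirement is the [Serializable] attribute.

```csharp
using System;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

// No custom read/write methods needed; the attribute is enough.
[Serializable]
class Person
{
    public string Name;
    public int Age;
}

class Program
{
    static void Main()
    {
        var formatter = new BinaryFormatter();
        var original = new Person { Name = "Alice", Age = 30 };

        // Serialize to a byte array via a MemoryStream.
        byte[] buffer;
        using (var stream = new MemoryStream())
        {
            formatter.Serialize(stream, original);
            buffer = stream.ToArray();
        }

        // Deserialize from the byte array.
        using (var stream = new MemoryStream(buffer))
        {
            var copy = (Person)formatter.Deserialize(stream);
            Console.WriteLine(copy.Name); // prints "Alice"
        }
    }
}
```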
Make sure you read through the linked documents: there can be issues with Unicode strings, and an exact memory representation isn't always appropriate (things like open Sockets, for example).