I am manually writing a large data block into a binary file with System.IO.BinaryWriter. I chose this because it outperforms a wide variety of other serialization/deserialization approaches (I am currently deserializing with System.IO.BinaryReader).
I may need to consume the serialized format from other programming languages such as Java and/or Rust. Would they be able to understand the raw binary written by System.IO.BinaryWriter and read it in a manner similar to .NET's System.IO.BinaryReader?
(I am assuming that the new platforms (Java/Rust) will have implicit knowledge of the specific order in which the raw binary was written.)
I am aware that Protocol Buffers is meant to be a performant, language-agnostic framework for serializing/deserializing in this scenario, but: (1) I am using F# and it struggles with discriminated unions; (2) it wasn't really that much effort to write my own custom serializer, as my types aren't too complex.
It depends on the types you write with the BinaryWriter.
- byte, sbyte and byte[]: no problem.
- (U)IntXX: a matter of endianness. The .NET BinaryWriter dumps these types in little-endian format.
- float and double: if both systems use the same IEEE 754 standard and the same endianness, then it is no problem.
- decimal: this is a .NET-specific type, similar to Currency but using a different format. Use carefully.
- char and char[]: uses the current Encoding of the BinaryWriter. Use the same encoding on both sides and everything is alright.
- string: the length of the string is encoded in the so-called 7-bit-encoded int format (1 byte for lengths up to 127, etc.) and the characters use the current encoding. To keep things compatible, you may prefer to dump character arrays with manually written length information.
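To illustrate the last two points, here is a minimal Java sketch (helper names are my own, not a standard API) that decodes an Int32 and a string exactly as BinaryWriter lays them out, assuming the writer was created with a UTF-8 encoding:

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

public class BinaryReaderCompat {
    // Read a 4-byte little-endian int. Java's DataInputStream is
    // big-endian, so we assemble the bytes manually.
    static int readInt32LE(DataInputStream in) throws IOException {
        int b0 = in.readUnsignedByte();
        int b1 = in.readUnsignedByte();
        int b2 = in.readUnsignedByte();
        int b3 = in.readUnsignedByte();
        return b0 | (b1 << 8) | (b2 << 16) | (b3 << 24);
    }

    // Decode BinaryWriter's 7-bit-encoded int: 7 payload bits per byte,
    // least significant group first; the high bit of each byte signals
    // that more bytes follow.
    static int read7BitEncodedInt(DataInputStream in) throws IOException {
        int value = 0, shift = 0, b;
        do {
            b = in.readUnsignedByte();
            value |= (b & 0x7F) << shift;
            shift += 7;
        } while ((b & 0x80) != 0);
        return value;
    }

    // A BinaryWriter string is a 7-bit-encoded byte count followed by
    // that many encoded bytes (UTF-8 assumed here).
    static String readDotNetString(DataInputStream in) throws IOException {
        int byteLen = read7BitEncodedInt(in);
        byte[] buf = new byte[byteLen];
        in.readFully(buf);
        return new String(buf, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws IOException {
        // Bytes as BinaryWriter would emit for Write(258) then Write("hi"):
        // 258 = 0x0102 -> 02 01 00 00 little-endian; "hi" -> length 2, bytes.
        byte[] data = { 0x02, 0x01, 0x00, 0x00, 0x02, 'h', 'i' };
        DataInputStream in = new DataInputStream(new ByteArrayInputStream(data));
        System.out.println(readInt32LE(in));      // prints 258
        System.out.println(readDotNetString(in)); // prints hi
    }
}
```

A Rust reader would follow the same byte layout; the widely used byteorder crate handles the little-endian integer part directly.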