In the example below I am interning the string in the constructor, which is fine. However, when I deserialize the object with the BinaryFormatter, I don't think the string will be interned, since the constructor won't be called. How should I ensure the _name string is interned? Or will it be interned anyway?
Edit: It seems to work (the strings are interned correctly) even without handling deserialization via the OnDeserialized callback below. How does it do that?
I'm checking with a memory profiler, and with or without the method below the strings are still interned. Magic? :-/
[OnDeserialized]
private void OnDeserialized(StreamingContext context)
{
    // Re-intern after deserialization, since the constructor is bypassed
    _name = string.Intern(_name);
}
Thanks
[Serializable]
class City
{
    private readonly string _name;

    public City(string t)
    {
        _name = string.Intern(t);
    }

    public string Name
    {
        get { return _name; }
    }

    public override string ToString()
    {
        return _name;
    }
}
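For what it's worth, one way to check whether the deserialized string really is interned is a reference comparison after a round-trip. This is just a minimal sketch (not from the question itself): since "London" is interned up front by the constructor and by the literal, `string.Intern` returns the pooled instance, so the `ReferenceEquals` check is only true if the deserialized copy points at that same pooled string. Note that BinaryFormatter is obsolete and disabled by default in .NET 5+, so this assumes .NET Framework or an older runtime.

```csharp
using System;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

class InternCheck
{
    static void Main()
    {
        using (var ms = new MemoryStream())
        {
            var formatter = new BinaryFormatter();
            formatter.Serialize(ms, new City("London"));
            ms.Position = 0;

            var copy = (City)formatter.Deserialize(ms);

            // True only if copy.Name is itself the instance in the intern pool;
            // false if deserialization produced a fresh, un-interned string.
            bool interned = ReferenceEquals(copy.Name, string.Intern(copy.Name));
            Console.WriteLine(interned);
        }
    }
}
```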
This is possible if you implement the ISerializable interface (the interface, not the attribute-based callbacks). It lets you take control of the deserialization yourself.
But it seems very unnecessary. Are you sure you are accomplishing anything with this?
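If you did want to go the ISerializable route, it would look roughly like this sketch (the "name" key is an arbitrary label I chose, not anything mandated): the formatter calls the special serialization constructor instead of the public one, so the interning has to be repeated there.

```csharp
using System;
using System.Runtime.Serialization;

[Serializable]
class City : ISerializable
{
    private readonly string _name;

    public City(string t)
    {
        _name = string.Intern(t);
    }

    // Called by the formatter during deserialization in place of the
    // public constructor, so we must intern here as well.
    protected City(SerializationInfo info, StreamingContext context)
    {
        _name = string.Intern(info.GetString("name"));
    }

    // Called by the formatter during serialization.
    public void GetObjectData(SerializationInfo info, StreamingContext context)
    {
        info.AddValue("name", _name);
    }

    public string Name
    {
        get { return _name; }
    }
}
```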