Delphi 2009 changed its string type to use 2 bytes per character, which allows support for Unicode character sets. Now the in-memory byte size of a string is Length(s) * SizeOf(Char), with SizeOf(Char) currently being 2.
What I am interested in is whether there is a way, on a character-by-character basis, to find out whether a character would fit in a single byte, i.e. whether it is ASCII or requires Unicode.
What I primarily want to know is how many bytes a string will use up before it goes to a database (Oracle, Documentum).
We need to be able to enforce limits beforehand, and ideally (as we have a large installed base) without having to change the database. If a database field allows 12 bytes, then in Delphi 2009 a string of length 7 always reports 14 bytes in memory, even though once it reaches the database it would use only 7 bytes if pure ASCII, 14 bytes if entirely double-byte characters, or somewhere in between for a mixture.
You could check the ordinal value of the character:

if Ord(c) < 128 then
  // c is a plain ASCII character and fits in a single byte
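If the database column stores UTF-8, you can also get the exact byte count of a whole string without classifying characters yourself, using the TEncoding class introduced in Delphi 2009. This is a sketch, not a drop-in solution: the function name is hypothetical, and TEncoding.UTF8 is an assumption — substitute the encoding your database column actually uses.

uses
  SysUtils;

// Hypothetical helper: returns how many bytes S would occupy in the
// database, assuming the column is stored as UTF-8.
function DbByteLength(const S: string): Integer;
begin
  // GetByteCount measures S in the chosen encoding; swap TEncoding.UTF8
  // for another TEncoding if your database character set differs.
  Result := TEncoding.UTF8.GetByteCount(S);
end;

You could then enforce a 12-byte limit before the insert with a check such as: if DbByteLength(Value) > 12 then raise an error or truncate.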