In the pre-.NET world I always assumed that int is faster than byte, since that is how the processor works.
Now it's a matter of habit to use int even when bytes would do, for example when a byte is what is stored in the database.
Question: how does .NET handle the byte type versus int, from the point of view of performance and memory?
Update: Thanks for the input. Unfortunately, nobody has really answered the question of how .NET handles byte vs. int.
If there is no difference in performance, then I like how chills42 put it: "int for arithmetic, bytes for binary", which is what I will continue to do.
Items are stored in files as a sequence of bytes, so if you're worried about disk space you should use bytes. Items are processed by your CPU as 32- or 64-bit integers (depending on your processor), so anything smaller than that will be "upgraded" to a 32- or 64-bit representation for runtime computation.
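To make the storage half of that concrete, here's a minimal sketch (sizeof and Buffer.ByteLength are standard .NET; the array length of 1000 is just illustrative) showing that the same number of items takes a quarter of the space as bytes:

```csharp
using System;

class StorageSizes
{
    static void Main()
    {
        // sizeof on the built-in value types is a compile-time constant.
        Console.WriteLine(sizeof(byte)); // 1
        Console.WriteLine(sizeof(int));  // 4

        // The same 1000 items occupy 1 KB as bytes, 4 KB as ints.
        byte[] asBytes = new byte[1000];
        int[]  asInts  = new int[1000];
        Console.WriteLine(Buffer.ByteLength(asBytes)); // 1000
        Console.WriteLine(Buffer.ByteLength(asInts));  // 4000
    }
}
```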
Performance-wise, an int is faster in almost all cases. The CPU is designed to work efficiently with 32-bit values, and smaller values are more complicated to deal with. To read a single byte, say, the CPU has to read the 32-bit block that contains it and then mask out the upper 24 bits.
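If you want to see whether this shows up in practice, here's a rough Stopwatch sketch; the numbers depend heavily on your CPU, the JIT, and the build configuration, so treat it as illustrative only (BenchmarkDotNet is the rigorous option), and the loop bound is arbitrary:

```csharp
using System;
using System.Diagnostics;

class ByteVsIntLoop
{
    const int N = 100_000_000;

    static void Main()
    {
        // Sum int values directly.
        var sw = Stopwatch.StartNew();
        long intSum = 0;
        for (int i = 0; i < N; i++) intSum += i & 0xFF;
        sw.Stop();
        Console.WriteLine($"int loop:  {sw.ElapsedMilliseconds} ms ({intSum})");

        // Same work routed through a byte local; each add first widens
        // the byte back to a 32-bit value, as described above.
        sw.Restart();
        long byteSum = 0;
        for (int i = 0; i < N; i++)
        {
            byte b = (byte)(i & 0xFF);
            byteSum += b;
        }
        sw.Stop();
        Console.WriteLine($"byte loop: {sw.ElapsedMilliseconds} ms ({byteSum})");
    }
}
```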
The byte datatype requires very little memory (only 1 byte) and can be used in place of int when we are sure the range will be small. Note that in C# byte is unsigned, with a range of 0 to 255; the signed counterpart, sbyte, runs from -128 to 127. The compiler automatically promotes byte operands to type int whenever they are used in an arithmetic expression, so the result must be cast back to byte explicitly.
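A short illustration of that promotion rule (the commented-out line shows the actual compiler error you get):

```csharp
byte a = 100;
byte b = 200;

// byte sum = a + b;       // error CS0266: cannot implicitly convert
//                         // type 'int' to 'byte' -- the addition yields int

byte sum = (byte)(a + b);  // 300 wraps to 44 after the cast
int  wide = a + b;         // 300, no cast needed
Console.WriteLine($"{sum} {wide}");
```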
In C#, the Byte struct represents an 8-bit unsigned integer. Byte is an immutable value type with a range from 0 to 255. You can perform mathematical and bitwise operations on byte values, such as addition, subtraction, multiplication, division, XOR, and AND.
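For example, the bitwise operators are likewise defined on int, so working with bytes means casting the result back down; the values below are arbitrary:

```csharp
byte flags = 0b0101_0011;   // 0x53
byte mask  = 0b0000_1111;   // 0x0F

byte low   = (byte)(flags & mask);  // 0x03 -- keep the low nibble
byte xored = (byte)(flags ^ mask);  // 0x5C -- flip the low nibble
byte high  = (byte)(flags | 0x80);  // 0xD3 -- set the top bit
Console.WriteLine($"{low:X2} {xored:X2} {high:X2}");
```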
Your pre-.NET assumptions were faulty -- there have always been plenty of computer systems around that, while nominally "byte-addressable", could only set a single byte by reading the full word containing it, masking and altering the one byte, and writing the whole word back -- slower than just setting a full word. It depends on the internals of how the processor and memory are connected, not on the programmer-visible architecture.
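Purely as an illustration of that read/mask/write sequence (this is a sketch of what such hardware does internally, written out in C#; SetByte is a hypothetical helper, not a .NET API):

```csharp
// word  : the full 32-bit memory word as read from RAM
// k     : which byte to replace (0 = least significant)
// value : the new byte to store
static uint SetByte(uint word, int k, byte value)
{
    int shift = 8 * k;
    uint cleared = word & ~(0xFFu << shift);  // mask out the old byte
    return cleared | ((uint)value << shift);  // merge in the new one
}
```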
Whether in .NET or native code, focus first on using data types that are semantically correct for your application, not on trying to second-guess the system architecture -- "Premature optimization is the root of all evil in programming", to quote Knuth quoting Hoare.