Does using small datatypes (for example short instead of int) reduce memory usage?

My question is basically about how the C# compiler handles memory allocation of small datatypes. I do know that, for example, operators like add are defined on int and not on short, and thus computations will be executed as if the shorts were ints.

Assuming the following:

  • There's no business logic/validation logic associated with the choice of short as a datatype
  • We're not doing anything with unsafe code

Does using the short datatype wherever possible reduce the memory footprint of my application, and is it advisable to do so? Or is using short and the like not worth the effort, because the compiler allocates the full memory amount of an Int32 anyway and adds additional casts when doing arithmetic?
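For example, here is a small snippet (to be placed inside a method body) showing the cast the compiler forces after arithmetic on shorts, since the addition itself is evaluated on ints:

    short a = 1, b = 2;
    // short c = a + b;       // compile error: a + b is evaluated as int
    short c = (short)(a + b); // an explicit cast back to short is required
    int d = a + b;            // fine: the addition already produces an int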

Any links on the supposed runtime performance impact would be greatly appreciated.

Related questions:

Why should I use int instead of a byte or short in C#

Integer summing blues, short += short problem

asked Jun 01 '11 by thekip

People also ask

Should I use short instead of int?

Conclusion: Use int unless conserving memory is critical, or your program uses a lot of memory (e.g. many arrays). In that case, use short.

Is short more efficient than int?

In most cases using int in a loop is more efficient than using short.

Which of the following data types are used to optimize memory usage in certain cases?

Using int8 or uint8 can reduce an array memory consumption to one-eighth of its original size. One of the most common applications of uint8 is image data.

Is Short smaller than int?

The short datatype has a larger range than byte but a smaller range than int, and it likewise requires more memory than byte but less memory than int. The compiler automatically promotes short variables to type int when they are used in an expression and the value exceeds their range.


2 Answers

From a memory-only perspective, using short instead of int will be better. The simple reason is that a short variable needs only half the size of an int variable in memory. The CLR does not expand short to int in memory.

Nevertheless, this reduced memory consumption might, and probably will, decrease the runtime performance of your application significantly. Modern CPUs generally perform much better with 32-bit numbers than with 16-bit numbers. Additionally, in many cases the CLR will have to convert between short and int, e.g. when calling methods that take int arguments. There are many other performance considerations to weigh before going this way.

I would only change this in very specific locations and modules of your application, and only if you really encounter measurable memory shortages.

In some cases you can of course switch from int to short easily without hurting performance. One example is a giant array of ints whose values all fit into shorts.
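If you want to verify the memory side yourself, a rough sketch like the following (array length and class name are only illustrative; GC.GetTotalMemory returns approximate numbers and includes the array headers) shows the short[] taking roughly half the space of the int[]:

    using System;

    class MemoryDemo
    {
        static void Main()
        {
            long baseline = GC.GetTotalMemory(true);

            int[] ints = new int[10000000];        // 4 bytes per element, ~40 MB of data
            long afterInts = GC.GetTotalMemory(true);

            short[] shorts = new short[10000000];  // 2 bytes per element, ~20 MB of data
            long afterShorts = GC.GetTotalMemory(true);

            Console.WriteLine("int[]   : {0} bytes", afterInts - baseline);
            Console.WriteLine("short[] : {0} bytes", afterShorts - afterInts);

            // Keep both arrays alive so the forced collections don't reclaim them mid-measurement.
            GC.KeepAlive(ints);
            GC.KeepAlive(shorts);
        }
    }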

answered Oct 11 '22 by Florian Greinacher


It makes sense in terms of memory usage only if your program has very large arrays (or collections built on arrays, like List<>) of these types, or arrays of packed structs composed of the same. By 'large' I mean that the total memory footprint of these arrays is a large percentage of the working set and a large percentage of the available memory. As for advisability, I'd venture that it is inadvisable to use short types unless the data your program operates on is explicitly specified in terms of short etc., or the volume of data runs into gigabytes.
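To illustrate the packed-struct case, here is a minimal sketch (the struct names are hypothetical; Marshal.SizeOf reports the unmanaged size, which for simple blittable fields like these matches the per-element cost in an array):

    using System;
    using System.Runtime.InteropServices;

    // Hypothetical element types for a large array; only the field widths matter here.
    struct WidePoint   { public int X;   public int Y;   }  // 8 bytes per element
    struct NarrowPoint { public short X; public short Y; }  // 4 bytes per element

    class SizeDemo
    {
        static void Main()
        {
            Console.WriteLine(Marshal.SizeOf(typeof(WidePoint)));   // prints 8
            Console.WriteLine(Marshal.SizeOf(typeof(NarrowPoint))); // prints 4
        }
    }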

answered Oct 11 '22 by Anton Tykhyy