Is the size of an array constrained by the upper limit of int (2147483647)?

I'm doing some Project Euler exercises and I've run into a scenario where I want arrays larger than 2,147,483,647 elements (the upper limit of int in C#).

Sure these are large arrays, but for instance, I can't do this

// fails
bool[] BigArray = new bool[2147483648];

// also fails, cannot convert uint to int
ArrayList BigArrayList = new ArrayList(2147483648);

So, can I have bigger arrays?

EDIT: It was for a Sieve of Atkin, you know, so I just wanted a really big one :D

asked Feb 21 '09 by Ian G

2 Answers

Any time you are working with an array this big, you should probably try to find a better solution to the problem. That said, I'll still attempt to answer your question.

As mentioned in this article, there is a 2GB limit on any single object in .NET, on x86, x64 and IA64 alike.

As with 32-bit Windows operating systems, there is a 2GB limit on the size of an object you can create while running a 64-bit managed application on a 64-bit Windows operating system.

Also, if you allocate an array on the stack (with stackalloc) and make it too big, you will get a stack overflow. A normal array is allocated on the heap in one big contiguous block, so a huge allocation can fail even when enough total memory is free, simply because no single contiguous region is large enough. It may be better to use an ArrayList, which grows dynamically on the heap. This will not let you get past the 2GB limit, but will probably let you get closer to it.

On an x64 or IA64 architecture and operating system you get a 64-bit address space instead of a 32-bit one, so far more memory is allocatable overall, but the 2GB per-object limit still applies.

If you are not able to allocate the whole thing at once, you can probably allocate it in parts.

Using an ArrayList and adding one object at a time on an x64 Windows 2008 machine with 6GB of RAM, the largest I can grow the ArrayList to is 134,217,728 elements. So I really think you have to find a better solution to your problem that uses less memory, perhaps writing to a file instead of holding everything in RAM.
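One concrete way to use less memory, sketched below on the assumption that the array of bools is for a prime sieve: BitArray stores one bit per flag instead of one byte per bool, so a sieve over ~2 billion numbers needs roughly 256MB instead of ~2GB. Its length is still capped at int.MaxValue bits. (Plain Eratosthenes is used here as a stand-in for the question's Sieve of Atkin.)

```csharp
using System;
using System.Collections;

public class SieveSketch
{
    public static int CountPrimesBelow(int n)
    {
        var isComposite = new BitArray(n); // all bits start false

        // Mark multiples of each prime, starting from p*p
        for (int p = 2; (long)p * p < n; p++)
        {
            if (isComposite[p]) continue;
            for (long q = (long)p * p; q < n; q += p)
                isComposite[(int)q] = true;
        }

        int count = 0;
        for (int i = 2; i < n; i++)
            if (!isComposite[i]) count++;
        return count;
    }

    public static void Main()
    {
        Console.WriteLine(CountPrimesBelow(100)); // 25
    }
}
```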

answered Oct 12 '22 by Brian R. Bondy


The array limit is, as far as I know, fixed at int32 even on 64-bit: there is a cap on the maximum size of a single object. However, you could have a nice big jagged array quite easily.

Worse: because references are larger on x64, for reference-type arrays you actually get fewer elements in a single array.
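The jagged-array approach can be sketched like this (names and the block size are my own, not from the answer): present one logical index space larger than int.MaxValue by splitting it across the inner arrays of a jagged array, so each inner array stays safely under the per-object cap.

```csharp
using System;

public class JaggedDemo
{
    public const int BlockSize = 1 << 20; // 1M elements per block (arbitrary choice)

    public static bool Get(bool[][] blocks, long index)
    {
        return blocks[(int)(index / BlockSize)][(int)(index % BlockSize)];
    }

    public static void Set(bool[][] blocks, long index, bool value)
    {
        blocks[(int)(index / BlockSize)][(int)(index % BlockSize)] = value;
    }

    public static bool[][] Allocate(long logicalLength)
    {
        // Round up to a whole number of blocks
        int blockCount = (int)((logicalLength + BlockSize - 1) / BlockSize);
        var blocks = new bool[blockCount][];
        for (int i = 0; i < blockCount; i++)
            blocks[i] = new bool[BlockSize];
        return blocks;
    }

    public static void Main()
    {
        // Small here for the demo; with enough RAM the same code can
        // address more than 2^31 logical elements via the long index.
        var blocks = Allocate(3L * BlockSize);
        Set(blocks, 2_500_000, true);
        Console.WriteLine(Get(blocks, 2_500_000)); // True
    }
}
```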

See here:

I’ve received a number of queries as to why the 64-bit version of the 2.0 .Net runtime still has array maximum sizes limited to 2GB. Given that it seems to be a hot topic of late I figured a little background and a discussion of the options to get around this limitation was in order.

First some background; in the 2.0 version of the .Net runtime (CLR) we made a conscious design decision to keep the maximum object size allowed in the GC Heap at 2GB, even on the 64-bit version of the runtime. This is the same as the current 1.1 implementation of the 32-bit CLR, however you would be hard pressed to actually manage to allocate a 2GB object on the 32-bit CLR because the virtual address space is simply too fragmented to realistically find a 2GB hole. Generally people aren’t particularly concerned with creating types that would be >2GB when instantiated (or anywhere close), however since arrays are just a special kind of managed type which are created within the managed heap they also suffer from this limitation.


It should be noted that in .NET 4.5 the 2GB memory-size limit can optionally be removed via the gcAllowVeryLargeObjects flag; however, this doesn't change the maximum number of elements per dimension. The key point is that if you have arrays of a custom value type, or multi-dimensional arrays, you can now go beyond 2GB in memory size.
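For reference, that opt-in is set in the application's configuration file (64-bit process required); this is the documented mechanism for .NET Framework 4.5 and later:

```xml
<!-- app.config: allow objects larger than 2GB on 64-bit .NET 4.5+.
     The element count per array dimension is still capped near 2^31. -->
<configuration>
  <runtime>
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>
```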

answered Oct 13 '22 by Marc Gravell