In C# 2008, what is the Maximum Size that an Array can hold?
System.Int32.MaxValue
Assuming you mean System.Array, i.e. any normally defined array (int[], etc.), this is the maximum number of values the array can hold. The size of each value is limited only by the amount of memory or virtual memory available to hold them.
This limit is enforced because System.Array uses an Int32 as its indexer, so only valid Int32 values can be used. On top of this, only non-negative values (i.e. >= 0) may be used. This means the absolute upper bound on the size of an array is the maximum value of an Int32, which is available as Int32.MaxValue and is equal to 2^31 - 1, or roughly 2 billion.
On a completely different note, if you're worrying about this, it's likely you're using a lot of data, either correctly or incorrectly. In that case, I'd look into using a List<T> instead of an array, so that you only use as much memory as needed. In fact, I'd recommend using a List<T> or another of the generic collection types all the time. That way only as much memory as you actually need is allocated, but you can use it like you would a normal array.
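For example, a minimal sketch of using a List<int> where you might otherwise reach for an array (the values are just placeholders):
List<int> numbers = new List<int>();     // grows only as needed
numbers.Add(42);
numbers.Add(7);
Console.WriteLine(numbers[0]);           // indexed access, just like an array
Console.WriteLine(numbers.Count);        // number of elements currently stored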
The other collection of note is Dictionary<int, T> which you can use like a normal array too, but will only be populated sparsely. For instance, in the following code, only one element will be created, instead of the 1000 that an array would create:
Dictionary<int, string> foo = new Dictionary<int, string>();
foo[1000] = "Hello world!";
Console.WriteLine(foo[1000]);
Using a Dictionary also lets you control the type of the indexer, and allows you to use negative values. For the largest possible sparse array you could use a Dictionary<ulong, T>, which will provide more potential elements than you could possibly think about.
Per MSDN it is
By default, the maximum size of an Array is 2 gigabytes (GB).
In a 64-bit environment, you can avoid the size restriction by setting the enabled attribute of the gcAllowVeryLargeObjects configuration element to true in the run-time environment.
However, the array will still be limited to a total of 4 billion elements.
Refer here: http://msdn.microsoft.com/en-us/library/System.Array(v=vs.110).aspx
Note: here I am focusing on the actual length of the array, assuming that we have enough physical RAM.
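For reference, the configuration element mentioned above is enabled in the application's app.config like this (it only has an effect in a 64-bit process):
<configuration>
  <runtime>
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>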
This answer is about .NET 4.5
According to MSDN, the index for an array of bytes cannot be greater than 2147483591. For .NET prior to 4.5 this was also the memory limit for an array. In .NET 4.5 this maximum is the same for byte arrays, but for other element types it can be up to 2146435071.
This is the code for illustration:
// Requires: using System; using System.Runtime.InteropServices;
static void Main(string[] args)
{
    // -----------------------------------------------
    // Pre .NET 4.5 or gcAllowVeryLargeObjects unset
    const int twoGig = 2147483591;      // magic number from .NET
    var type = typeof(int);             // type to use
    var size = Marshal.SizeOf(type);    // type size
    var num = twoGig / size;            // max element count
    var arr20 = Array.CreateInstance(type, num);
    var arr21 = new byte[num];

    // -----------------------------------------------
    // .NET 4.5 with x64 and gcAllowVeryLargeObjects set
    var arr451 = new byte[2147483591];
    var arr452 = Array.CreateInstance(typeof(int), 2146435071);
    var arr453 = new byte[2146435071];  // another magic number
    return;
}
Here is an answer to your question that goes into detail:
http://www.velocityreviews.com/forums/t372598-maximum-size-of-byte-array.html
You may want to mention which version of .NET you are using and your memory size.
You will still be stuck with a 2 GB limit for your application though, so it depends on what is in your array.
I think it is tied to your RAM (or rather virtual memory) space, with the absolute maximum constrained by your OS version (e.g. 32-bit or 64-bit).
I think that if you don't consider virtual memory, it is Int32.MaxValue.
Related
I want to make an array of 10^9 elements, where each element is an integer. I always get an OutOfMemoryException at the initialization line. How can I achieve this?
If this is not possible, please suggest alternative strategies.
Arrays are limited to 2 GB in .NET 4.0 or earlier, even in a 64-bit process. So with one billion elements, the maximum supported element size is two bytes, but an int is four bytes. So this will not work.
If you want to have a larger collection, you need to write it yourself, backed by multiple arrays.
In .NET 4.5 it's possible to avoid this limitation; see Jon Skeet's answer for details.
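A minimal sketch of such a structure backed by multiple arrays (the name BigArray<T> and the chunk size are just illustrative choices, not an existing class):
using System;

public class BigArray<T>
{
    private const int ChunkSize = 1 << 20;     // 1,048,576 elements per chunk
    private readonly T[][] _chunks;

    public BigArray(long length)
    {
        Length = length;
        _chunks = new T[(length + ChunkSize - 1) / ChunkSize][];
        for (long i = 0; i < _chunks.Length; i++)
            _chunks[i] = new T[Math.Min(ChunkSize, length - i * ChunkSize)];
    }

    public long Length { get; private set; }

    public T this[long index]
    {
        get { return _chunks[index / ChunkSize][index % ChunkSize]; }
        set { _chunks[index / ChunkSize][index % ChunkSize] = value; }
    }
}
Each chunk stays well under the 2 GB object limit, so a billion ints simply becomes roughly a thousand 4 MB arrays.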
Assuming you mean int as the element type, you can do this using .NET 4.5, if you're using a 64-bit CLR.
You need to use the <gcAllowVeryLargeObjects> configuration setting. This is not on by default.
If you're using an older CLR or you're on a 32-bit machine, you're out of luck. Of course, if you're on a 64-bit machine but just an old version of the CLR, you could encapsulate your "one large array" into a separate object which has a list of smaller arrays... you could even implement IList<int> that way so that most code wouldn't need to know that you weren't really using a single array.
(As noted in comments, you'll still only be able to create an array with 2^31 elements; but your requirement of 10^9 is well within this.)
I think you shouldn't load all of this data into memory. Store it in a file, and make a class that works like an array but actually reads and writes its data from that file.
This is the general idea (a bare-bones sketch that stores each int as 4 bytes; in practice you'd still want buffering, bounds checks and error handling):
using System; using System.IO;
public class FileArray
{
    readonly Stream s;
    public FileArray(Stream stream) { s = stream; }
    public int this[int index]
    {
        get
        {
            s.Position = index * 4L;                     // 4 bytes per int
            var buffer = new byte[4];
            s.Read(buffer, 0, 4);
            return BitConverter.ToInt32(buffer, 0);
        }
        set { s.Position = index * 4L; s.Write(BitConverter.GetBytes(value), 0, 4); }
    }
}
That way you would have something that works just like an array, but the data would be stored on your hard drive.
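A short usage sketch of the class above (the file name data.bin is just a placeholder):
using (var fs = new FileStream("data.bin", FileMode.OpenOrCreate))
{
    var arr = new FileArray(fs);
    arr[123456] = 42;
    Console.WriteLine(arr[123456]);   // 42, read back from disk rather than from RAM
}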
I'm trying to optimize some code where I have a large number of arrays containing structs of different sizes, but based on the same interface. In certain cases the structs are larger and hold more data, other times they are small structs, and other times I would prefer to keep null as a value to save memory.
My first question is: is it a good idea to do something like this? I previously had an array of my full data struct, but when I tested mixing the sizes up it looked like I could save a lot of memory. Are there any other downsides?
I've been trying out different things, and it seems to work quite well when making an array of a common interface, but I'm not sure I'm checking the size of the array correctly.
I've simplified the example quite a bit. Here I'm adding different values to an array, but I'm unable to determine the size using the traditional Marshal.SizeOf method. Would it be correct to simply iterate through the collection and sum the size of each value in the collection?
IComparable[] myCollection = new IComparable[1000];
myCollection[0] = null;
myCollection[1] = (int)1;
myCollection[2] = "helloo world";
myCollection[3] = long.MaxValue;
System.Runtime.InteropServices.Marshal.SizeOf(myCollection);
The last line will throw this exception:
Type 'System.IComparable[]' cannot be marshaled as an unmanaged structure; no meaningful size or offset can be computed.
Excuse the long post:
Is this an optimal and usable solution?
How can I determine the size of my array?
I may be wrong, but it looks to me like your IComparable[] array is a managed array. If so, then you can use this code to get the length:
int arrayLength = myCollection.Length;
If you are doing platform interop between C# and C++, then the answer to your question headline "Can I find the length of an unmanaged array" is no, it's not possible. Function signatures with arrays in C/C++ tend to follow this pattern:
void doSomeWorkOnArrayUnmanaged(int * myUnmanagedArray, int length)
{
// Do work ...
}
In .NET the array itself is an object that carries some basic information, such as its length and its runtime type. Therefore we can use this:
void DoSomeWorkOnManagedArray(int [] myManagedArray)
{
int length = myManagedArray.Length;
// Do work ...
}
Whenever using platform invoke to interop between C# and C++ you will need to pass the length of the array to the receiving function, as well as pin the array (but that's a different topic).
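On the managed side, the call to the C/C++ function above might look like the following sketch (the DLL name NativeLib.dll and the calling convention are assumptions; the entry point matches the example signature):
using System;
using System.Runtime.InteropServices;

class NativeInterop
{
    // Maps to: void doSomeWorkOnArrayUnmanaged(int* myUnmanagedArray, int length);
    [DllImport("NativeLib.dll", CallingConvention = CallingConvention.Cdecl)]
    private static extern void doSomeWorkOnArrayUnmanaged(int[] myUnmanagedArray, int length);

    static void Process(int[] data)
    {
        // The length must be passed explicitly, because the native side cannot infer it;
        // the marshaller pins the int[] for the duration of the call.
        doSomeWorkOnArrayUnmanaged(data, data.Length);
    }
}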
Does this answer your question? If not, please clarify.
Optimality always depends on your requirements. If you really need to store many elements of different classes/structs, your solution is completely viable.
However, I suspect your expectations of the data structure might be misleading you: array elements are by definition all of the same size. This is even true in your case: your array doesn't store the elements themselves but references (pointers) to them. The elements are allocated somewhere on the managed heap. So your data structure actually looks like this: it is an array of 1000 references, each pointing to some data. The size of each particular element may of course vary.
This leads to the next question: the size of your array. What are you intending to do with the size? Do you need to know how many bytes to allocate when you serialize your data to some persistent storage? That depends on the serialization format. Or do you just need a rough estimate of how much memory your structure is consuming?

In the latter case you need to consider the array itself and the size of each particular element. The array in your example consumes approximately 1000 times the size of a reference (4 bytes on a 32-bit machine and 8 bytes on a 64-bit machine). To estimate the sizes of the elements, you can indeed iterate over the array and sum up the size of each particular element. Be aware that this is only an estimate: the runtime adds some memory-management overhead which is hard to determine exactly.
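A rough sketch of such an estimate, for the kinds of values in the example above (the per-string overhead constant is only a ballpark figure, and Marshal.SizeOf only covers the simple boxed value types used here):
static long EstimateBytes(IComparable[] items)
{
    long total = (long)items.Length * IntPtr.Size;       // the reference slots themselves
    foreach (var item in items)
    {
        if (item == null) continue;
        var s = item as string;
        if (s != null)
            total += 2 * s.Length + 26;                  // very rough string size estimate
        else
            total += System.Runtime.InteropServices.Marshal.SizeOf(item.GetType());  // 4 for a boxed int, 8 for a boxed long
    }
    return total;
}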
Anybody know what the max number of items in a List is?
How do I increase that size? Or is there a collection that takes infinite items? (as much as would fit in memory, that is)
EDIT:
I get an out-of-memory exception when Count = 134217728 in a list of ints. I've got 3 GB of RAM, of which 2.2 GB are in use. Does that sound normal?
List<T> will be limited to the max of an array, which is 2GB (even in x64). If that isn't enough, you're using the wrong type of data storage. You can save a lot of overhead by starting it the right size, though - by passing an int to the constructor.
Re your edit - with 134217728 x Int32, that is 512MB. Remember that List<T> uses a doubling algorithm; if you are drip-feeding items via Add (without allocating all the space first) it is going to try to double to 1GB (on top of the 512MB you're already holding, the rest of your app, and of course the CLR runtime and libraries). I'm assuming you're on x86, so you already have a 2GB limit per process, and it is likely that you have fragmented your "large object heap" to death while adding items.
Personally, yes, it sounds about right to start getting an out-of-memory at this point.
Edit: in .NET 4.5, arrays larger than 2GB are allowed if the <gcAllowVeryLargeObjects> switch is enabled. The limit then is 2^31 items. This might be useful for arrays of references (8 bytes each in x64), or an array of large structs.
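As a minimal sketch of the pre-sizing suggestion above (134217728 is the element count from the question), so the doubling re-allocations never happen:
var list = new List<int>(134217728);   // capacity is allocated once, up front
for (int i = 0; i < 134217728; i++)
    list.Add(i);                       // no intermediate re-allocations or copies while filling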
The List limit is 2.1 billion objects or the size of your memory, whichever is hit first.
It's limited only by memory.
Edit: or not, 2 GB is the limit! This is quite interesting: BigArray, getting around the 2GB array size limit.
On an x64 machine, using .NET Framework 4 (not the Client Profile), compiling for Any CPU in Release mode, I could chew up all the available memory. My process is now at 5.3 GB and I've consumed all available memory (8 GB) on my PC. It's actually a Server 2008 R2 x64.
I used a custom Collection class based on CollectionBase to store 61,910,847 instances of the following class:
public class AbbreviatedForDrawRecord {
public int DrawId { get; set; }
public long Member_Id { get; set; }
public string FactorySerial { get; set; }
public AbbreviatedForDrawRecord() {
}
public AbbreviatedForDrawRecord(int drawId, long memberId, string factorySerial) {
this.DrawId = drawId;
this.Member_Id = memberId;
this.FactorySerial = factorySerial;
}
}
The List will dynamically grow itself to accommodate as many items as you want to include - until you exceed available memory!
From MSDN documentation:
If Count already equals Capacity, the capacity of the List is increased by automatically reallocating the internal array, and the existing elements are copied to the new array before the new element is added.
If Count is less than Capacity, this method is an O(1) operation. If the capacity needs to be increased to accommodate the new element, this method becomes an O(n) operation, where n is Count.
The interface defines Count and IndexOf etc. as type int, so I would assume that int.MaxValue, or 2,147,483,647, is the most items you could stick in a list.
You really have to question why on earth you would need that many; there is likely a more sensible approach to managing the data.
I want to know what is the initial size of ArrayList in C#?
0. See below.
16. (I have to add characters to this answer, since 18 characters are minimum)
Edit, Oops - the initial capacity is 16. Initial size is of course 0, because it's empty. Have to learn how to read. Or you have to learn how to form your questions. ;)
Edit again: the initial capacity of an ArrayList in .NET 1.0 is 16. In 2.0 it was 4, and now - with .NET 3.5 - the initial capacity has been lowered to 0. I don't have an explanation for why, though.
When adding the first element to the list, the capacity will be set to 4. Thereafter, every time arraylist.Count equals arraylist.Capacity, the capacity will double.
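A quick way to observe this yourself is a sketch like the one below (the exact capacities printed depend on the framework version):
var list = new ArrayList();
Console.WriteLine(list.Capacity);      // 0 on .NET 3.5, 16 on .NET 1.x
for (int i = 0; i < 17; i++)
{
    list.Add(i);
    Console.WriteLine(list.Capacity);  // grows 4, 8, 16, 32, ... doubling when Count reaches Capacity
}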
The ArrayList starts with Size = 0 (because it's empty) and Capacity = 16.
The capacity is doubled as necessary, and capacity-doubling is an O(n) operation, where n is the new capacity. So if you're inserting 5,000 elements into your list, the framework is going to double the ArrayList capacity nine times - and each doubling operation is twice as expensive as the previous one.
In other words - if you know you're going to be putting 5,000 elements in a list, you're much better off explicitly initializing it to hold 5,000 elements.
You can explicitly set the Capacity of an existing ArrayList if you know you're about to insert a large number of elements. You can also decrease Capacity explicitly, but if you set Capacity < Count, you'll get an ArgumentOutOfRangeException.
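A small sketch of both points:
ArrayList list = new ArrayList(5000);  // start with room for 5,000 elements
list.Capacity = 10000;                 // grow the capacity explicitly
list.Add("item");
// list.Capacity = 0;                  // would throw ArgumentOutOfRangeException, since Capacity < Count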
NOTE: The following seemingly only holds true for .NET 3.5; in previous versions of the framework the values were different.
According to my tests here both the initial size and capacity are zero:
PS> $a = new-object system.collections.arrayList
PS> $a.Capacity
0
PS> $a.count
0
Also, looking at the source code in Reflector, the same holds true:
public virtual int Capacity
{
get
{
return this._items.Length;
}
...
}
And _items gets set to an empty object[] in the ctor.
The ArrayList is empty when you have created it, so the size is zero.
Unless you are stuck with framework 1, you should not use the ArrayList class at all. Use the strongly typed generic List<T> class instead.
ArrayList list = new ArrayList();
The size is 0 before adding items to the ArrayList, meaning no items are there.
Try it yourself:
int capacity = (new ArrayList()).Capacity;
should give you the initial capacity.
I'm doing some Project Euler exercises and I've run into a scenario where I want arrays which are larger than 2,147,483,647 (the upper limit of int in C#).
Sure, these are large arrays, but for instance, I can't do this:
// fails
bool[] BigArray = new bool[2147483648];
// also fails, cannot convert uint to int
ArrayList BigArrayList = new ArrayList(2147483648);
So, can I have bigger arrays?
EDIT:
It was for a Sieve of Atkin, you know, so I just wanted a really big one :D
Anytime you are working with an array this big, you should probably try to find a better solution to the problem. But that being said I'll still attempt to answer your question.
As mentioned in this article, there is a 2 GB limit on any object in .NET, on all of x86, x64 and IA64.
As with 32-bit Windows operating systems, there is a 2GB limit on the size of an object you can create while running a 64-bit managed application on a 64-bit Windows operating system.
Also, if you define an array that is too big on the stack, you will have a stack overflow. If you define the array on the heap, it will try to allocate it all in one big contiguous block. It would be better to use an ArrayList, which has implicit dynamic allocation on the heap. This will not let you get past the 2 GB limit, but it will probably allow you to get closer to it.
I think the stack size limit will be bigger only if you are using an x64 or IA64 architecture and operating system. Using x64 or IA64 you will have 64-bit allocatable memory instead of 32-bit.
If you are not able to allocate the array list all at once, you can probably allocate it in parts.
Using an ArrayList and adding one object at a time on an x64 Windows 2008 machine with 6 GB of RAM, the largest size I can get the ArrayList to is 134217728. So I really think you have to find a better solution to your problem that does not use as much memory. Perhaps writing to a file instead of using RAM.
The array length limit is, AFAIK, fixed as Int32 even on 64-bit; there is a cap on the maximum size of a single object. However, you could have a nice big jagged array quite easily.
Worse: because references are larger in x64, for ref-type arrays you actually get fewer elements in a single array.
See here:
I’ve received a number of queries as to why the 64-bit version of the 2.0 .Net runtime still has array maximum sizes limited to 2GB. Given that it seems to be a hot topic of late I figured a little background and a discussion of the options to get around this limitation was in order.

First some background; in the 2.0 version of the .Net runtime (CLR) we made a conscious design decision to keep the maximum object size allowed in the GC Heap at 2GB, even on the 64-bit version of the runtime. This is the same as the current 1.1 implementation of the 32-bit CLR, however you would be hard pressed to actually manage to allocate a 2GB object on the 32-bit CLR because the virtual address space is simply too fragmented to realistically find a 2GB hole. Generally people aren’t particularly concerned with creating types that would be >2GB when instantiated (or anywhere close), however since arrays are just a special kind of managed type which are created within the managed heap they also suffer from this limitation.
It should be noted that in .NET 4.5 the memory size limit is optionally removed by the gcAllowVeryLargeObjects flag, however, this doesn't change the maximum dimension size. The key point is that if you have arrays of a custom type, or multi-dimension arrays, then you can now go beyond 2GB in memory size.
You don't need an array that large at all.
When your method runs into resource problems, don't just look at how to expand the resources, look at the method also. :)
Here's a class that uses a 3 MB buffer to calculate primes using the sieve of Eratosthenes. The class keeps track of how far you have calculated primes, and when the range needs to be expanded it creates a buffer to test another 3 million numbers.
It keeps the found prime numbers in a list, and when the range is expanded the previous primes are used to rule out numbers in the buffer.
I did some testing, and a buffer around 3 MB is most efficient.
public class Primes {
private const int _blockSize = 3000000;
private List<long> _primes;
private long _next;
public Primes() {
_primes = new List<long>() { 2, 3, 5, 7, 11, 13, 17, 19 };
_next = 23;
}
private void Expand() {
bool[] sieve = new bool[_blockSize];
foreach (long prime in _primes) {
for (long i = ((_next + prime - 1L) / prime) * prime - _next;
i < _blockSize; i += prime) {
sieve[i] = true;
}
}
for (int i = 0; i < _blockSize; i++) {
if (!sieve[i]) {
_primes.Add(_next);
for (long j = i + _next; j < _blockSize; j += _next) {
sieve[j] = true;
}
}
_next++;
}
}
public long this[int index] {
get {
if (index < 0) throw new IndexOutOfRangeException();
while (index >= _primes.Count) {
Expand();
}
return _primes[index];
}
}
public bool IsPrime(long number) {
while (_primes[_primes.Count - 1] < number) {
Expand();
}
return _primes.BinarySearch(number) >= 0;
}
}
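A short usage sketch of the class above:
var primes = new Primes();
Console.WriteLine(primes[5]);             // 13: the sixth prime (zero-based index)
Console.WriteLine(primes[999]);           // 7919: the 1000th prime, found after one Expand() call
Console.WriteLine(primes.IsPrime(7919));  // True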
I believe that even within a 64 bit CLR, there's a limit of 2GB (or possibly 1GB - I can't remember exactly) per object. That would prevent you from creating a larger array. The fact that Array.CreateInstance only takes Int32 arguments for sizes is suggestive too.
On a broader note, I suspect that if you need arrays that large you should really change how you're approaching the problem.
I'm very much a newbie with C# (i.e. learning it this week), so I'm not sure of the exact details of how ArrayList is implemented. However, I would guess that as you haven't defined a type for the ArrayList example, the array would be allocated as an array of object references. This might well mean that you are actually allocating 4-8 GB of memory, depending on the architecture.