OutOfMemoryException when declaring an array of 100 million doubles [duplicate] - c#

This is my code:
int size = 100000000;
double sizeInMegabytes = (size * 8.0) / 1024.0 / 1024.0; //762 mb
double[] randomNumbers = new double[size];
Exception:
Exception of type 'System.OutOfMemoryException' was thrown.
I have 4GB of memory on this machine, and 2.5GB is free when I start this running, so there is clearly enough space on the PC to handle the 762MB of 100,000,000 random numbers. I need to store as many random numbers as possible given the available memory. When I go to production there will be 12GB on the box, and I want to make use of it.
Does the CLR constrain me to a default maximum memory to start with? And how do I request more?
Update
I thought breaking this into smaller chunks and incrementally adding to my memory requirements would help if the issue is due to memory fragmentation, but it doesn't: I can't get past a total ArrayList size of 256MB regardless of how I tweak blockSize.
private static IRandomGenerator rnd = new MersenneTwister();
private static IDistribution dist = new DiscreteNormalDistribution(1048576);
private static List<double> ndRandomNumbers = new List<double>();

private static void AddNDRandomNumbers(int numberOfRandomNumbers)
{
    for (int i = 0; i < numberOfRandomNumbers; i++)
    {
        ndRandomNumbers.Add(dist.ICDF(rnd.nextUniform()));
    }
}
From my main method:
int blockSize = 1000000;
while (true)
{
    try
    {
        AddNDRandomNumbers(blockSize);
    }
    catch (System.OutOfMemoryException ex)
    {
        break;
    }
}
double arrayTotalSizeInMegabytes = (ndRandomNumbers.Count * 8.0) / 1024.0 / 1024.0;

You may want to read this: "“Out Of Memory” Does Not Refer to Physical Memory" by Eric Lippert.
In short, and very simplified, "Out of memory" does not really mean that the amount of available memory is too small. The most common reason is that within the current address space, there is no contiguous portion of memory large enough to serve the requested allocation. If you have 100 free blocks, each 4MB large, that is not going to help you when you need one 5MB block.
Key Points:
the data storage that we call “process memory” is in my opinion best visualized as a massive file on disk.
RAM can be seen as merely a performance optimization
Total amount of virtual memory your program consumes is really not hugely relevant to its performance
"running out of RAM" seldom results in an “out of memory” error. Instead of an error, it results in bad performance because the full cost of the fact that storage is actually on disk suddenly becomes relevant.

Check that you are building a 64-bit process, and not a 32-bit one, which is the default compilation mode of Visual Studio. To do this, right-click on your project, Properties -> Build -> Platform target: x64. Like any other 32-bit process, applications compiled as 32-bit have a virtual memory limit of 2GB.
64-bit processes do not have this limitation, as they use 64-bit pointers, so their theoretical maximum address space (the size of their virtual memory) is 16 exabytes (2^64). In reality, Windows x64 limits the virtual memory of processes to 8TB. The solution to the memory limit problem is then to compile in 64-bit.
However, an object's size in .NET is still limited to 2GB by default. You will be able to create several arrays whose combined size is greater than 2GB, but you cannot by default create arrays bigger than 2GB. Fortunately, if you still want to create arrays bigger than 2GB, you can do it by adding the following to your app.config file:
<configuration>
  <runtime>
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>
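For illustration (my sketch, not from the original answer): with the element enabled in a 64-bit build, an allocation like the following can succeed, whereas it throws OutOfMemoryException without it. The number of elements per array is still capped at UInt32.MaxValue:

long count = 1000000000;            // 1 billion doubles, roughly 7.5GB
double[] huge = new double[count];  // works only with gcAllowVeryLargeObjects
huge[count - 1] = 42.0;             // prove the far end is addressable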

You don't have a contiguous block of memory big enough to allocate 762MB; your memory is fragmented and the allocator cannot find a large enough hole to serve the allocation.
You can try to work with the /3GB boot option (as others have suggested).
Or switch to a 64-bit OS.
Or modify the algorithm so it will not need one big chunk of memory; for example, allocate several relatively smaller chunks.

As you probably figured out, the issue is that you are trying to allocate one large contiguous block of memory, which does not work due to memory fragmentation. If I needed to do what you are doing I would do the following:
int sizeA = 10000,
    sizeB = 10000;
double sizeInMegabytes = (sizeA * sizeB * 8.0) / 1024.0 / 1024.0; //762 mb
double[][] randomNumbers = new double[sizeA][];
for (int i = 0; i < randomNumbers.Length; i++)
{
    randomNumbers[i] = new double[sizeB];
}
Then, to get a particular index you would use randomNumbers[i / sizeB][i % sizeB].
Another option, if you always access the values in order, might be to use the overloaded Random constructor to specify the seed. You would get a semi-random seed (such as DateTime.Now.Ticks), store it in a variable, and then whenever you start going through the list you would create a new Random instance using the original seed:
private static int randSeed = (int)DateTime.Now.Ticks; // Must stay the same unless you want to get different random numbers.

private static Random GetNewRandomIterator()
{
    return new Random(randSeed);
}
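For example (my usage sketch): two Random instances created from the same seed replay an identical sequence, so nothing has to be stored:

Random first = GetNewRandomIterator();
Random second = GetNewRandomIterator();
for (int i = 0; i < 5; i++)
{
    // Prints matching values on every iteration.
    Console.WriteLine(first.NextDouble() + " == " + second.NextDouble());
}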
It is important to note that while the blog linked in Fredrik Mörk's answer indicates that the issue is usually due to a lack of address space, it does not mention a number of other issues, such as the 2GB CLR object size limitation (mentioned in a comment from ShuggyCoUk on the same blog); it also glosses over memory fragmentation and fails to mention the impact of page file size (and how it can be addressed with the CreateFileMapping function).
The 2GB limitation means that randomNumbers must be less than 2GB. Since arrays are classes and have some overhead themselves, this means an array of double will need to be smaller than 2^31 bytes. I am not sure how much smaller than 2^31 the Length would have to be, but "Overhead of a .NET array?" indicates 12-16 bytes.
Memory fragmentation is very similar to HDD fragmentation. You might have 2GB of address space, but as you create and destroy objects there will be gaps between the values. If these gaps are too small for your large object, and additional space cannot be requested, then you will get the System.OutOfMemoryException. For example, if you create 2 million 1024-byte objects, you are using 1.9GB. If you then delete every object whose address is not a multiple of 3, you will be using 0.6GB of memory, but it will be spread out across the address space with 2048-byte open blocks in between. If you then need to create an object of 0.2GB, you would not be able to do it because no block is large enough to fit it and additional space cannot be obtained (assuming a 32-bit environment). Possible solutions are things like using smaller objects, reducing the amount of data you store in memory, or using a memory management algorithm to limit/prevent memory fragmentation. It should be noted that unless you are developing a large program which uses a large amount of memory, this will not be an issue. Also, this issue can arise on 64-bit systems, as Windows is limited mostly by the page file size and the amount of RAM on the system.
Since most programs request working memory from the OS and do not request a file mapping, they will be limited by the system's RAM and page file size. As noted in the comment by Néstor Sánchez on the blog, with managed code like C# you are stuck with the RAM/page file limitation and the address space of the operating system.
That was way longer than expected. Hopefully it helps someone. I posted it because I ran into the System.OutOfMemoryException running an x64 program on a system with 24GB of RAM, even though my array was only holding 2GB of stuff.

I'd advise against the /3GB windows boot option. Apart from everything else (it's overkill to do this for one badly behaved application, and it probably won't solve your problem anyway), it can cause a lot of instability.
Many Windows drivers are not tested with this option, so quite a few of them assume that user-mode pointers always point to the lower 2GB of the address space. Which means they may break horribly with /3GB.
However, Windows does normally limit a 32-bit process to a 2GB address space.
But that doesn't mean you should expect to be able to allocate 2GB!
The address space is already littered with all sorts of allocated data. There's the stack, and all the assemblies that are loaded, static variables and so on. There's no guarantee that there will be 800MB of contiguous unallocated memory anywhere.
Allocating two 400MB chunks would probably fare better, or four 200MB chunks. Smaller allocations are much easier to find room for in a fragmented address space.
In any case, if you're going to deploy this to a 12GB machine, you'll want to run it as a 64-bit application, which should solve all these problems.

Changing from 32-bit to 64-bit worked for me - worth a try if you are on a 64-bit PC and the application doesn't need to be ported back to 32-bit.

If you need such large structures, perhaps you could utilize Memory Mapped Files.
This article could prove helpful:
http://www.codeproject.com/KB/recipes/MemoryMappedGenericArray.aspx
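As a rough sketch of the idea (class and member names are mine, purely illustrative), a memory-mapped file can back an array-like structure so that the OS pages the data in and out as needed instead of holding it all in the managed heap:

using System;
using System.IO;
using System.IO.MemoryMappedFiles;

class MmfDoubleArray : IDisposable
{
    private readonly MemoryMappedFile mmf;
    private readonly MemoryMappedViewAccessor view;

    public MmfDoubleArray(string path, long length)
    {
        // The backing file holds length * 8 bytes; only touched pages use RAM.
        mmf = MemoryMappedFile.CreateFromFile(path, FileMode.Create, null,
                                              length * sizeof(double));
        view = mmf.CreateViewAccessor();
    }

    public double this[long i]
    {
        get { return view.ReadDouble(i * sizeof(double)); }
        set { view.Write(i * sizeof(double), value); }
    }

    public void Dispose()
    {
        view.Dispose();
        mmf.Dispose();
    }
}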

Rather than allocating a massive array, could you try using an iterator? Iterators are lazily executed, meaning values are generated only as they're requested in a foreach statement; you shouldn't run out of memory this way:
private static IEnumerable<double> MakeRandomNumbers(int numberOfRandomNumbers)
{
    for (int i = 0; i < numberOfRandomNumbers; i++)
    {
        // randomGenerator is whatever random source you already have.
        yield return randomGenerator.GetAnotherRandomNumber();
    }
}
...
// Hooray, we won't run out of memory!
foreach (var number in MakeRandomNumbers(int.MaxValue))
{
    Console.WriteLine(number);
}
The above will generate as many random numbers as you wish, but only generate them as they're asked for via a foreach statement. You won't run out of memory that way.
Alternatively, if you must have them all in one place, store them in a file rather than in memory.

32-bit Windows has a 2GB process memory limit. The /3GB boot option others have mentioned will make this 3GB, with just 1GB remaining for OS kernel use. Realistically, if you want to use more than 2GB without hassle, then a 64-bit OS is required. This also overcomes the problem whereby, although you may have 4GB of physical RAM, the address space required for the video card can make a sizeable chunk of that memory unusable - usually around 500MB.

Well, I had a similar problem with a large data set, and trying to force the application to use so much data is not really the right option. The best tip I can give you is to process your data in small chunks if possible, because when dealing with that much data the problem will come back sooner or later. Plus, you cannot know the configuration of each machine that will run your application, so there's always a risk that the exception will happen on another PC.

I had a similar problem; in my case it was caused by a StringBuilder.ToString() call.

Convert your solution to x64. If you still face an issue, set the maximum length on everything that throws an exception, like below:
var jsSerializer = new JavaScriptSerializer();
jsSerializer.MaxJsonLength = Int32.MaxValue;

If you do not need the Visual Studio Hosting Process:
Uncheck the option: Project->Properties->Debug->Enable the Visual Studio Hosting Process
And then build.
If you still face the problem:
Go to Project->Properties->Build Events->Post-Build Event Command line and paste the following:
call "$(DevEnvDir)..\..\vc\vcvarsall.bat" x86
"$(DevEnvDir)..\..\vc\bin\EditBin.exe" "$(TargetPath)" /LARGEADDRESSAWARE
Now, build the project.

Increase the Windows process limit to 3GB (via boot.ini or the Vista boot manager).

Related

Can and will C# write objects to the pagefile?

I was wondering if it is possible for C# to write objects to the pagefile.
I already know that the runtime for a .NET application limits any single object to 2GB of memory and will run out of memory long before that - even on the 64-bit version of Windows.
However, I need to be able to load a huge number of strings (not one big one but many small ones) and was wondering whether it is possible to load them and let them be written out to the pagefile until they are needed again (unfortunately I cannot prevent the loading of all the strings, because it is forced by an application calling the code I am working on).
To prototype this, I tried to see whether the following program would run out of memory as well (it does), even though it does not try to allocate one huge string:
public static void Main()
{
    ICollection<StringBuilder> builders = new LinkedList<StringBuilder>();
    double used_ram = 2 * 1024 * 1024 * 1024L;
    int blocksize = 12800000;
    int blocks = (int)(used_ram / blocksize);
    for (int i = 1; i < blocks; i++)
    {
        StringBuilder sb = new StringBuilder(blocksize / 2);
        builders.Add(sb);
    }
    Console.ReadLine();
}
I was hoping that the strings would be written to the swap space and was wondering if there is any way I can force the application to do just that.
EDIT: Changed the use of swap file to pagefile (thanks for the clarifications).
OK, so to refine my question: if the only limitation of the runtime is that ONE object cannot allocate more than 2GB of memory, why does the above code run out of memory? Is it because the list goes over 2GB by holding references to all the StringBuilders?
Short answer: no, you can't.
Using the swap is not something applications do themselves: the operating system's memory manager is responsible for that.
If you need to load more than 2GB of data into your process at one time (and not retrieve data from disk in chunks as necessary), then you have a serious design problem.
EDIT:
As you're already using a 64-bit OS, make sure you're compiling your application for x64 platforms (or AnyCPU) and your application is not running as a 32-bit process using WOW64.
I think your question amounts to:
Can a 32-bit .NET application address more than 2GB of memory?
Yes, it is possible to increase the amount of memory that applications can address by adjusting the split between the memory allocated to user applications and the operating system but I would not recommend it in most cases as it can cause subtle and not so subtle problems with O/S functions, such as networking. For details on the /3GB and /USERVA switches please see this article. This will allow your application to address 3GB (minus overhead) of memory rather than 2GB.
Tools at your disposal are:
Switching to 64 bit .NET.
Using memory mapped files.
Writing your own paging system.
Using some backing store e.g. file system, database.
You can utilize all of the Windows memory management functions and allocate as much memory as the system will allow, but what you are doing screams for some kind of flush-to-disk system. Even if you could get Windows to allocate that large an object, your performance would be terrible. There are many out-of-the-box systems (called databases) which allow you to put large amounts of data in them and access it in very short order.
You may be doing something wrong. I've written .NET applications that use well over 2GB of memory. If you are running 64-bit Windows, there's no reason your application shouldn't be able to use much more memory.
I recommend writing your own cache class; for an example, have a look here.
I think what comes closest to your approach at the .NET application level would be a MemoryMappedFile.
But this would mean the application wouldn't try to get all the strings in one go.
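As a hedged illustration of such a cache class (entirely my sketch; the class and method names are made up), you can keep only small offset records in managed memory and leave the string bytes in a temporary file, reading them back on demand:

using System;
using System.Collections.Generic;
using System.IO;
using System.Text;

class DiskStringStore : IDisposable
{
    private readonly FileStream data =
        new FileStream(Path.GetTempFileName(), FileMode.Create);
    private readonly List<KeyValuePair<long, int>> index =
        new List<KeyValuePair<long, int>>();

    // Appends the string to the backing file; only the offset/length
    // pair stays in managed memory.
    public int Add(string s)
    {
        byte[] bytes = Encoding.UTF8.GetBytes(s);
        data.Seek(0, SeekOrigin.End);
        index.Add(new KeyValuePair<long, int>(data.Position, bytes.Length));
        data.Write(bytes, 0, bytes.Length);
        return index.Count - 1;
    }

    // Reads the string back from disk only when it is actually needed.
    public string Get(int id)
    {
        KeyValuePair<long, int> entry = index[id];
        byte[] bytes = new byte[entry.Value];
        data.Seek(entry.Key, SeekOrigin.Begin);
        data.Read(bytes, 0, bytes.Length);
        return Encoding.UTF8.GetString(bytes);
    }

    public void Dispose() { data.Dispose(); }
}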

Virtual and Physical Memory / OutOfMemoryException

I am working on a 64-bit .NET Windows Service application that essentially loads a bunch of data for processing. While performing data volume testing, we were able to overwhelm the process and it threw an OutOfMemoryException (I do not have any performance statistics on the process when it failed). I have a hard time believing that the process requested a chunk of memory that would have exceeded the allowable address space for the process, since it's running on a 64-bit machine. I do know that the process is running on a machine that is consistently in the neighborhood of 80%-90% physical memory usage. My question is: can the CLR throw an OutOfMemoryException if the machine is critically low on available physical memory, even though the process wouldn't exceed its allowable amount of virtual memory?
Thanks for your help!
There are still some reachable limits in place in a 64-bit environment. Check this page for some of the most common ones. In short, yes, you can still run out of memory if your program loads a whopping 128GB of data into virtual memory. You could also still be limited by the 2GB max per-process limit if the IMAGE_FILE_LARGE_ADDRESS_AWARE flag is not set in the executable's header.
Another possibility is that the program tried to allocate a single block of memory larger than 2 gigabytes, which is a .NET limitation. This can happen when adding things to a collection (most often a Dictionary or HashSet, but also a List or any other collection that grows automatically.)
Dictionary and HashSet do this often if you're trying to put more than about 47 million items into the collection. Although the collection can hold about 89.5 million, the algorithm that grows the collection does so by doubling. If you start from an empty Dictionary and begin adding items, the collection doubles a number of times until it reaches about 47 million. Then it tries to double again and throws OutOfMemoryException.
The way to avoid the exception is to pre-allocate the collection so that its capacity is large enough to hold however many items you expect to put into it.
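For example (my illustration, not from the original answer), sizing the collection to the expected count up front avoids the doubling growth that triggers the huge intermediate allocation:

// Pre-sizing skips the repeated doubling (and the failed ~2x reallocation).
var lookup = new Dictionary<int, string>(50000000); // capacity = expected count
var values = new List<double>(100000000);           // same idea for List<T>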
There's what you can theoretically address.
There's what physical memory you have, and when you exceed that you start using swap, which is often limited to the size of some chosen disk partitions.
As a rule of thumb one tends to have a small number (like one or two) of multiples of physical memory as swap.
So yes it's quite likely that you are out of available, as opposed to addressable memory.

.NET OutOfMemoryException

Why does this:
class OutOfMemoryTest02
{
    static void Main()
    {
        string value = new string('a', int.MaxValue);
    }
}
throw the exception, but this won't:
class OutOfMemoryTest
{
    private static void Main()
    {
        Int64 i = 0;
        ArrayList l = new ArrayList();
        while (true)
        {
            l.Add(new String('c', 1024));
            i++;
        }
    }
}
What's the difference?
Have you looked up int.MaxValue in the docs? It's 2,147,483,647 characters - roughly 4GB as UTF-16 - which is probably more RAM than you have available for a contiguous block of 'a' characters; that is what you are asking for here.
http://msdn.microsoft.com/en-us/library/system.int32.maxvalue.aspx
Your infinite loop will eventually cause the same exception (or a different one indirectly related to overuse of RAM), but it will take a while. Try increasing 1024 to 10 * 1024 * 1024 to reproduce the symptom faster in the loop case.
When I run with this larger string size, I get the exception in under 10 seconds after 68 loops (checking i).
Your
new string('a', int.MaxValue);
throws an OutOfMemoryException simply because .NET's string has a length limitation. The "Remarks" section in the MSDN docs says:
The maximum size of a String object in memory is 2 GB, or about 1 billion characters.
On my system (.NET 4.5 x64) new string('a', int.MaxValue/2 - 31) throws, whereas new string('a', int.MaxValue/2 - 32) works.
In your second example, the infinite loop allocates ~2048 byte blocks until your OS cannot allocate any more block in the virtual address space. When this is reached, you'll get an OutOfMemoryException too.
(~2048 byte = 1024 chars * 2 bytes per UTF-16 code point + string overhead bytes)
Try this great article by Eric.
Because int.MaxValue is 2,147,483,647 characters - about 4GB of UTF-16 data - which needs to be allocated contiguously.
In the second example, the OS only needs to find about 2KB to allocate each time and can swap pages out to the hard drive. I am sure if you left it running long enough you'd end up in a dark place :)
The String class can use a backing shared string pool to reduce memory usage. In the former case, you're generating one string that's several gigabytes. In the second case, it's likely the compiler is auto-interning the string, so you're generating a 1024-character string and then referencing that same string many times.
That being said, an ArrayList of that size should run you out of memory, but it's likely you haven't let the code run long enough for it to happen.
The 2nd snippet will crash as well; it just takes a whole heck of a lot longer since it is consuming memory much more slowly. Pay attention to your hard disk access light: it blinks furiously while Windows chucks pages out of RAM to make room. The first string constructor fails immediately, since the heap manager won't allow you to allocate 4 gigabytes.
Both versions will cause an OOM exception, it's just that (on a 32bit machine) you will get it immediately with the first version when you try to allocate a "single" very large object.
The second version will take much longer, however, as there will be a lot of thrashing before the OOM condition is reached, for a couple of reasons:
You will be allocating millions of small objects which are all reachable by the GC. Once you start putting the system under pressure, the GC will spend an inordinate amount of time scanning generations with millions and millions of objects in. This will take a considerable amount of time and start to play havoc with paging as cold and hot memory will be constantly paged in and out as generations are scanned.
There will be page thrashing as GC scans millions of objects in generations to try and free memory. Scanning will cause huge amounts of memory to be paged in and out constantly.
The thrashing will cause the system to grind to a halt under the processing overhead, and so the OOM condition will take a long time to be reached. Most of the time will be spent thrashing in the GC and in paging for the second version.
In your first sample you are trying to create a 2GB string in one go.
In the second example you keep adding 1KB to an array; you will need to loop more than 2 million times to reach the same amount of consumption.
And it's also not all stored at once, in one variable, so some of your memory usage can be paged out to disk to make room for the new data, I think.
Because a single object cannot have more than 2 GB:
First some background; in the 2.0 version of the .Net runtime (CLR) we made a conscious design decision to keep the maximum object size allowed in the GC Heap at 2GB, even on the 64-bit version of the runtime
In your first example, you try to allocate one object that is 2GB; with the object overhead (8 bytes?) it's simply too big.
I don't know exactly how the ArrayList works internally, but even if you allocated multiple large objects, the ArrayList - to my knowledge - only holds references, which are 4 bytes (8 on x64) each, regardless of how big the objects they point to are.
To quote another article:
Also, objects that have references to other objects store only the reference. So if you have an object that holds references to three other objects, the memory footprint is only 12 extra bytes: one 32-bit pointer to each of the referenced objects. It doesn't matter how large the referenced object is.
One reason your system could be coming to a halt is because .NET's code runs closer to the metal and you're in a tight loop which should consume 100% CPU provided the process priority allows it to. If you would like to prevent the application from consuming too much CPU while it performs the tight loop you should add something like System.Threading.Thread.Sleep(10) to the end of the loop, which will forcibly yield processing time to other threads.
One major difference between the JVM and .NET's CLR (Common Language Runtime) is that the CLR does not limit the size of your memory on an x64 system/application (in 32bit applications, without the Large Address Aware flag the OS limits any application to 2GB due to addressing limitations). The JIT compiler creates native windows code for your processing architecture and then runs it in the same scope that any other windows application would run. The JVM is a more isolated sandbox which constrains the application to a specified size depending on configuration/command line switches.
As for differences between the two algorithms:
The single string creation is not guaranteed to fail when running in an x64 environment with enough contiguous memory to allocate the 4GB necessary to contain int.MaxValue characters (.NET strings are Unicode by default, which requires 2 bytes per character). A 32-bit application will always fail, even with the Large Address Aware flag set, because the maximum memory is still something like 3.5GB.
The while loop version of your code will likely consume more overall memory, provided you have plenty available, before throwing the exception because your strings can be allocated in smaller fragments, but it is guaranteed to hit the error eventually (although if you have plenty of resources, it could happen as a result of the ArrayList exceeding the maximum number of elements in an array rather than the inability to allocate new space for a small string). Kent Murra is also correct about string interning; you will either need to randomize the length of the string or the character contents to avoid interning, otherwise you're simply creating pointers to the same string. Steve Townsend's recommendation to increase string length would also make finding large enough contiguous blocks of memory harder to come by, which will allow the exception to happen more quickly.
EDIT:
Thought I'd give some links people may find handy for understanding .NET memory:
These two articles are a little older, but very good in depth reading:
Garbage Collection: Automatic Memory Management in the Microsoft .NET Framework
Garbage Collection Part 2: Automatic Memory Management in the Microsoft .NET Framework
These are blogs from a .NET Garbage Collection developer for information about newer version of .NET memory management:
So, what’s new in the CLR 4.0 GC?
CLR 4.5: Maoni Stephens - Server Background GC
This SO Question may help you observe the inner workings of .NET memory:
.NET Memory Profiling Tools

Why am I getting an Out Of Memory Exception in my C# application?

I have 4GB of physical memory, but why do I get an out of memory exception even when I create just a 1.5GB memory object? Any ideas why? (At the time, the performance tab of Task Manager showed memory was not fully occupied, and I could still type here - so memory is not actually low; I think I am hitting some other memory limitation.)
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;

namespace TestBigMemoryv1
{
    class MemoryHolderFoo
    {
        static Random seed = new Random();
        public Int32 holder1;
        public Int32 holder2;
        public Int64 holder3;

        public MemoryHolderFoo()
        {
            // prevent from optimized out
            holder1 = (Int32)seed.NextDouble();
            holder2 = (Int32)seed.NextDouble();
            holder3 = (Int64)seed.NextDouble();
        }
    }

    class Program
    {
        static int MemoryThreshold = 1500; //M

        static void Main(string[] args)
        {
            int persize = 16;
            int number = MemoryThreshold * 1000 * 1000 / persize;
            MemoryHolderFoo[] pool = new MemoryHolderFoo[number];
            for (int i = 0; i < number; i++)
            {
                pool[i] = new MemoryHolderFoo();
                if (i % 10000 == 0)
                {
                    Console.Write(".");
                }
            }
            return;
        }
    }
}
In a normal 32-bit Windows app, the process only has 2GB of addressable memory. This is unrelated to the amount of physical memory that is available.
So 2GB is available, but 1.5GB is the max you can allocate. The key is that your code is not the only thing running in the process. The other 0.5GB is probably the CLR plus fragmentation within the process.
Update: in .NET 4.5, in a 64-bit process, you can have large arrays if the gcAllowVeryLargeObjects setting is enabled:
On 64-bit platforms, enables arrays that are greater than 2 gigabytes (GB) in total size.
The maximum number of elements in an array is UInt32.MaxValue.
<configuration>
  <runtime>
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>
Just in addition to the other points: if you want access to a huge amount of memory, consider x64 - but be aware that the maximum single object size is still 2GB. And because references are larger on x64, this means you actually get a smaller maximum array/list length for reference types. Of course, by the time you hit that limit you are probably doing things wrong anyway!
Other options:
use files
use a database
(obviously both have a performance cost compared to in-process memory)
Update: In versions of .NET prior to 4.5, the maximum object size is 2GB. From 4.5 onwards you can allocate larger objects if gcAllowVeryLargeObjects is enabled. Note that the limit for string is not affected, but "arrays" should cover "lists" too, since lists are backed by arrays.
Just to add to the previous replies: you can go beyond the 2GB limit on systems booted with the /3GB (and optionally /userva) boot flags.
Check that you are building a 64-bit process and not a 32-bit one (the Visual Studio steps and the gcAllowVeryLargeObjects setting are covered in the earlier answer above).
Each process has its own virtual memory, called an address space, into which it maps the code that it executes and the data it manipulates. A 32-bit process uses 32-bit virtual memory address pointers, which creates an absolute upper limit of 4GB (2^32) on the amount of virtual memory that a 32-bit process can address. However, the operating system requires half of it (to reference its own code and data), creating a limit of 2GB for each process. If your 32-bit application tries to consume more than its entire 2GB of address space, it will throw "System.OutOfMemory", even though the physical memory of your computer is not full. 64-bit processes do not have this limitation, so the solution to the memory limit problem is to compile in 64-bit.
You've got a max of 2GB of addressable memory as a 32-bit app, as the other posters mentioned. Don't forget about overhead: you're creating an array of roughly 93 million objects, so if there happen to be 4 bytes of overhead per object, that's an extra ~350MB of memory.
One more thing to be aware of: some .NET objects require contiguous memory. That is, if you are trying to allocate a large array, the system may need not only sufficient free memory in your process but also for all that free memory to be in one big chunk... and unfortunately process memory gets fragmented over time, so this may not be available.
Some objects/data types have this requirement and some don't... I can't remember which ones do, but I seem to recall StringBuilder and MemoryStream have different requirements.
On a 32-bit Windows Operating system the maximum 'user-mode' memory that a single application can access is 2GB... assuming that you have 4GB of memory on the box.
Unmanaged VC++ Application's memory consumption on windows server
http://blogs.technet.com/markrussinovich/archive/2008/07/21/3092070.aspx
(It's funny you asked this because I asked almost the same thing yesterday...)

Finding how much memory I can allocate for an array in C#

I am doing some calculations that require a large array to be initialized. The maximum size of the array determines the maximum size of the problem I can solve.
Is there a way to programmatically determine how much memory is available for say, the biggest array of bytes possible?
Thanks
Well, relying on a single huge array has a range of associated issues - memory fragmentation, contiguous blocks, the limit on the maximum object size, etc. If you need a lot of data, I would recommend creating a class that simulates a large array using lots of smaller (but still large) arrays, each of fixed size - i.e. the indexer divides to find the appropriate array, then uses % to get the offset inside that array; see the sketch below.
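A minimal sketch of that idea (the class name and chunk size are mine, purely illustrative):

using System;

class ChunkedArray<T>
{
    private const int ChunkSize = 1 << 20; // 1M elements per inner array
    private readonly T[][] chunks;

    public ChunkedArray(long length)
    {
        Length = length;
        int chunkCount = (int)((length + ChunkSize - 1) / ChunkSize);
        chunks = new T[chunkCount][];
        for (int c = 0; c < chunkCount; c++)
        {
            long remaining = length - (long)c * ChunkSize;
            chunks[c] = new T[(int)Math.Min(ChunkSize, remaining)];
        }
    }

    public long Length { get; private set; }

    // Divide to find the chunk, take the remainder for the offset within it.
    public T this[long i]
    {
        get { return chunks[i / ChunkSize][i % ChunkSize]; }
        set { chunks[i / ChunkSize][i % ChunkSize] = value; }
    }
}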
You might also want to ensure you are on a 64-bit OS, with lots of memory. This will give you the maximum available head-room.
Depending on the scenario, more sophisticated algorithms such as sparse arrays, eta-vectors, etc might be of use to maximise what you can do. You might be amazed what people could do years ago with limited memory, and just a tape spinning backwards and forwards...
In order to ensure you have enough free memory you could use a MemoryFailPoint. If the memory cannot be allocated, an InsufficientMemoryException is thrown, which you can catch and deal with in an appropriate way; see the sketch below.
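A quick hedged sketch (the 800MB figure is just the original question's array size rounded up; MemoryFailPoint is a gate, not a reservation, so the allocation itself can still fail):

using System;
using System.Runtime;

try
{
    // Ask the CLR whether ~800MB is likely to be available before committing.
    using (new MemoryFailPoint(800)) // argument is in megabytes
    {
        double[] randomNumbers = new double[100000000];
        // ... fill and use the array ...
    }
}
catch (InsufficientMemoryException ex)
{
    // Fall back to a smaller problem size instead of failing mid-computation.
    Console.WriteLine("Not enough memory: " + ex.Message);
}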
The short answer is "no". There are two top-level resources that would need to be queried:
The largest block of unallocated virtual address space available to the process
The amount of available page file space.
As Marc Gravell correctly stated, you will have your best success on a 64-bit platform. Here, each process has a huge virtual address space. This will effectively solve your first problem. You should also make sure the page file is large.
But, there is a better way that is limited only by the free space on your disk: memory mapped files. You can create a large mapping (say 512MB) into an arbitrarily large file and move it as you process your data. Note, be sure to open it for exclusive access.
If you need really, really big arrays, don't use the CLR. Mono supports 64-bit array indexes, allowing you to fully take advantage of your memory resources.
I suppose binary search could be a way to go. First, start by allocating 1 byte; if that succeeds, free that byte (set the reference to null) and double it to 2 bytes. Go on until you can't allocate any more, and you have found a limit that you can consider the lower bound.
The correct number of bytes that can be allocated (let's call it x) is within the interval lower <= x < 2 * lower. Continue searching this interval using binary search; an illustrative probe follows.
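Here is one way that probe could look (my sketch; results fluctuate from run to run as the GC and other allocations shift the free space):

static long FindMaxAllocatableBytes()
{
    long lower = 1;
    while (CanAllocate(lower * 2)) lower *= 2; // find the first failing size
    long upper = lower * 2;
    while (upper - lower > 1)                  // binary-search the interval
    {
        long mid = lower + (upper - lower) / 2;
        if (CanAllocate(mid)) lower = mid; else upper = mid;
    }
    return lower;
}

static bool CanAllocate(long bytes)
{
    if (bytes > int.MaxValue) return false;    // a byte[] length is an int
    try
    {
        byte[] probe = new byte[bytes];
        probe[0] = 1;                          // use the array so it isn't dropped
        return true;
    }
    catch (OutOfMemoryException)
    {
        return false;
    }
}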
The biggest array one can allocate in a 64 bit .NET program is 2GB. (Another reference.)
You could find out how many available bytes there are easily enough:
using (var pc = new System.Diagnostics.PerformanceCounter("Memory", "Available Bytes"))
{
    float freeBytes = pc.NextValue();
}
Given that information you should be able to make your decision.
