Anybody know what the max number of items in a List is?
How do I increase that size? Or is there a collection that takes infinite items? (as much as would fit in memory, that is)
EDIT:
I get an out-of-memory exception when Count = 134217728 in a List<int>. I have 3GB of RAM, of which 2.2GB is in use. Does that sound normal?
List<T> will be limited to the max size of an array, which is 2GB per object (even in x64). If that isn't enough, you're using the wrong type of data storage. You can save a lot of overhead by starting it at the right size, though, by passing the expected capacity to the constructor.
Re your edit - with 134217728 x Int32, that is 512MB. Remember that List<T> uses a doubling algorithm; if you are drip-feeding items via Add (without allocating all the space first) it is going to try to double to 1GB (on top of the 512MB you're already holding, the rest of your app, and of course the CLR runtime and libraries). I'm assuming you're on x86, so you already have a 2GB limit per process, and it is likely that you have fragmented your "large object heap" to death while adding items.
Personally, yes, it sounds about right to start getting an out-of-memory at this point.
Edit: in .NET 4.5, arrays larger than 2GB are allowed if the <gcAllowVeryLargeObjects> switch is enabled. The limit is then roughly 2^31 elements per array. This might be useful for arrays of references (8 bytes each in x64), or arrays of large structs.
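For illustration (reusing the numbers from the question above, so purely a sketch), pre-sizing via the capacity constructor avoids every intermediate doubling:

    // Sizes are illustrative. Pre-sizing means the ~512MB backing array is
    // allocated once, so Add never triggers the doubling/copying described above.
    var items = new List<int>(134217728);
    for (int i = 0; i < 134217728; i++)
    {
        items.Add(i);
    }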
The List<T> limit is about 2.1 billion elements (int.MaxValue), or the size of your memory, whichever is hit first.
It's limited only by memory.
edit: or not, 2GB is the limit! This is quite interesting: BigArray<T>, getting around the 2GB array size limit
On an x64 machine, using .NET Framework 4 (not the Client Profile), compiling for Any CPU in Release mode, I could chew up all the available memory. My process is now at 5.3GB and I've consumed all available memory (8GB) on my PC (it's actually Server 2008 R2 x64).
I used a custom Collection class based on CollectionBase to store 61,910,847 instances of the following class:
public class AbbreviatedForDrawRecord {
    public int DrawId { get; set; }
    public long Member_Id { get; set; }
    public string FactorySerial { get; set; }

    public AbbreviatedForDrawRecord() {
    }

    public AbbreviatedForDrawRecord(int drawId, long memberId, string factorySerial) {
        this.DrawId = drawId;
        this.Member_Id = memberId;
        this.FactorySerial = factorySerial;
    }
}
The List<T> will grow dynamically to accommodate as many items as you want to include - until you exceed available memory!
From MSDN documentation:
If Count already equals Capacity, the capacity of the List is increased by automatically reallocating the internal array, and the existing elements are copied to the new array before the new element is added.
If Count is less than Capacity, this method is an O(1) operation. If the capacity needs to be increased to accommodate the new element, this method becomes an O(n) operation, where n is Count.
The interface defines Count, IndexOf, etc. as type int, so I would assume that int.MaxValue (2,147,483,647) is the most items you could stick in a list.
You really have to question why on earth you would need that many; there is likely a more sensible approach to managing the data.
Related
I am using a Dictionary<int, int> to store the frequency of colors in an image, where the key is the color (as an int), and the value is the number of times the color has been found in the image.
When I process larger / more colorful images, this dictionary grows very large. I get an out of memory exception at just around 6,000,000 entries. Is this the expected capacity when running in 32-bit mode? If so, is there anything I can do about it? And what might be some alternative methods of keeping track of this data that won't run out of memory?
For reference, here is the code that loops through the pixels in a bitmap and saves the frequency in the Dictionary<int,int>:
Bitmap b; // = something...
Dictionary<int, int> count = new Dictionary<int, int>();
System.Drawing.Color color;

for (int i = 0; i < b.Width; i++)
{
    for (int j = 0; j < b.Height; j++)
    {
        color = b.GetPixel(i, j);
        int colorString = color.ToArgb();
        if (!count.Keys.Contains(color.ToArgb()))
        {
            count.Add(colorString, 0);
        }
        count[colorString] = count[colorString] + 1;
    }
}
Edit: In case you were wondering what image has that many different colors in it: http://allrgb.com/images/mandelbrot.png
Edit: I also should mention that this is running inside an asp.net web application using .Net 4.0. So there may be additional memory restrictions.
Edit: I just ran the same code inside a console application and had no problems. The problem only happens in ASP.Net.
Update: Given the OP's sample image, it seems that the maximum number of items would be over 16 million, and apparently even that is too much to allocate when instantiating the dictionary. I see three options here:
Resize the image down to a manageable size and work from that.
Try to convert to a color scheme with fewer color possibilities.
Go for an array of fixed size as others have suggested.
Previous answer: the problem is that you don't allocate enough space for your dictionary. At some point, when it is expanding, you just run out of memory for the expansion, but not necessarily for the new dictionary.
Example: this code runs out of memory at nearly 24 million entries (in my machine, running in 32-bit mode):
Dictionary<int, int> count = new Dictionary<int, int>();
for (int i = 0; ; i++)
    count.Add(i, i);
because at the last expansion it is still holding the space for the entries already there while it tries to allocate a new internal array roughly twice that size, and that combined amount is too much.
Now, if we initially allocate space for, say, 40 million entries, it runs without problem:
Dictionary<int, int> count = new Dictionary<int, int>(40000000);
So try to indicate how many entries there will be when creating the dictionary.
From MSDN:
The capacity of a Dictionary is the number of elements that can be added to the Dictionary before resizing is necessary. As elements are added to a Dictionary, the capacity is automatically increased as required by reallocating the internal array.
If the size of the collection can be estimated, specifying the initial capacity eliminates the need to perform a number of resizing operations while adding elements to the Dictionary.
Each dictionary entry holds two 4-byte integers: 8 bytes total. 8 bytes * 6 million entries is only about 48MB, +/- some space for object overhead, alignment, etc. There's plenty of space in memory for this. .NET provides virtual address space of up to 2GB per process. 48MB or so shouldn't cause a problem.
I expect what's actually happening here is related to how the dictionary auto-expands and how the garbage collector handles (or doesn't handle) compaction.
First, the auto-expanding part. Last time I checked (back around .NET 2.0*), collections in .NET tended to use arrays internally. They would allocate a reasonably-sized array in the collection constructor (say, 10 items), and then use a doubling algorithm to create additional space whenever the array filled up. All the existing items would have to be copied to the new array, but then the old array could be garbage collected. The garbage collector is pretty reliable about this, and so it means you're left using space for at most 2n - 1 items in the collection.
Now the garbage collector compaction part. After a certain size, these arrays end up in a section of memory called the Large Object Heap. Garbage collection still works here (though less often). What doesn't really work well here is compaction (think memory defragmentation). The physical memory used by the old object will be released, returned to the operating system, and made available for other processes. However, the virtual address space in your process (the table that maps program memory offsets to physical memory addresses) will still have the (empty) space reserved.
This is important, because remember: we're working with a rapidly growing object. It's possible for such an object to take up address space far larger than the final size of the object itself. An object grows enough, fast enough, and suddenly you get an OutOfMemoryException, even though your app isn't really using all that much RAM.
The first solution here is to allocate enough space in the initial collection for all of your data. This allows you to skip all those re-allocations and copying. Your data will live in a single array, and use only the space you actually asked for. Most collections, including the Dictionary, have a constructor overload that allows you to give it the number of items you want the first array to hold. Be careful here: you don't need to allocate an item for every pixel in your image. There will be a lot of repeated colors. You only need to allocate enough to have space for each distinct color in your image. If it's only large images that give you problems, and you're almost handling them with six million records, you might find that 8 million is plenty.
My next suggestion is to group your pixel colors. A human can't tell and doesn't care if two colors are just one bit apart in any of the RGB components. You might go as far as to look at the separate RGB values for each pixel and normalize the pixel so that you only care about changes of more than 5 or so for an R, G, or B value. That would get you from roughly 16.8 million potential colors all the way down to only about 132,000, and the data will likely be more useful, too. That might look something like this:
var colorCounts = new Dictionary<Color, int>(132651);
foreach (Color c in GetImagePixels().Select(p => Color.FromArgb((p.R / 5) * 5, (p.G / 5) * 5, (p.B / 5) * 5))) // Select needs System.Linq
{
    int current;
    colorCounts.TryGetValue(c, out current); // current stays 0 if the color isn't there yet
    colorCounts[c] = current + 1;
}
* IIRC, somewhere in a recent or upcoming version of .NET both of these issues are being addressed: one by allowing you to force compaction of the LOH, and the other by using a set of arrays for collection backing stores rather than trying to keep everything in one big array.
The maximum object size the CLR allows is 2GB:
When you run a 64-bit managed application on a 64-bit Windows operating system, you can create an object of no more than 2 gigabytes (GB).
You might be better off using an array.
You may also check out BigArray<T>, getting around the 2GB array size limit.
In the 32 bit runtime, the maximum number of items you can have in a Dictionary<int, int> is in the neighborhood of 61.7 million. See my old article for more info.
If you're running in 32 bit mode, then your entire application plus whatever bits of ASP.NET and the underlying machinery is required all have to fit within the memory available to your process: normally 2 GB in the 32-bit runtime.
By the way, a really wacky way to solve your problem (but one I wouldn't recommend unless you're really hurting for memory), would be the following (assuming a 24-bit image):
Call LockBits to get a pointer to the raw image data
Compress the per-scan-line padding by moving the data for each scan line to fill the previous row's padding. You end up with an array of 3-byte values followed by a bunch of empty space (to equal the padding).
Sort the image data. That is, sort the 3-byte values. You'd have to write a custom sort, but it wouldn't be too bad.
Go sequentially through the array and count the number of unique values.
Allocate a 2-dimensional array: int[count,2] to hold the values and their occurrence counts.
Go sequentially through the array again to count occurrences of each unique value and populate the counts array.
I wouldn't honestly suggest using this method. Just got a little laugh when I thought of it.
Try using an array instead. I doubt it will run out of memory. 6 million int array elements is not a big deal.
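For what it's worth, here is a minimal sketch of the fixed-size-array idea, assuming 24-bit color (alpha ignored) and reusing the b bitmap from the question; the whole table is a single 64MB allocation:

    // One counter slot per possible 24-bit RGB value: 16,777,216 ints = 64MB.
    int[] counts = new int[1 << 24];
    for (int i = 0; i < b.Width; i++)
    {
        for (int j = 0; j < b.Height; j++)
        {
            int rgb = b.GetPixel(i, j).ToArgb() & 0x00FFFFFF; // drop the alpha byte
            counts[rgb]++;
        }
    }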
I want to make an array of size 10^9 elements where each elements can be an integer of the same size. I always get an OutOfMemoryException at the initialization line. How can I achieve this?
If this is not possible, please suggest alternative strategies?
Arrays are limited to 2GB in .NET 4.0 or earlier, even in a 64-bit process. So with one billion elements, the maximum supported element size is two bytes, but an int is four bytes. So this will not work.
If you want a larger collection, you need to write it yourself, backed by multiple arrays (see the sketch below).
In .NET 4.5 it's possible to avoid this limitation; see Jon Skeet's answer for details.
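As a rough sketch of that "backed by multiple arrays" approach (the class name and chunk size are made up for illustration; a billion ints still needs roughly 4GB of physical memory):

    using System;

    // A "big array" of int split across several sub-arrays, each well under 2GB.
    public class ChunkedIntArray
    {
        private const int ChunkSize = 1 << 26;   // 67,108,864 ints = 256MB per chunk
        private readonly int[][] _chunks;

        public ChunkedIntArray(long length)
        {
            Length = length;
            _chunks = new int[(length + ChunkSize - 1) / ChunkSize][];
            for (int i = 0; i < _chunks.Length; i++)
                _chunks[i] = new int[Math.Min(ChunkSize, length - (long)i * ChunkSize)];
        }

        public long Length { get; private set; }

        public int this[long index]
        {
            get { return _chunks[index / ChunkSize][index % ChunkSize]; }
            set { _chunks[index / ChunkSize][index % ChunkSize] = value; }
        }
    }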
Assuming you mean int as the element type, you can do this using .NET 4.5, if you're using a 64-bit CLR.
You need to use the <gcAllowVeryLargeObjects> configuration setting. This is not on by default.
If you're using an older CLR or you're on a 32-bit machine, you're out of luck. Of course, if you're on a 64-bit machine but just an old version of the CLR, you could encapsulate your "one large array" into a separate object which has a list of smaller arrays... you could even implement IList<int> that way so that most code wouldn't need to know that you weren't really using a single array.
(As noted in comments, you'll still only be able to create an array with 2^31 elements; but your requirement of 10^9 is well within this.)
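Concretely, assuming a 64-bit process on .NET 4.5 or later, you put <gcAllowVeryLargeObjects enabled="true" /> inside the <runtime> section of the application's config file; after that, an allocation like this sketch should succeed:

    // Requires x64, .NET 4.5+, and <gcAllowVeryLargeObjects enabled="true" />
    // under <runtime> in app.config / web.config.
    int[] huge = new int[1000000000]; // one billion ints, roughly 4GB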
I think you shouldn't load all this data into memory; store it in a file instead, and make a class that works like an array but actually reads and writes its data from and to that file.
Here is the general idea (a minimal sketch; in practice you'd also want buffering, bounds checking, error handling, and proper disposal of the stream):
public class FileArray
{
    private readonly FileStream s;   // System.IO

    public FileArray(string path, long length)
    {
        s = new FileStream(path, FileMode.Create, FileAccess.ReadWrite);
        s.SetLength(length * 4L);    // 4 bytes per int
    }

    public int this[long index]
    {
        get { var b = new byte[4]; s.Position = index * 4L; s.Read(b, 0, 4); return BitConverter.ToInt32(b, 0); }
        set { s.Position = index * 4L; s.Write(BitConverter.GetBytes(value), 0, 4); }
    }
}
That way you have something that works just like an array, but the data is stored on your hard drive.
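A hypothetical use of that sketch (the file name is made up):

    var a = new FileArray("values.bin", 1000000000L); // one billion ints, kept on disk
    a[999999999] = 42;
    Console.WriteLine(a[999999999]); // 42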
Today I noticed that C#'s String class returns the length of a string as an Int. Since an Int is always 32-bits, no matter what the architecture, does this mean that a string can only be 2GB or less in length?
A 2GB string would be very unusual, and would present many problems along with it. However, most .NET APIs seem to use int to convey values such as length and count. Does this mean we are forever limited to collection sizes which fit in 32 bits?
Seems like a fundamental problem with the .NET API's. I would have expected things like count and length to be returned via the equivalent of 'size_t'.
Seems like a fundamental problem with the .NET API...
I don't know if I'd go that far.
Consider almost any collection class in .NET. Chances are it has a Count property that returns an int. So this suggests the class is bounded at a size of int.MaxValue (2147483647). That's not really a problem; it's a limitation -- and a perfectly reasonable one, in the vast majority of scenarios.
Anyway, what would the alternative be? There's uint -- but that's not CLS-compliant. Then there's long...
What if Length returned a long?
An additional 32 bits of memory would be required anywhere you wanted to know the length of a string.
The benefit would be: we could have strings taking up billions of gigabytes of RAM. Hooray.
Try to imagine the mind-boggling cost of some code like this:
// Lord knows how many characters
string ulysses = GetUlyssesText();
// allocate an entirely new string of roughly equivalent size
string schmulysses = ulysses.Replace("Ulysses", "Schmulysses");
Basically, if you're thinking of string as a data structure meant to store an unlimited quantity of text, you've got unrealistic expectations. When it comes to objects of this size, it becomes questionable whether you have any need to hold them in memory at all (as opposed to hard disk).
Correct, the maximum length would be the size of Int32; however, you'll likely run into other memory issues if you're dealing with strings larger than that anyway.
At some value of String.Length, probably around 5MB, it's not really practical to use String anymore. String is optimised for short bits of text.
Think about what happens when you do
myString += " more chars"
Something like:
System calculates length of myString plus length of " more chars"
System allocates that amount of memory
System copies myString to new memory location
System copies " more chars" to new memory location after last copied myString char
The original myString is left to the mercy of the garbage collector.
While this is nice and neat for small bits of text, it's a nightmare for large strings; just finding 2GB of contiguous memory is probably a showstopper.
So if you know you are handling more than a very few MB of characters, use a buffer class such as StringBuilder.
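For example, a minimal sketch with StringBuilder (ReadChunks() is a hypothetical source of text pieces; StringBuilder lives in System.Text):

    // Appends go into StringBuilder's internal buffer; the full string is only
    // materialized once at the end, instead of being copied on every concatenation.
    var sb = new StringBuilder();
    foreach (string chunk in ReadChunks()) // hypothetical
    {
        sb.Append(chunk);
    }
    string result = sb.ToString();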
It's pretty unlikely that you'll need to store more than two billion objects in a single collection. You're going to incur some pretty serious performance penalties when doing enumerations and lookups, which are the two primary purposes of collections. If you're dealing with a data set that large, there is almost assuredly some other route you can take, such as splitting up your single collection into many smaller collections that contain portions of the entire set of data you're working with.
Heeeey, wait a sec.... we already have this concept -- it's called a dictionary!
If you need to store, say, 5 billion English strings, use this type:
Dictionary<string, List<string>> bigStringContainer;
Let's make the key string represent, say, the first two characters of the string. Then write an extension method like this:
public static string BigStringIndex(this string s)
{
    return String.Concat(s[0], s[1]);
}
and then add items to bigStringContainer like this:
bigStringContainer[item.BigStringIndex()].Add(item);
and call it a day. (There are obviously more efficient ways you could do that, but this is just an example)
Oh, and if you really, really do need to be able to look up any arbitrary object by absolute index, use an Array instead of a collection. Okay, yeah, you lose some type safety, but you can index array elements with a long.
The fact that the framework uses Int32 for Count/Length properties, indexers etc is a bit of a red herring. The real problem is that the CLR currently has a max object size restriction of 2GB.
So a string -- or any other single object -- can never be larger than 2GB.
Changing the Length property of the string type to return long, ulong or even BigInteger would be pointless since you could never have more than approx 2^30 characters anyway (2GB max size and 2 bytes per character.)
Similarly, because of the 2GB limit, the only arrays that could even approach having 2^31 elements would be bool[] or byte[] arrays that only use 1 byte per element.
Of course, there's nothing to stop you creating your own composite types to workaround the 2GB restriction.
(Note that the above observations apply to Microsoft's current implementation, and could very well change in future releases. I'm not sure whether Mono has similar limits.)
In versions of .NET prior to 4.5, the maximum object size is 2GB. From 4.5 onwards you can allocate larger objects if gcAllowVeryLargeObjects is enabled. Note that the limit for string is not affected, but "arrays" should cover "lists" too, since lists are backed by arrays.
Even in x64 versions of Windows I got hit by .Net limiting each object to 2GB.
2GB is pretty small for a medical image. 2GB is even small for a Visual Studio download image.
If you are working with a file that is 2GB, that means you're likely going to be using a lot of RAM, and you're seeing very slow performance.
Instead, for very large files, consider using a MemoryMappedFile (see: http://msdn.microsoft.com/en-us/library/system.io.memorymappedfiles.memorymappedfile.aspx). Using this method, you can work with a file of nearly unlimited size, without having to load the whole thing in memory.
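A minimal sketch (the file name is made up; the types are in System.IO.MemoryMappedFiles):

    // Random access into a huge file of raw Int32 values without loading it into RAM.
    using (var mmf = MemoryMappedFile.CreateFromFile("data.bin"))
    using (var accessor = mmf.CreateViewAccessor())
    {
        long index = 123456789;                    // element index, not byte offset
        int value = accessor.ReadInt32(index * 4); // read one int
        accessor.Write(index * 4, value + 1);      // write it back
    }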
I'm doing some Project Euler exercises and I've run into a scenario where I want arrays which are larger than 2,147,483,647 elements (the upper limit of int in C#).
Sure these are large arrays, but for instance, I can't do this
// fails
bool[] BigArray = new bool[2147483648];
// also fails, cannot convert uint to int
ArrayList BigArrayList = new ArrayList(2147483648);
So, can I have bigger arrays?
EDIT:
It was for a Sieve of Atkin, you know, so I just wanted a really big one :D
Anytime you are working with an array this big, you should probably try to find a better solution to the problem. But that being said I'll still attempt to answer your question.
As mentioned in this article, there is a 2GB limit on any object in .NET, for all of x86, x64 and IA64:
As with 32-bit Windows operating systems, there is a 2GB limit on the size of an object you can create while running a 64-bit managed application on a 64-bit Windows operating system.
Also, if you define an array that is too big on the stack, you will have a stack overflow. If you define the array on the heap, it will try to allocate it all in one big contiguous block. It would be better to use an ArrayList, which has implicit dynamic allocation on the heap. This will not allow you to get past the 2GB, but will probably allow you to get closer to it.
I think the stack size limit will be bigger only if you are using an x64 or IA64 architecture and operating system. Using x64 or IA64 you will have 64-bit allocatable memory instead of 32-bit.
If you are not able to allocate the array list all at once, you can probably allocate it in parts.
Using an array list and adding 1 object at a time on an x64 Windows 2008 machine with 6GB of RAM, the most I can get the ArrayList to is size: 134217728. So I really think you have to find a better solution to your problem that does not use as much memory. Perhaps writing to a file instead of using RAM.
The array limit is, afaik, fixed as int32 even on 64-bit. There is a cap on the maximum size of a single object. However, you could have a nice big jagged array quite easily.
Worse: because references are larger in x64, for ref-type arrays you actually get fewer elements in a single array.
See here:
I’ve received a number of queries as to why the 64-bit version of the 2.0 .Net runtime still has array maximum sizes limited to 2GB. Given that it seems to be a hot topic of late I figured a little background and a discussion of the options to get around this limitation was in order.
First some background; in the 2.0 version of the .Net runtime (CLR) we made a conscious design decision to keep the maximum object size allowed in the GC Heap at 2GB, even on the 64-bit version of the runtime. This is the same as the current 1.1 implementation of the 32-bit CLR, however you would be hard pressed to actually manage to allocate a 2GB object on the 32-bit CLR because the virtual address space is simply too fragmented to realistically find a 2GB hole. Generally people aren’t particularly concerned with creating types that would be >2GB when instantiated (or anywhere close), however since arrays are just a special kind of managed type which are created within the managed heap they also suffer from this limitation.
It should be noted that in .NET 4.5 the memory size limit is optionally removed by the gcAllowVeryLargeObjects flag; however, this doesn't change the maximum number of elements per dimension. The key point is that if you have arrays of a custom type, or multi-dimensional arrays, you can now go beyond 2GB in memory size.
You don't need an array that large at all.
When your method runs into resource problems, don't just look at how to expand the resources, look at the method also. :)
Here's a class that uses a 3 MB buffer to calculate primes using the sieve of Eratosthenes. The class keeps track of how far you have calculated primes, and when the range needs to be expanded it creates a buffer to test another 3 million numbers.
It keeps the found prime numbers in a list, and when the range is expanded the previous primes are used to rule out numbers in the new buffer.
I did some testing, and a buffer around 3 MB is most efficient.
public class Primes {
    private const int _blockSize = 3000000;
    private List<long> _primes;
    private long _next;

    public Primes() {
        _primes = new List<long>() { 2, 3, 5, 7, 11, 13, 17, 19 };
        _next = 23;
    }

    private void Expand() {
        // Sieve the next block of _blockSize numbers, starting at _next.
        bool[] sieve = new bool[_blockSize];
        foreach (long prime in _primes) {
            // Mark multiples of the already known primes within the block.
            for (long i = ((_next + prime - 1L) / prime) * prime - _next;
                 i < _blockSize; i += prime) {
                sieve[i] = true;
            }
        }
        for (int i = 0; i < _blockSize; i++) {
            if (!sieve[i]) {
                // An unmarked number is prime; record it and mark its multiples.
                _primes.Add(_next);
                for (long j = i + _next; j < _blockSize; j += _next) {
                    sieve[j] = true;
                }
            }
            _next++;
        }
    }

    public long this[int index] {
        get {
            if (index < 0) throw new IndexOutOfRangeException();
            while (index >= _primes.Count) {
                Expand();
            }
            return _primes[index];
        }
    }

    public bool IsPrime(long number) {
        while (_primes[_primes.Count - 1] < number) {
            Expand();
        }
        return _primes.BinarySearch(number) >= 0;
    }
}
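A quick usage sketch of the class above:

    var primes = new Primes();
    Console.WriteLine(primes[99999]);           // the 100,000th prime: 1299709
    Console.WriteLine(primes.IsPrime(1299709)); // True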
I believe that even within a 64 bit CLR, there's a limit of 2GB (or possibly 1GB - I can't remember exactly) per object. That would prevent you from creating a larger array. The fact that Array.CreateInstance only takes Int32 arguments for sizes is suggestive too.
On a broader note, I suspect that if you need arrays that large you should really change how you're approaching the problem.
I'm very much a newbie with C# (i.e. learning it this week), so I'm not sure of the exact details of how ArrayList is implemented. However, I would guess that as you haven't defined a type for the ArrayList example, the array would be allocated as an array of object references. This might well mean that you are actually allocating 4-8GB of memory depending on the architecture.
In C# 2008, what is the Maximum Size that an Array can hold?
System.Int32.MaxValue
Assuming you mean System.Array, i.e. any normally defined array (int[], etc.): this is the maximum number of values the array can hold. The size of each value is only limited by the amount of memory or virtual memory available to hold them.
This limit is enforced because System.Array uses an Int32 as its indexer, hence only valid values for an Int32 can be used. On top of this, only positive values (i.e. >= 0) may be used. This means the absolute maximum upper bound on the size of an array is the absolute maximum upper bound on values for an Int32, which is available as Int32.MaxValue and is 2^31 - 1, or roughly 2 billion.
On a completely different note, if you're worrying about this, it's likely you're using a lot of data, either correctly or incorrectly. In this case, I'd look into using a List<T> instead of an array, so that you only use as much memory as needed. In fact, I'd recommend using a List<T> or another of the generic collection types all the time. This means that only as much memory as you are actually using will be allocated, but you can use it like you would a normal array.
The other collection of note is Dictionary<int, T> which you can use like a normal array too, but will only be populated sparsely. For instance, in the following code, only one element will be created, instead of the 1000 that an array would create:
Dictionary<int, string> foo = new Dictionary<int, string>();
foo[1000] = "Hello world!";
Console.WriteLine(foo[1000]);
Using Dictionary also lets you control the type of the indexer, and allows you to use negative values. For the absolute maximal sized sparse array you could use a Dictionary<ulong, T>, which will provide more potential elements than you could possibly think about.
Per MSDN it is
By default, the maximum size of an Array is 2 gigabytes (GB).
In a 64-bit environment, you can avoid the size restriction by setting the enabled attribute of the gcAllowVeryLargeObjects configuration element to true in the run-time environment.
However, the array will still be limited to a total of 4 billion elements.
Reference: http://msdn.microsoft.com/en-us/library/System.Array(v=vs.110).aspx
Note: here I am focusing on the actual length of the array, assuming that we have enough RAM.
This answer is about .NET 4.5
According to MSDN, the index for array of bytes cannot be greater than 2147483591. For .NET prior to 4.5 it also was a memory limit for an array. In .NET 4.5 this maximum is the same, but for other types it can be up to 2146435071.
This is the code for illustration:
static void Main(string[] args)
{
    // -----------------------------------------------
    // Pre .NET 4.5 or gcAllowVeryLargeObjects unset
    const int twoGig = 2147483591;   // magic number from .NET
    var type = typeof(int);          // type to use
    var size = Marshal.SizeOf(type); // type size
    var num = twoGig / size;         // max element count

    var arr20 = Array.CreateInstance(type, num);
    var arr21 = new byte[num];

    // -----------------------------------------------
    // .NET 4.5 with x64 and gcAllowVeryLargeObjects set
    var arr451 = new byte[2147483591];
    var arr452 = Array.CreateInstance(typeof(int), 2146435071);
    var arr453 = new byte[2146435071]; // another magic number

    return;
}
Here is an answer to your question that goes into detail:
http://www.velocityreviews.com/forums/t372598-maximum-size-of-byte-array.html
You may want to mention which version of .NET you are using and your memory size.
You will be stuck with a 2GB-per-object limit for your application though, so it depends on what is in your array.
I think it is tied to your RAM (or rather, virtual memory) space, with the absolute maximum constrained by your OS version (e.g. 32-bit or 64-bit).
I think if you don't consider virtual memory, it is int.MaxValue.