What is the purpose of the LongLength property for arrays in .NET? Using a standard integer for the length, you could accommodate up to 2 billion indices. Are there really people using .NET to maintain a single array with more than 2 billion elements? Even if each element were a single byte, that would still be 2 GB of data. Is it feasible to use such a large array in .NET?
-
It's very possible to have an array with more than 2 billion entries in a 64-bit scenario. LongLength is indeed meant to support such scenarios.
As to whether or not that is actually used: I can say with certainty that there is some customer, somewhere, who considers this a vital business need. Customers find uses for features that you never thought possible.
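To make the distinction concrete, here's a minimal sketch (plain C#; the array is deliberately tiny, since only the types of the two properties matter):

```csharp
using System;

class LengthVsLongLength
{
    static void Main()
    {
        int[] numbers = new int[1000];

        int length = numbers.Length;          // 32-bit count, capped at int.MaxValue
        long longLength = numbers.LongLength; // 64-bit count, meant for much larger arrays

        Console.WriteLine("Length:     {0}", length);     // 1000
        Console.WriteLine("LongLength: {0}", longLength); // 1000
    }
}
```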
-
For example, if you had a > 2 GB file and needed to read it all into memory at once, that would call for such an array. That isn't a recommended approach most of the time, but there could well be cases (on a powerful enough 64-bit system with plenty of memory) where it's required, perhaps for performance reasons.
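As a rough sketch of what that might look like (huge.dat is a hypothetical multi-gigabyte file, and the simple read loop is just for illustration):

```csharp
using System;
using System.IO;

class LoadWholeFile
{
    static void Main()
    {
        using (FileStream stream = new FileStream("huge.dat", FileMode.Open, FileAccess.Read))
        {
            // stream.Length is already a long, so the byte count itself is no problem;
            // it's the single array allocation that runs into the limit described
            // in the edit below.
            byte[] buffer = new byte[stream.Length];

            int offset = 0;
            while (offset < buffer.Length)
            {
                int read = stream.Read(buffer, offset, buffer.Length - offset);
                if (read == 0)
                    break; // unexpected end of file
                offset += read;
            }

            Console.WriteLine("Loaded {0:N0} bytes into a single array.", buffer.LongLength);
        }
    }
}
```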
Edit: Of course, it should be noted that as of CLR 2.0, an array larger than 2 GB isn't actually supported (the implementation of LongLength just casts Length to a long, and attempting to create a bigger array will fail)... but maybe Microsoft is planning to add support later?
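A minimal sketch of both points, assuming a runtime that still enforces the 2 GB per-object limit: for any array you can actually allocate, LongLength is simply Length widened to a long, and an allocation at the limit fails. (For what it's worth, .NET Framework 4.5 later added a gcAllowVeryLargeObjects configuration switch that lifts the 2 GB object-size limit in 64-bit processes, though the element count per dimension stays bounded by roughly int.MaxValue.)

```csharp
using System;

class TwoGigabyteLimit
{
    static void Main()
    {
        byte[] small = new byte[1000];

        // For any array that can actually be created, the two properties agree.
        Console.WriteLine(small.LongLength == (long)small.Length); // True

        try
        {
            // int.MaxValue bytes plus the object header is more than a single
            // object may occupy under the 2 GB limit, so this throws.
            byte[] huge = new byte[int.MaxValue];
            Console.WriteLine(huge.LongLength); // never reached
        }
        catch (OutOfMemoryException)
        {
            Console.WriteLine("An array this large isn't supported on this runtime.");
        }
    }
}
```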
-
Plus, LongLength returns the total number of elements across all the dimensions of the array, so even an array with "just" half a billion elements in one dimension can need a 64-bit count once the other dimensions multiply that total.
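As a small stand-in for that scenario, here's a sketch using a tiny rectangular array; the [500, 4] shape mirrors the half-billion-by-four idea at a size that can actually be allocated:

```csharp
using System;

class MultiDimensionalCount
{
    static void Main()
    {
        int[,] grid = new int[500, 4];

        Console.WriteLine(grid.GetLength(0)); // 500  (size of the first dimension)
        Console.WriteLine(grid.GetLength(1)); // 4    (size of the second dimension)
        Console.WriteLine(grid.Length);       // 2000 (product of all dimensions)
        Console.WriteLine(grid.LongLength);   // 2000 (same total, as a long)

        // Scaled up to something like new int[600000000, 4], the total would be
        // 2.4 billion elements -- more than an int can report, which is exactly
        // where a 64-bit LongLength becomes necessary.
    }
}
```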
Kibbee: Isn't that still 2 billion elements in total?
Stefan: Yes, of course. What I mean is that if you think "why would anyone want to handle 2 billion elements", it could seem more reasonable if it was just half a billion elements but with some attached properties in related dimensions.
-
There's a school of thought known as the "0, 1 or N" school which believes that you should have either:
- none of something;
- one of something; or
- any number of something, as resources permit.
In other words, don't set arbitrary limits if you don't have to. Arbitrary limits have given us such monstrosities as:
- the 640K limit in early PCs.
- buffer overflow vulnerabilities.
- the hideous LBA disk addressing scheme.
Keep in mind that even two billion 64-bit integers only take up 17,179,869,184 bytes of the 18,446,744,073,709,551,616 bytes of 64-bit address space available. That's less than 1 thousand-millionth, or 10^-9, of the total, so you could have over a billion of these massive arrays before running out of address space.
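A quick sanity check of that arithmetic (the 17,179,869,184 figure corresponds to 2^31 eight-byte integers):

```csharp
using System;

class AddressSpaceMath
{
    static void Main()
    {
        long arrayBytes = 2147483648L * 8;         // 17,179,869,184 bytes (16 GiB)
        double addressSpace = Math.Pow(2.0, 64.0); // 18,446,744,073,709,551,616 bytes

        Console.WriteLine("Fraction of the address space: {0:E2}", arrayBytes / addressSpace); // about 9.3E-10
        Console.WriteLine("Arrays that would fit:         {0:N0}", addressSpace / arrayBytes); // about 1.07 billion
    }
}
```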
That's forward thinking. :-)
TraumaPony: Also known as 1 billionth...
paxdiablo: Yeah, I thought of that, but I can never remember whether there's a discrepancy between the US and European billion (one being 10^9, the other 10^12). I couldn't be bothered Googling it, so I took the safe option.