From: akalin@a...
Subject: Array 4 MB size limit
I'm running into cases where the 4 MB limit on arrays is becoming a 
problem.  Lists are much slower and even cause segfaults for me on 
the same data set, and Bigarrays are only a partial solution, since 
I'd like to be able to store arbitrary types in arrays (not just 
numbers).
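For what it's worth, Bigarray does get past the limit for numeric data, since the payload lives in malloc'ed memory outside the OCaml heap; a quick sketch (the 10-million-element size is just an arbitrary value above the 32-bit limit):

```ocaml
(* A Bigarray of 10 million float64s: well above the 32-bit
   Sys.max_array_length of 4_194_303 elements, yet fine here,
   because the data is stored outside the OCaml heap. *)
let big =
  Bigarray.Array1.create Bigarray.float64 Bigarray.c_layout 10_000_000

let () =
  Bigarray.Array1.set big 9_999_999 2.5;
  Printf.printf "last element = %f\n" (Bigarray.Array1.get big 9_999_999)
```

But of course this only works for the numeric element kinds Bigarray supports, which is exactly my problem.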

I was greatly surprised to find there was such a low limit on 
arrays.  Is there a reason for it?  Will this limit ever be raised?
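For reference, the limits are easy to query from the Sys module; they depend on the build's word size:

```ocaml
(* Print the build-dependent limits. On a 32-bit runtime
   Sys.max_array_length is 4_194_303 (2^22 - 1); on a 64-bit
   runtime it is far larger than any practical array. *)
let () =
  Printf.printf "word size:        %d bits\n" Sys.word_size;
  Printf.printf "max_array_length: %d\n" Sys.max_array_length
```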

Is the limit on the number of elements or on the total size?  The 
wording of Sys.max_array_length implies the former, but the fact that 
the limit is halved for floats implies the latter.  If I had a record 
type with 5 floats, would the limit then be Sys.max_array_length / 10? 
Is there some sort of existing ArrayList module that works around this 
problem?  Ideally, I'd like to have something like C++'s 
std::vector<> type, which can be resized dynamically.  Do I have to 
write my own? :(
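In case it helps others, here is a minimal growable-array sketch along the lines of std::vector; the module and function names (Vec, push, get) are my own invention, not anything in the stdlib:

```ocaml
(* A minimal growable array with amortized O(1) push via capacity
   doubling. The dummy value fills unused slots, avoiding any
   Obj-level tricks, at the cost of needing one witness value of
   the element type. *)
module Vec = struct
  type 'a t = { mutable data : 'a array; mutable len : int; dummy : 'a }

  let create dummy = { data = Array.make 8 dummy; len = 0; dummy }

  let push v x =
    if v.len = Array.length v.data then begin
      (* Full: double the capacity and copy the live prefix over.
         Still ultimately bounded by Sys.max_array_length. *)
      let bigger = Array.make (2 * Array.length v.data) v.dummy in
      Array.blit v.data 0 bigger 0 v.len;
      v.data <- bigger
    end;
    v.data.(v.len) <- x;
    v.len <- v.len + 1

  let get v i =
    if i < 0 || i >= v.len then invalid_arg "Vec.get";
    v.data.(i)

  let length v = v.len
end

let () =
  let v = Vec.create 0 in
  for i = 0 to 99 do Vec.push v i done;
  Printf.printf "len = %d, v.(42) = %d\n" (Vec.length v) (Vec.get v 42)
```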

Also, the fact that using lists crashes on the same data set is 
surprising.  Is there a similar hard limit for lists, or would this 
be a bug?  Should I post a test case?
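One guess as to the crash (purely a hypothesis on my part): some stdlib list functions are not tail-recursive, so a long enough list can exhaust the stack, which in native code may look like a segfault rather than a clean Stack_overflow. A sketch of the pattern:

```ocaml
(* A deliberately non-tail-recursive traversal: each cons cell adds
   a stack frame, so a sufficiently long list exhausts the stack.
   This mimics non-tail-recursive stdlib functions such as the
   classic List.map. *)
let rec length_notail = function
  | [] -> 0
  | _ :: tl -> 1 + length_notail tl

let () =
  (* Build a 5-million-element list tail-recursively (safe)... *)
  let big =
    let rec build n acc = if n = 0 then acc else build (n - 1) (n :: acc) in
    build 5_000_000 []
  in
  (* ...then traverse it non-tail-recursively, which may overflow
     depending on the platform's stack limit. *)
  (try Printf.printf "length = %d\n" (length_notail big)
   with Stack_overflow -> print_endline "Stack_overflow");
  Printf.printf "short list: %d\n" (length_notail [1; 2; 3])
```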

Thanks!

Frederick Akalin