Allocation of huge byte sequence raises Out_of_memory even when there is enough free memory #6990
Comments
Comment author: @xavierleroy

My setup is slightly different from yours (16 GB RAM + 16 GB swap), so I could not reproduce the out-of-memory condition, but I could reproduce enough to understand the issue.

When the OCaml GC does not have enough free heap space to satisfy an allocation, it requests more memory from the OS (using malloc() or mmap()). The amount requested from the OS is the size of the thing to allocate, rounded up, then increased by a certain "overhead" factor. The overhead is 80% by default. You can observe the amount of memory requested from the OS by running with OCAMLRUNPARAM=v=4:

(* First version *)

Note that in both cases n2's allocation does not request more memory, as it can be satisfied from the overhead of n1's allocation. In both cases you're getting close to 16 GB requested, and even closer in the second case, which is probably why you're getting an out-of-memory error from the OS.

The overhead percentage can be controlled by the user (see the description of ocamlrun and of the Gc module in the OCaml reference manual), in which case less memory is requested from the OS:

(* First version *)

(* Second version *)

Bottom line: the GC is working as designed here, but the default GC parameters are not ideal for large allocations. One possibility would be to retry the OS allocation with zero overhead in case the first allocation with the default overhead fails. I'll let our GC experts comment on this.
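A minimal sketch of lowering that overhead from user code, assuming the "overhead" described above corresponds to the `space_overhead` field of the `Gc` module's control record (the value 20 is an arbitrary illustration, not a recommendation):

```ocaml
(* Sketch: reduce the GC's space overhead before a very large
   allocation, so that heap-growth requests to the OS ask for less
   extra headroom. 20 is an arbitrary illustrative value. *)
let () =
  let params = Gc.get () in
  Gc.set { params with Gc.space_overhead = 20 };
  (* Confirm the setting took effect. *)
  assert ((Gc.get ()).Gc.space_overhead = 20)
```

The same setting can be applied without recompiling via the runtime parameter `o`, e.g. `OCAMLRUNPARAM=o=20 ./prog`.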
This issue has been open one year with no activity. Consequently, it is being marked with the "stale" label. What this means is that the issue will be automatically closed in 30 days unless more comments are added or the "stale" label is removed. Comments that provide new information on the issue are especially welcome: is it still reproducible? did it appear in other contexts? how critical is it? etc.
@damiendoligez Are we likely to adjust this behaviour? If not, this can probably be closed.
I don't think we want to change anything. When you are allocating a byte sequence that is more than half the machine's memory, it is probably better to switch to a bigarray that gives you better control of memory management. |
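A minimal sketch of the suggested alternative: a `Bigarray` of characters is allocated outside the OCaml heap (via malloc), so the GC's overhead factor is not applied to its storage. The size here is tiny for illustration; in the reported case it would be the multi-gigabyte `n`.

```ocaml
(* Sketch: use a Bigarray instead of Bytes for a huge byte buffer.
   The data lives outside the OCaml heap, so allocating it does not
   trigger a GC heap-growth request with the overhead factor. *)
let () =
  let n = 1024 in  (* small stand-in for the ~9.4 GB of the report *)
  let buf = Bigarray.Array1.create Bigarray.char Bigarray.c_layout n in
  Bigarray.Array1.fill buf 'a';
  assert (Bigarray.Array1.get buf 0 = 'a')
```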
Original bug ID: 6990
Reporter: radekm
Assigned to: @damiendoligez
Status: assigned (set by @mshinwell on 2016-12-08T09:14:59Z)
Resolution: open
Priority: normal
Severity: minor
Platform: 64-bit
OS: openSUSE
OS Version: 13.1
Version: 4.02.3
Category: runtime system and C interface
Monitored by: braibant @gasche @diml
Bug description
I have 14 GB of free memory (16 GB total, 18 GB total with swap) and the following code works and uses less than 13 GB of memory.
let _ =
  let n = 9 * 1000 * 1000 * 1000 + 300000000 in
  let n2 = 3 * 1000 * 1000 * 1000 in
  (*
  let n = 9 * 1000 * 1000 * 1000 + 400000000 in
  let n2 = 1 in
  *)
  let bs = Bytes.create n in
  let bs2 = Bytes.create n2 in
  for i = 0 to n - 1 do
    Bytes.set bs i 'a'
  done;
  for i = 0 to n2 - 1 do
    Bytes.set bs2 i 'A'
  done;
  Bytes.set bs 0 'b';
  Bytes.set bs2 0 'B'
But when I redefine n and n2 (to 9 * 1000 * 1000 * 1000 + 400000000 and 1 respectively), then I immediately get
Fatal error: exception Out_of_memory
even though I have enough free memory to hold the byte sequence.
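Some back-of-the-envelope arithmetic, assuming the 80% overhead factor described in @xavierleroy's comment above (i.e. the OS request is roughly the byte-sequence size times 1.8):

```ocaml
(* Rough arithmetic for the failing case, assuming an 80% overhead
   factor on top of the requested byte-sequence size. *)
let () =
  let n = 9 * 1000 * 1000 * 1000 + 400000000 in  (* 9.4e9 bytes *)
  let requested = float_of_int n *. 1.8 in       (* ~16.9e9 bytes *)
  (* The request exceeds the machine's 16 GB of RAM, which is
     consistent with the OS refusing the allocation. *)
  assert (requested > 16e9)
```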