Original bug ID: 4868
Reporter: elehack
Assigned to: @damiendoligez
Status: assigned (set by @damiendoligez on 2011-05-31T14:32:00Z)
Resolution: open
Priority: normal
Severity: feature
Version: 3.11.1
Category: otherlibs
Monitored by: @ygrek @hcarty pascal_cuoq gerd
Bug description
Currently, Bigarray uses a compiled-in constant, CAML_BA_MAX_MEMORY (set to 256 MiB), as the maximum size for allocating bigarray custom blocks. While this is fine for low- to medium-memory machines, it causes excessive garbage-collector thrashing when using large bigarrays on high-memory machines (e.g. arrays of several gigabytes). I propose the following changes:
change CAML_BA_MAX_MEMORY from a constant to a variable
provide a function to set this variable to a different value if it would help program performance
I have made this change in my local copy of OCaml and set the maximum memory to 12 GiB; the GC is substantially less active on the same code (which does not perform heavy allocation, but primarily manipulates the bigarrays).
I'm also thinking about auto-configuring this. CAML_BA_MAX_MEMORY could be set to 3/4 of the available RAM, where the amount of RAM is calculated by a number of methods: (1) if there is a ulimit, take this (getrlimit for RLIMIT_DATA); (2) otherwise use an OS-specific method (see the attached script available_ram.sh to get an idea).
I also think that mmapped files (and other memory not allocated by Bigarray directly) should not count here at all. There could be another mechanism here, e.g. giving the GC a one-time boost when the user indicates that the file should be unmapped as soon as possible (or just auto-boosting whenever a dead mapping is found, to get some automatic speed regulation). Rationale: memory-mapped files do not allocate memory, only address space. Nowadays you can map giant files, and it makes no sense to include this space in OCaml's memory-management calculations.
File attachments
available_ram.sh