Gc.compact vs. Gc.max_overhead #3305
Comments
Comment author: administrator
Yes, there is a problem with the automatic triggering of compaction. The bad news is, I don't really have a good idea of what's going on. -- Damien (* fails *) (* works *) let x = ref "";;
Comment author: administrator On Tue, 30 Apr 2002, Damien Doligez wrote:
Indeed, it is troubling. I had an example (in a complicated program) where the insertion of the Martin
Comment author: administrator
I think I've managed to put together something that works better for triggering -- Damien
Comment author: administrator At 19:06 28/05/02 +0200, you wrote:
Thank you very much. I'll try that as soon as I have a bit of time and I Martin
Original bug ID: 1111
Reporter: administrator
Status: closed
Resolution: fixed
Priority: normal
Severity: minor
Category: ~DO NOT USE (was: OCaml general)
Bug description
Full_Name: Martin Jambon
Version: 3.04
OS: i686/linux
Submission from: pc-bioinfo1.ibcp.fr (193.51.160.63)
Setting Gc.max_overhead to 50 triggers heap compactions as reported by
Gc.print_stat.
In the program I'm developing, this works fine until a certain point, after which no
compaction is triggered anymore, resulting in steady memory growth.
Periodic calls to Gc.compact fix the problem, i.e. they reduce the memory footprint to a
normal size (here, 10 MB).
The program makes a lot of memory allocations and deallocations.
The program has the following structure:
for each file, do
input data from file
extract some interesting information from the data
done
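The structure above, together with the periodic-Gc.compact workaround mentioned earlier, can be sketched in OCaml. This is a hypothetical reconstruction, not the reporter's actual program: the file list, the processing step, and the compaction interval of 1000 are all placeholders.

```ocaml
(* Sketch of the reported setup: request automatic compaction via
   max_overhead, then additionally force Gc.compact periodically as
   a workaround for the compactions that stop being triggered. *)
let () =
  (* Same setting as in the report: compact when wasted memory
     exceeds 50% of live data. *)
  Gc.set { (Gc.get ()) with Gc.max_overhead = 50 };
  let files = [] (* placeholder: list of input files *) in
  List.iteri
    (fun i file ->
       (* input data from file, extract some interesting information *)
       ignore file;
       (* workaround: force a compaction every 1000 files *)
       if i mod 1000 = 0 then Gc.compact ())
    files
```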
Example of GC statistics:
Gc.print_stat output after the last automatic heap compaction:
(somewhere in the middle of the main loop, ~15000 files have been read,
representing a total amount of ~500 MB of disk space in the form of Marshal'ed
data)
minor_words: 22845867
promoted_words: 1578695
major_words: 104923528
minor_collections: 213
major_collections: 12697
heap_words: 9793536 (* 10 MB *)
heap_chunks: 34
live_words: 1703034
live_blocks: 360724
free_words: 745339
free_blocks: 8
largest_free: 256000
fragments: 11
compactions: 67
Gc.print_stat output at the end of the program:
(just before the end of the main loop, 35000 files have been read)
minor_words: 22845867
promoted_words: 1930136
major_words: 227605106
minor_collections: 364
major_collections: 53380
heap_words: 193458176 (* 190 MB *)
heap_chunks: 728
live_words: 4020819
live_blocks: 763264
free_words: 44343702
free_blocks: 81982
largest_free: 36226
fragments: 23
compactions: 67
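The two snapshots above can be compared by computing the free-to-live ratio that, per the Gc documentation, max_overhead is checked against. The helper below is a sketch for diagnosing the report (the name `overhead_percent` is mine, not part of the Gc API); with the final statistics (free_words 44343702 vs. live_words 4020819), the ratio is far above the configured 50%, which is why the missing compaction looks like a bug.

```ocaml
(* Estimate wasted memory as a percentage of live data, the quantity
   that max_overhead compares against when deciding to compact. *)
let overhead_percent () =
  let s = Gc.stat () in
  if s.Gc.live_words = 0 then 0.
  else
    100. *. float_of_int s.Gc.free_words /. float_of_int s.Gc.live_words

let () =
  Printf.printf "current overhead: %.1f%% (compaction expected above 50%%)\n"
    (overhead_percent ())
```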
According to the documentation in module Gc (see below), this looks like a bug.
If not, maybe a warning should be added to the documentation.