Dune memory consumption with multiple jobs

Hi everyone,

I’m trying to build the OCaml portion of the project at https://github.com/hacspec/hax, but unless I force dune to run with a single job, my laptop inevitably runs out of its 16 GB of memory. I could probably live with very long build times, but the problem affects the editor tooling as well, making it unusably slow or causing processes to be killed when memory runs out. (I’ve tried Tuareg for Emacs and the OCaml Platform extension for VS Code.)

I’m using dune 3.11.1 and OCaml 4.14.1 on x86 Linux. My main question: is there anything I could try besides limiting the number of jobs to one wherever I can? (A sketch of what I mean by that is below.)
I haven’t heard of the same problem from other users working on the project on macOS, so maybe this is platform-specific?
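
For reference, limiting jobs globally looks roughly like this — a sketch only, assuming dune’s default config location on Linux and that I have the stanza syntax right:

```
; ~/.config/dune/config -- sketch only; path and stanza are assumptions
(lang dune 3.11)
(jobs 1)
```

The per-invocation equivalent is dune build -j 1.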

Thanks!
Jonas

I don’t have anything concrete to suggest, but it may be worth opening an issue over at Issues · ocaml/dune · GitHub with all the details. I think the upstream devs would be interested in knowing more about a build that causes an OOM.

Cheers,
Nicolas

See Improve setup and dependencies by W95Psp · Pull Request #198 · hacspec/hax · GitHub: high memory usage has already been reported there, and there is a commit that tunes the GC parameters to reduce it.
In addition to setting the o parameter (space_overhead), I suggest also trying O (max_overhead), as documented at OCaml - The runtime system (ocamlrun).
In oxenstored I’ve only set O=120; YMMV.

The defaults are usually too high: max_overhead defaults to 500, i.e. compaction is only triggered at a 5x free-to-live memory ratio. That is fine if that OCaml program is the only one you’re running, but if multiple programs have to share memory, a smaller compaction limit is better.
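
For concreteness, here is a minimal OCaml sketch (not taken from the hax PR; the values are illustrative only) of lowering both parameters from inside a program. The same settings can also be passed to any OCaml executable at startup via OCAMLRUNPARAM, e.g. OCAMLRUNPARAM='o=80,O=120', as described on the ocamlrun page linked above.

```ocaml
(* Sketch only: shrink the major GC's space_overhead ("o") and
   max_overhead ("O"); 80 and 120 are illustrative, not recommendations. *)
let () =
  Gc.set
    { (Gc.get ()) with
      space_overhead = 80; (* default 120: % of wasted heap the major GC tolerates *)
      max_overhead = 120   (* default 500: wasted/live ratio (in %) that triggers compaction *)
    }
```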
