Our first goal was to look at no-op builds, where dune has nothing to do except build the list of rules and check that everything is up to date. This is a necessary step to improve the workflow of our developers when compiling large projects. Since we focused on no-op builds, we haven't really looked at how much time is spent in external processes such as ppx or ocamlopt.
We haven't tried magic-trace, first because our dev servers do not have Intel CPUs, and second because it seems most useful for tracing very specific, short-running parts of a program. That would require us to first identify the parts of the dune codebase that could be the culprit.
The investigation didn't go very far, as we quickly learnt that Jane Street is already working on optimising some of dune's internals. We decided to pause our work until we see what they release.
In general, I would say that it's hard to understand what makes a build slow (when it's not a no-op): time spent in dune internals, time spent running preprocessing, time spent running the OCaml compiler, time during which parallelism is very low because a large module is a bottleneck, extra recompilations caused by a non-optimal project layout, linking of the final binary, … In our specific case I can't say whether halving the time spent running ppx would have any noticeable impact on the compilation time as seen by the developers.
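One way to get a rough split between those buckets (something we did not pursue here) would be to aggregate the trace that `dune build --trace-file trace.json` produces. The sketch below is only a starting point under assumptions: it assumes the trace follows the Chrome trace-event format, with events carrying `name` and `dur` (microseconds) fields, which may differ between dune versions.

```python
#!/usr/bin/env python3
"""Rough breakdown of where build time goes, from a dune trace file.

Assumes `dune build --trace-file trace.json` was run beforehand and that
the output is a (possibly unterminated) JSON array of Chrome trace
events with "name" and "dur" fields. Treat this as a sketch, not a
reference implementation.
"""
import json
import sys
from collections import Counter


def main(path: str) -> None:
    with open(path) as f:
        raw = f.read().rstrip().rstrip(",")
    # The trace array may be left unterminated while dune is running.
    if not raw.endswith("]"):
        raw += "]"
    events = json.loads(raw)

    total_by_name = Counter()
    for ev in events:
        # Sum durations per event name; "dur" is in microseconds in the
        # Chrome trace-event format.
        if "dur" in ev:
            total_by_name[ev.get("name", "<unnamed>")] += ev["dur"]

    # Print the 15 biggest time sinks, converted to seconds.
    for name, dur in total_by_name.most_common(15):
        print(f"{dur / 1e6:8.2f}s  {name}")


if __name__ == "__main__":
    main(sys.argv[1] if len(sys.argv) > 1 else "trace.json")
```

Even a rough aggregation like this would at least tell us whether ppx, ocamlopt, or dune itself dominates before deciding where to optimise.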