The OCaml Platform - a vehement dissent

I just read the latest version of the roadmap. I am appalled.

G1) Dune is The Frontend of The OCaml Platform

And the following is essentially a Dune tutorial.

This is not even wrong. I think you have two options: 1) just rename the language to Dune or D'ocaml or whatever; or 2) rename your project to “The OCaml-Dune Platform”.

3 Likes

I’m confused by your quip: do you also call the tooling for other languages “Rust-Cargo”, “.NET-MSBuild” and “Java-Maven” just because their compilers can be driven without their most popular build tools?

The fact is, most language ecosystems now have some go-to solution for an end-to-end toolchain: compiler, build system, library repository, formatter, editor integrations (or dedicated IDEs), etc. And it doesn’t necessarily have a distinct name, because it’s usually the (only) toolchain worth using for most users, except for those experienced enough, and motivated enough, to go off the blessed path. So OCaml is finally putting one together as well, and so far it’s the only significant attempt we’ve got, so for now it’s The (only) OCaml Platform.

I understand Dune has many quirks, and I would prefer if build tools in general weren’t so language specific (and editor integration not so build tool specific), but the toolchain involving Dune is so far the only one that matches the experience of working with other languages for most of us.

7 Likes

If dune were called ocaml and did exactly what it says in the roadmap, would your dissent still hold?

1 Like

That is plainly not true. In fact, most of the most-used languages do not have that. The C Platform? The Java Platform?

Go envy (and Rust envy) are pathologies. They should be treated, not encouraged.

Yes. It is a huge mistake to bless one (not especially good) build system as The One True Build System for OCaml. All that does is shrink the space of possible improvements.

However, there is nothing wrong with elaborating a Dune-based “platform”. That would be all to the good. What’s wrong is promoting it as the One True Solution for OCaml.

First, my bona fides: I have always been a vehement critic of dune, and have always used Makefiles (OK, once I gave up on oasis). I use makefiles + ya-wrap-ocamlfind + not-ocamlfind and it’s perfectly cromulent. So I’m not here to defend dune: quite the opposite.

BUT. BUT. The authors of that roadmap are correct, that if you want to make headway with the Great Unwashed, you must choose a single blessed platform (build system, packaging, etc) and stick with it and advertise it, etc. Proposing a dozen (or even TWO) different build-systems is just going to confuse newbies, and that will be enough to drive them away.

So: if OCaml wants to attract new users, this roadmap is (unfortunately for me) the right way to go about it. B/c whether I like it or not, Dune has the most traction. No other build-system is a contender, and that doesn’t change even if I stipulate that my preferred method is better. [I mean, obviously I think it is, but (heh) that’s a minority opinion]

I see nothing wrong with the roadmap’s focus on dune.

ETA: If that had been the “Roadmap for the OCaml language”, and if it had proposed dune as the only build-system, to the exclusion of all other possibilities, then I’d have gotten a bit bent-out-of-shape. I’m willing to bet that I can use Rust without using cargo. Similarly, if I can use ocaml without using dune, I’m good.

16 Likes

Maybe an exaggeration but only a slight one. I readily admit that cabal vs stack is strongly implicated in keeping me away from Haskell.

I wouldn’t be so sure. Even if that’s true, as things stand there is a huge difference in documentation: the reason it is clearly possible to do what you do and roll your own build system for OCaml is that the command-line details of all the core tools are meticulously documented and not hidden from the unwashed. And I think this is the potential downside of a prominent “platform” project: that this documentation would gradually start to be neglected, because “who needs it”.

2 Likes

As I wrote, if this started to happen, I would take great exception. But I have enough trust in the leaders of the OCaml project to believe that they would not let this happen: that they understand that OCaml as a language needs to be open to heterodox thinking. And if it did start to happen, I would start making noise, pointing out the horrors that happened in Java, as this very sort of thing was tried (and failed, and failed, and failed again).

2 Likes

You can compile Rust without Cargo. That’s what they do in Linux, Firefox, and, afaict, in every project using Rust to replace C via the strangler pattern: you have to integrate Rust into the existing build system. It’s just a lot uglier this way.

I hope we can continue to call ocamlc/ocamlopt, but it’s really not what most people should need to do. Neither is writing makefiles; dune is infinitely superior to that if you haven’t absorbed 25 years of knowledge on the fine details of .cmxa vs .cmx vs .cmxs vs .o, and how early .cmi should be generated.

16 Likes

I’m not going to convince you to try Makefiles, but hey, it’s all good, I don’t need to try. But what you write is simply not true: it’s pretty straightforward to put all that logic you describe above into a shared Makefile (viz. https://github.com/camlp5/pa_ppx_regexp/blob/master/config/Makefile.shared ) and then the Makefile for the actual library can be pretty simple ( https://github.com/camlp5/pa_ppx_regexp/blob/master/pa_regexp/Makefile ). I mean, “make depend” is a thing.

3 Likes

I’m sorry, you’re showing me a 100-line Makefile which, as far as I can tell:

  • builds objects directly in the source tree
  • doesn’t handle transitive dependencies (you need to list them all explicitly in PACKAGES right?), nor dependencies that have not been installed via ocamlfind (routinely happens in dune projects)
  • relies on obscure wrappers to ocamlfind(?)
  • would let me write META files by hand
  • has some gnarly conditionals

Compared to what dune does out of the box, that’s a lot of arcana for anyone but a seasoned Unix programmer. Seriously. If the comparison point is cargo run, this is not a valid alternative.

I’m genuinely glad this still works for you, and for other projects that have successfully relied on Make for decades. But this cannot be the recommended way for any beginner.

10 Likes
  • builds objects directly in the source tree

Nothing wrong with that. It’s what “make clean” is for.

  • doesn’t handle transitive dependencies (you need to list them all explicitly in PACKAGES right?), nor dependencies that have not been installed via ocamlfind (routinely happens in dune projects)

Uh, sure it does. ocamlfind knows to recursively include all transitive dependencies.
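
For example (a hypothetical one-file program), this is all it takes, and ocamlfind chases the META files’ requires fields recursively:

    # hypothetical single-module program using yojson
    $ ocamlfind ocamlopt -package yojson -linkpkg main.ml -o main
    # -package yojson also brings in whatever yojson's own META lists
    # under "requires", and so on transitively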

  • relies on obscure wrappers to ocamlfind(?)

Fair cop. OTOH, not-ocamlfind does a very simple thing: it supports “reinstall-if-diff” (reinstall a package if its disk contents have changed). ya-wrap-ocamlfind lets you put compile-flags into a comment at the top of an OCaml source file.

  • would let me write META files by hand

Not sure I follow. I guess you’re saying that it’d be better to have dune write them for me. I guess, sure. They’re not that tough to write.
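
For what it’s worth, a minimal hand-written META for a hypothetical package looks something like this:

    # META for a hypothetical package "mylib"
    description = "my little library"
    version = "0.1"
    requires = "yojson"
    archive(byte) = "mylib.cma"
    archive(native) = "mylib.cmxa"

That’s the whole file for the common case.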

  • has some gnarly conditionals

Uh, what conditionals? If you mean in “Makefile.shared”, that’s the file you’re not supposed to have to look inside.

But sure, this isn’t designed for a wet-behind-the-ears programmer who doesn’t know how to program quicksort. Sure. My point is, this Makefile (the second one) doesn’t do any of the things you complained about. And I use this same scheme in a large number of projects (some of them pretty large), sans problems.

ETA: I forgot to mention the two reasons I do things this way:

  1. Dune is opinionated. If you step outside the way that dune wants you to work, you’re stuck. So for instance, if you have a large body of C code, you can’t use dune to build it. Instead, you kludge it in your makefile. Woo-hoo. Makefiles don’t have that problem.

  2. in Dune, sub-modules (libraries, etc) are not reified as findlib modules. So the way that you build a test inside Dune control is different from the way you do it outside Dune control. With the Makefiles I shared, subdirectories map to findlib modules that are installed in a “local-install” directory, and are used from there to build other libraries, and also to build tests (or executables). And then at the end, some subset of those findlib modules (typically all of them) are copied to the global findlib location (and their META files are merged with rewriting to produce the final META file).

This second point has always bothered me about Dune: it’s clear that Dune wants to substitute its own idea for each and every thing, even if there was a perfectly cromulent version of that thing already.
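
The local-install mechanics, by the way, are plain findlib, nothing exotic. Roughly the idea (a sketch with hypothetical paths, not a copy of my Makefiles) is:

    # install a sub-package into a project-local findlib tree
    $ OCAMLFIND_DESTDIR=$PWD/local-install/lib ocamlfind install pa_ppx_base META *.cm* *.a

    # make that tree visible to later builds, tests, and the toplevel
    $ export OCAMLPATH=$PWD/local-install/lib
    $ ocamlfind list | grep pa_ppx_base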

Just a quick point before going to bed:

  1. in Dune, sub-modules (libraries, etc) are not reified as findlib modules. So the way that you build a test inside Dune control is different from the way you do it outside Dune control. With the Makefiles I shared, subdirectories map to findlib modules that are installed in a “local-install” directory, and are used from there to build other libraries, and also to build tests (or executables). And then at the end, some subset of those findlib modules (typically all of them) are copied to the global findlib location (and their META files are merged with rewriting to produce the final META file).

Any dune library with a public name like foo.bar.yolo will end up existing in findlib as foo.bar.yolo, afaict. You can #require it just fine. In fact, my point about META files above is that I have some projects with dozens of dune sub-libraries (it’s easy to write smaller, more focused sub-libraries this way) with very little effort.
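
Concretely, a stanza like this (hypothetical names, assuming a package foo is declared in dune-project) is all it takes:

    ; yolo/dune -- a sub-library of the hypothetical package "foo"
    (library
     (name yolo)
     (public_name foo.bar.yolo))

and once the package is installed, in the toplevel:

    # #require "foo.bar.yolo";;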

Cheers! :slight_smile:

1 Like

Most of us using dune only care about the in-dune workflow. We run dune test and it runs the test suite. We run dune test -w and it runs the suite, goes into watch mode, and re-runs it on file change. We don’t need or want to build tests ‘outside dune control’ because it’s just not relevant.
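
The entire setup behind that is typically one stanza (test name hypothetical):

    ; tests/dune
    (test
     (name test_mylib)
     (libraries mylib))

dune test builds test_mylib.ml against the library and runs it; dune test -w does the same and keeps watching for changes.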

2 Likes

That’s true after it’s installed. But I mean during the build.

One of the things I do all the time is copy out a test from a package and build it outside, as a way of starting to use that package. That’s not straightforward when Dune doesn’t internally name intermediate products with findlib module-names.

You mean build the test outside of its project directory, and not using dune? If I understand you correctly, you are just saying that projects which use dune don’t support non-dune workflows? Isn’t this rather an expected situation? If you have a blessed toolchain in the platform, wouldn’t you expect it to have a closed-world assumption? I don’t know any programming language-specific build system that doesn’t have this assumption. E.g. Rust cargo, Scala sbt, Clojure Leiningen, Elixir mix, etc.

2 Likes

So here’s an example of what I mean: I’m starting to use mdx to write tests and documentation. I want those tests to run under build-system control (of course). Here’s an example: https://github.com/camlp5/pa_ppx_static/blob/master/tests/toplevel_test.asciidoc#in-the-ocaml-toplevel

I’ve pointed directly at the spot with the toplevel directives to load the required findlib modules. One of those modules, pa_ppx_static, is built by this opam package, and I want to use it in this test (for obvious reasons). But I want this test to also be documentation, showing how to use this module. So I want the package-naming to be the same whether I’m inside the build, or outside it. As it turns out, that’s hard to do when there are a bunch of findlib modules being built (e.g. pa_ppx.deriving, pa_ppx.import, etc). So inside the build of pa_ppx I name them without the “.”, and outside, I name them with the “.”. That is, inside the build, it’s pa_ppx_import and outside, it’s pa_ppx.import.
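
To spell the naming discrepancy out (a sketch using the names above, not a quote from the linked file):

    (* inside the pa_ppx build, the test's toplevel block loads: *)
    # #require "pa_ppx_import";;

    (* the same test, run against the globally installed package, would load: *)
    # #require "pa_ppx.import";;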

BUT, aside from this trivial naming issue: in the opam package pa_ppx (which has 52 findlib submodules!), all of the submodules are built in their own directories and installed into a local-install directory, from which they can be used. So for instance, the module pa_ppx_base is built from the base directory, and then is used by pa_ppx_deriving (built from the pa_deriving directory). Make a change to base and build that, and when you go to make pa_deriving, it’ll recompile. And why? B/c “make depend” picks up the dependency on other sub-modules in the opam package.

So what you get is that you can treat each submodule as an independent project, except that when you do a toplevel “make”, the ones that need to get recompiled, get recompiled. And in tests, you name your packages nearly as if they were installed globally.

I don’t know whether other toolchains have an open or closed-world assumption, but I do know that when the toolchain is “make”, it’s usually straightforward to pull out an (in-build-system) build-step and figure out how to run it “outside the build system”. That’s quite, quite, quite, quite not the case with dune. I’ve often reverse-engineered dune build-logs and figured out how to run something outside of dune, and it’s always a complete PITA.

For instance, I figured out how ppx_expect_test worked, and it was a horrorshow. Just all sorts of weird shit going on. And for no good reason, either.

Dune could really have been better if it had made a commitment that all the build-commands it emitted were comprehensible outside the build-system. Ah, well. Heck, it could have been better if it compiled the dune-files of a project into a toplevel Makefile, and then just invoked make. Again, ah, well.

I see that you interpreted this roadmap as discouragement for users and implementers of other, yet-to-be-written build tools.

I see the roadmap mostly as encouragement of the existing dune users and the community around it.

Since it’s highly speculative to prefer potential improvements of some other tools that don’t even exist yet, I prefer to care about people who already do tremendous work to improve the OCaml community today, so I’m fully in favour of this roadmap :yellow_heart:


As a side note, I’ve been in the Haskell community for 8 years, and I watched in real time how damaging it was for the entire ecosystem to have multiple competing build tools.

Sure, each tool had its own advantages and became better due to competition with each other, but in the end, the entire community suffered a great deal.

15 Likes

The real problem is that there are two non-working package managers (and Nix isn’t a solution, but a problem). And Stack is actually just a frontend to Cabal, so you can use both of them to your advantage (writing the cabal file by hand and not using HPack).

While Opam is great, I can’t say that I like that Dune uses a lockfile, or the direction it is heading in (with watch mode and clients), trying to be a second Cargo (you can’t compare with Go; that would mean a single binary named ocaml that does everything).