Question: how hard would it be for opam and/or dune to be able to install multiple versions of the same package in the same switch? Currently, of course, a switch can contain only a single version of a package. But is there any inherent reason why this should be the case? It seems to me that a switch doesn’t really need to care about constraint solving; it’s just a store of downloaded packages/executables/etc. What if we did package constraint solving at the project level instead of at the switch level, and then compiled each project against the solved versions of the packages in the switch? [EDIT: the solved constraints can be stored in the project’s lockfile]
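To make the idea concrete, here is a sketch of what such a project-level lockfile could record after solving, in the spirit of the existing opam lock plugin (the package names and versions below are purely illustrative):

```
# myproject.opam.locked -- hypothetical excerpt after project-level solving
depends: [
  "dune" {= "3.16.0"}
  "lwt" {= "5.7.0"}
  "yojson" {= "2.2.2"}
]
```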
At this point you may be wondering: why not just create a local switch for each project, so that package constraints are solved at the project level? And sure, that is possible. But switches are expensive. They have to download and build the compiler, the base package set, editor support tooling, and so on. They also take up a lot of disk space. Why keep doing this over and over again if we can set up a switch with a reasonable compiler version once, then keep reusing it with most or all of our projects?
I also feel this would be useful to users: being able to install several versions of one package at the same time.
During compilation, packages should use either the latest version they are compatible with, or the exact version they depend on if specified.
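In opam's dependency syntax, that policy could be expressed roughly like this (a hedged sketch; the package names and versions are invented):

```
depends: [
  "yojson" {>= "2.0" & < "3.0"}  # compatible range: use the newest version that fits
  "lwt" {= "5.7.0"}              # exact constraint: only this version may be used
]
```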
This is something which the experimental Dune package management feature may well be able to do (at least in the future; I don't believe it's on the present "MVP" roadmap).
For opam switches, which are really (shh, don't tell anyone I said it🤫) a thin wrapper around findlib installations, it's a bit harder to see how that shift could be made while remaining compatible with the past. Essentially, if an opam switch has multiple versions of a package, what does ocamlfind list (and therefore ocamlfind ocamlopt -package <library>) mean?
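To illustrate the ambiguity (output abbreviated, and the versioned-package spelling below is hypothetical; nothing like it exists in findlib today):

```
# today a findlib package name maps to exactly one installed version
$ ocamlfind list | grep lwt
lwt                 (version: 5.7.0)

# with lwt 5.6.1 and 5.7.0 both installed, which should this pick?
$ ocamlfind ocamlopt -package lwt -linkpkg main.ml

# some disambiguation syntax would be needed, e.g. a hypothetical
#   -package lwt@5.6.1
# note that the obvious "lwt.5.6.1" spelling is already taken: findlib
# uses dots for subpackages such as lwt.unix
```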
Underpinning both of these is the compiler itself: if you have A.1 and A.2 installed, package B is compiled against A.1, and package C is compiled against A.2, then package D can only use B and C if A.1 and A.2 have identical interfaces.
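A minimal sketch of that constraint, using an invented module A whose interface changed between versions:

```
(* a.mli as shipped in A.1 *)
val parse : string -> int

(* a.mli as shipped in A.2 -- an incompatible change *)
val parse : string -> int option

(* b.ml, compiled against A.1 *)
let b = A.parse "42"            (* b : int *)

(* c.ml, compiled against A.2 *)
let c = A.parse "42"            (* c : int option *)

(* d.ml cannot be linked with both B and C: each was compiled against a
   different digest of a.cmi, and the compiler rejects the link with
   "inconsistent assumptions over interface A" *)
```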
With how OCaml works today, I’m not sure how much it would save over using the existing Dune cache (but of course other build systems are in use, and that doesn’t mean better ways can’t be explored, etc.!)
As @dra27 mentioned, something vaguely like this is the idea behind the dune package management, where you would have your package dependencies defined in dune-project and don’t need to care about packages installed in a switch - dune will download and build the dependencies for your project. Thus your dependencies are indeed tied to the project instead of the switch.
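For example (an illustrative dune-project; names and version bounds invented), the dependencies live with the project rather than the switch:

```
(lang dune 3.16)

(package
 (name myproject)
 (depends
  (lwt (>= 5.6))
  yojson))
```

With the experimental package management feature, something like dune pkg lock then solves these constraints into a lock directory for this project alone.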
However, there is no concept of "installed" packages, so it's not as if dune package management can "install" multiple versions of a package. Think of it more like Nix, where the system you get in the end is the result of computing a derivation, not a state reached after a sequence of stateful actions.
Would that even work? The linker will fail if two different implementations are provided on the command line, so you would need a build system that recognizes this and picks only one version of A. And if you compile in native mode with cross-module optimisations, the consistency checks will complain that you're not passing the right implementations.
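A sketch of the failure mode (hypothetical file layout; the exact error wording varies between compiler versions):

```
# two compiled implementations of the same module A
$ ocamlopt -c v1/a.ml
$ ocamlopt -c v2/a.ml

# linking both into one executable is rejected, with an error along the
# lines of "files v1/a.cmx and v2/a.cmx both define a module named A";
# and even if only one copy reaches the link line, native cross-module
# optimisation records which a.cmi each unit was compiled against, so
# mixing units built against different As fails the consistency check
$ ocamlopt v1/a.cmx v2/a.cmx b.cmx c.cmx -o d
```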
Hah - not for the first time, I think of it in bytecode, and you correct me in native code!! That’s of course true - so you do end up with more builds of libraries than you might expect.
Sorry to nitpick a little, but do you mean that D can only use B and C if A.1 and A.2 have identical interfaces?
Assuming yes, let’s say we are doing the constraint resolution in project D, at which point the solver will fail and say that B requires A.1 but C requires A.2. Resolution would not fail anywhere else, only in that project. And someone would have to step in and manually resolve the issue. I think this is not any worse than the situation today.
(I edited a typo in my original comment which referred to “C and D” instead of “B and C”)
It’s indeed no worse - I was trying to illustrate that it’s also possibly not a huge amount better, which is related to @vlaviron’s point. Being able to install multiple versions of a package in a switch may still end up with all the packages being compiled twice, just with slightly different dependencies. There’s also the confusing reality that you may have the same version of a package compiled several times, but with slightly different versions of its dependencies. That’s quite manageable for Dune, handling this internally, but I think is probably very confusing for an opam switch (imagine opam list now with multiple copies of the same package…!)
The day your project, which ended up depending on A.1, A.2, and A.3, has a bug related to A, don't you think it would have been more productive to spend your time aligning your dependencies so that they rely on a single version of A?
And once your library A.1 depends on the C library C.1 while A.2 depends on C.2, you are likely to be stuck anyway: you can't link two versions of the same C library into one executable without symbol clashes.
Isn't esy doing exactly what is being discussed in this thread? Installing packages in a "global store" and composing an environment on demand? À la Nix, if I understand correctly.
Yes, you’re right. Unfortunately esy adoption has been hampered because it’s yet another tool to learn. Hopefully dune can solve this issue for everyone.