Plans to choose an official package/project manager?

From my experience (I'm experienced in Rust and cargo, and have bounced off OCaml a number of times due to OPAM), I was pretty confused about what switches were and why things were not sandboxed to my project folder by default. I'd be trying to build something and have to blow away my global state every time I tried to build another project. I tried using switches (as that seemed to be the approach for sandboxing), but this was cumbersome and hard to understand for building individual projects – say, a project I'd pulled from GitHub and just wanted to try out quickly.

This translation is incredibly helpful, even if it is difficult to remember and teach – it's only now that I'm learning I need a separate tool for project sandboxing (i.e. direnv)? I definitely don't think Cargo is perfect, and I don't mind the build-system/package-manager separation, but I do think a lot could be done to set people up for success. Setting people up for success like that is good for beginners and experts alike!

2 Likes

FWIW, OBazl is under heavy development. Version 2 with lots of improvements and tooling should be out before the end of the year.

2 Likes

direnv is required to switch to the correct opam switch when entering a directory (you could do it manually, but I'd always forget). Alternatively you could use a project-local switch (I don't, out of habit, and because I want to centralise my switches under ~/.opam). There is work on making the compiler relocatable that will speed up creating a new switch and perhaps allow new workflows.
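
A minimal .envrc for this is something along these lines (just a sketch; the switch name my-project is a placeholder):

$ cat .envrc
eval $(opam env --switch=my-project --set-switch)

direnv re-evaluates that whenever you enter the directory, so PATH and the other opam variables follow the switch automatically.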

3 Likes

Opam has a built-in mechanism for this in the form of opam switch link:

$ opam switch create 4.12.1-grss 4.12.1
$ opam switch link 4.12.1-grss

I definitely agree that keeping an indirection between repositories and environments is the right approach w/ the current state of things. I switched to using opam switch link a while ago (after one too many times spent juggling local switches / recompiling after a switch move) and haven’t looked back. It’s much less brittle than “true” local switches and still achieves project isolation (so that I can vendor / opam pin to my heart’s content). Have been meaning to write about this for a while, but not gotten around to it yet…
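
To sketch the workflow (directory and switch names are illustrative), the same global switch can be linked into several checkouts:

$ cd ~/src/project-a && opam switch link 4.12.1-grss
$ cd ~/src/project-b && opam switch link 4.12.1-grss

Each checkout then behaves as if it had a local switch, while the compiler and packages live exactly once under ~/.opam.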

That said, I’m also very excited about @dra27’s work on compiler relocation, and look forward to a day when I can have exactly 1 copy of each compiler version on my machine :stuck_out_tongue:

10 Likes

Unless I am misunderstanding, this is not a replacement for direnv. For example, direnv modifies the PATH environment variable on the fly according to the current directory (e.g., so that it points to the corresponding bin directory of the Opam switch). That is not something opam switch link does; it only applies to subsequent calls to the opam program itself, not to calls to any other programs.

2 Likes

Opam ships with shell hooks for configuring the shell environment / completions (set up by opam init), and these are already sensitive to whichever switch is considered “active” in the current working directory (by traversing upwards looking for _opam files, or looking at OPAMSWITCH). opam switch link then sets up a symlink to prompt those hooks to consider a particular global switch as “active” within a particular subtree – in the same way as would happen with a local switch.
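
Concretely (paths shown are illustrative), the link is nothing more than that symlink:

$ opam switch link 4.12.1-grss ~/src/my-project
$ readlink ~/src/my-project/_opam
/home/me/.opam/4.12.1-grss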

My own opam env sets up the following variables according to my switch:

  • OPAM_SWITCH_PREFIX
  • CAML_LD_LIBRARY_PATH
  • OCAML_TOPLEVEL_PATH
  • MANPATH
  • PATH

This ensures that Dune / my editor / my shell PATH are all sensitive to opam switch link by default. Certainly this isn’t enough to make opam a generic replacement for direnv for all projects (nor should it be!), but AFAIK it does mean that Opam users don’t need direnv just to keep a correspondence between repositories and switches.
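
If it helps anyone debugging their setup: the line opam init offers to add to the shell rc is roughly the following (the exact line may differ by shell and opam version), and init.sh in turn loads the env hook, which re-runs opam env at each prompt so the variables above track the active switch:

$ grep opam-init ~/.bashrc
test -r ~/.opam/opam-init/init.sh && . ~/.opam/opam-init/init.sh > /dev/null 2> /dev/null || true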

4 Likes

I knew about those files, but your comment prompted me to have a deeper look at them. My .opam/opam-init directory correctly contains init.sh, complete.sh, and variables.sh. But it does not contain env_hook.sh, which is referred to by init.sh. I suppose that is the file responsible for running opam env at each prompt for you. No idea why opam init did not install this file for me, but I now understand why your user experience with opam switch link is much better than mine.
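
In case it helps anyone else, I believe (worth double-checking against opam init --help) that re-running the init step with the hook explicitly enabled regenerates the missing file:

$ opam init --reinit --enable-shell-hook

After that, a fresh shell should pick up env_hook.sh and run opam env at each prompt.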

3 Likes

Careful with that axe, Eugene. Some of us don’t see that as “needless” and rather like opam because it tries (and succeeds) at being a general purpose application package manager, not just for OCaml projects, but for any project that requires any number of programming languages.

If there’s one thing I’d like to throw my hat into the ring for here, it’s keeping opam general purpose.

1 Like

Thinking about this some more, I’d like to offer what I hope the compiler team will view as constructive criticism— an observation of a way the compiler tools work today that I believe can be improved— that might help simplify what the larger community sees as an undesirable complexity.

The current tools use a model for constructing libraries that looks very familiar to C/C++ programmers at first glance, but owing to the peculiarities of modern OCaml, is actually a bit more complicated. Here’s what I mean by that:

The files comprising a typical C/C++ library include…

  • a static archive (libname.a)
  • a dynamic library (libname.so or libname.dylib)
  • a set of header files (name.h, folder/name1.h, folder/name2.h et cetera)

The files comprising a typical OCaml library include (a sample installed layout is sketched after this list)…

  • a bytecode library file (name.cma)
  • the compiled interfaces (the .cmi files)
  • the bytecode compiled modules (the .cmo files)
  • optionally, the binary annotation files (the .cmt and .cmti files)
  • optionally, where the library has external values:
    • the auxiliary static library (libname.a)
    • optionally, the dynamic “stub” library (dllname.so)
  • optionally, where the native compiler is available:
    • the native library files (name.cmxa and name.a)
    • the native compiled modules (the .cmx files)
    • optionally, the native plugin (name.cmxs)
  • optionally, the source code (the .ml and .mli files)
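
To make that concrete, here is roughly what an installed library directory looks like today (the name mylib is made up, and the exact file set varies from package to package):

$ ls $(ocamlfind query mylib)
META        mylib.a     mylib.cma   mylib.cmi   mylib.cmt
mylib.cmti  mylib.cmx   mylib.cmxa  mylib.cmxs  mylib.mli

Libraries with C stubs additionally install a libmylib_stubs.a, and typically a dllmylib_stubs.so under the switch’s stublibs directory.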

Various competing ways of managing the complexity of the C/C++ library distribution problem exist, and most of them use peripheral tools, separate from the C/C++ compiler toolchain, to deal with it; pkg-config from freedesktop.org is one example. The venerable ocamlfind tool takes a similar approach.
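
For instance (library and file names are just placeholders), the two tools answer the same kind of question for their respective ecosystems:

# C/C++: ask pkg-config for a library's include and link flags
$ pkg-config --cflags --libs zlib
# OCaml: ask ocamlfind to locate packages and pass the right flags to the compiler
$ ocamlfind ocamlc -package unix -linkpkg main.ml -o main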

I used to work at Apple, and one thing the Apple compiler toolchain has long had going for it is the way frameworks are built in. I think it would be an improvement to the OCaml ecosystem if the core OCaml toolchain had a similar conceptual framework for modeling the distribution bundle of OCaml libraries. Leaving this out of the compiler toolchain for tools like ocamlfind and dune and opam to manage has, I would say, resulted in more complexity than if the compiler tools themselves provided a unified, systematic way of bundling libraries, and I think something like the way Apple frameworks work would be a great help.

In other words, instead of ocamlfind ocamlc -package unix…, I think it would be better if it were just ocamlc -F unix…. In that world, the OCaml tools would systematically define the format of an OCaml “framework” (which includes all the required and optional artifacts I outlined above), build tools like dune and bazel and make could use the toolchain to produce them rather than writing their own logic to conform to the way ocamlfind has done it historically, and packaging tools like opam and apt and pkgsrc would treat packages that contain one or more frameworks as versioned units of dependency management.
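
A sketch of the difference (main.ml is a placeholder, and -F is purely hypothetical syntax from this post, not an existing ocamlc flag):

# today: a separate front end resolves the library and injects the right -I and archive arguments
$ ocamlfind ocamlc -package unix -linkpkg main.ml -o main

# imagined: the compiler itself understands the framework bundle
$ ocamlc -F unix main.ml -o main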

Note well: the whole world of PPX artifacts is an additional complexity in OCaml library distribution that I think isn’t going to get any easier for packaging and build tools to handle without something like this “framework” concept I’m talking about.

I hope the compiler team finds this suggestion constructive and useful. I’d be willing to collaborate with the compiler team and others in the community in a task group to design such a framework format.

4 Likes

Are you aware of this proposal? RFCs/ocamlib.md at master · ocaml/RFCs · GitHub

I believe it has been accepted already and is currently being implemented.

Rudi.

2 Likes

But apparently the work has stopped: Opam packages: be honest with your dependencies - #16 by dbuenzli

I am now. Yes, something like that is what I’m talking about.

But apparently the work has stopped

Sadface.

There were a few long discussions on the RFC in this forum that were not able to reach a consensus. I think that, starting here, you can find your way through most of them: OCaml RFC#17: library linking proposal

Note that the lack of consensus I’m talking about in the message above has nothing to do with that particular discussion, which is pure FUD by someone who was not involved in the implementation effort.

Oh, sorry. I had only seen the discussions on the forum, not just that one, and thought they were related to your comment. Thanks for the correction.

Noted. For my part, I found the arguments against the proposal in RFC 17 from that discussion unpersuasive. I was particularly turned off by the criticism that it “changes the standard” from ocamlfind to this new mechanism provided as a compiler built-in.

Speaking as someone with experience serving in standards bodies, i.e. the kind that publish and maintain collective ownership of actual standards documents, I have to say that ocamlfind is very far from any kind of “standard”— it is, at best, an ad-hoc convention, with multiple independent implementations, which are not fully interoperable— and this is the main reason it poses a problem for us as a community.

The main reason I prefer the RFC 17 solution over ocamlfind is Conway’s Law: you ship your org chart. I want the same people who maintain the compiler toolchain to maintain the library distribution format. I’ve seen why that’s necessary from working at Apple with the toolchain they maintain there, and I’m not excited about having two separate, independent teams maintaining these two very tightly coupled features of the language ecosystem. The nice thing about having it built into the compiler toolchain is that you need only one implementation, which only has to be interoperable with itself, like all the rest of the language at present.

Anyway, I’m sad about whatever caused the implementors to get derailed by lack of consensus. I don’t know anything about it. That said, if somehow that obstacle were cleared and the work restarted, then I’d be happy to be an early adopter with my own little corner of the OCaml planet.

4 Likes

That’s all very sad news to me as well. I was very much looking forward to that proposal. Though I must admit, I didn’t understand the necessity of baking the thing into the OCaml compiler. Sure, it makes ocamlc more usable from the command line, but who is really concerned about that? IMHO, the proposal provides a lot of value even without support from the compiler. All the additional metadata proposed to be stored in the cma, cmxa, etc. files can be stored externally without much loss in usability.
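
For reference (the field values are invented), the external metadata we already rely on today is findlib’s META file, which has roughly this shape; something similar could in principle carry the proposed metadata as well:

$ cat $(ocamlfind query mylib)/META
description = "An example library"
version = "1.0.0"
requires = "unix"
archive(byte) = "mylib.cma"
archive(native) = "mylib.cmxa"
plugin(native) = "mylib.cmxs"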

2 Likes

There are, however, quite a few advantages in doing so. Here are a few that come to mind.

  1. It avoids the discussion about the metadata file format and its syntax.
  2. In a parallel byte/native code build there’s no fight about who is going to write the file.
  3. It makes compilation objects self-describing (which allows, e.g., plugins to carry dependency specs without the need to artificially turn them into a library).
  4. By not having one more thing to load that could be missing when you need it, it reduces the possible error paths.
  5. It’s entirely consistent with the current design of compilation objects, which in fact already record precise dependencies (used for type-safe linking); we just add the final missing step: where to look them up (see the sketch below).
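
To illustrate point 5 (the file name and checksums are invented), the existing compilation objects already record exactly which interfaces they were built against, as ocamlobjinfo shows; the only missing piece is mapping those names to a location on disk:

$ ocamlobjinfo foo.cmo
File foo.cmo
Unit name: Foo
Interfaces imported:
        a3f1c0de9b2e4d5f6a7b8c9d0e1f2a3b        Foo
        1b2c3d4e5f60718293a4b5c6d7e8f901        Stdlib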

Finally, you may not care about ocamlc’s usability, but I do want a usable ocaml. It would be strange to have the latter without the former following suit. Besides, I think there’s a lot of value in having a compiler that is usable from the bare CLI without having to invoke the whole bureaucracy of a build system; it’s useful for quick tests and for diagnosing problems, and it allows for concise bug reporting.
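
The kind of quick test I mean, with no build system involved (the output obviously depends on your switch):

$ cat > test.ml <<'EOF'
let () = print_endline Sys.ocaml_version
EOF
$ ocamlc test.ml -o test && ./test
4.14.1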

11 Likes

Along with others, I too am disappointed to learn that work on the OCaml library linking RFC has stopped. Among other capabilities enabled by the RFC implementation, I was looking forward to the day when the OCaml compiler knew about library names and we would be done with the mismatch between findlib names and opam names for libraries. Ah well … (sigh)

As an aside, once the RFC is accepted at (GitHub - ocaml/RFCs: Design discussions about the OCaml language), isn’t it fair to assume that there is a strong consensus from upstream OCaml regarding the RFC? Additionally, I don’t see any issues raised regarding the RFC in the GitHub issues. Shouldn’t the issues/objections be opened there, since the RFC itself is open too?

Note that the lack of consensus I’m talking in the message above has nothing to do with that particular discussion

Noted.

3 Likes

I’m obviously not going to argue against making ocamlc more usable. I just think that the proposal was strong enough even without this additional (minor IMO) feature.

I acknowledge all the other implementation issues you’ve listed. I just think that whoever works on OCaml tooling is used to such issues and I doubt that a few more would be a deterrent.