Simultaneous development of a library and executable

Hello,

I’m new to OCaml, but so far I’ve been impressed with the tooling and especially the overall speed of development.

I’m starting to develop a project in OCaml and part of the project is a self-contained library that will be used in future projects. I created a separate dune library with:

dune init proj --kind library my_lib

and then I created an executable dune project with:

dune init proj my_exec

And in the executable project, the bin/dune file refers to my_lib with:

(executable
  ..
  (libraries my_lib ...))

In order for my_exec to find my_lib, I need to “opam install” my_lib. But my_lib will be developed simultaneously with my_exec so it will be constantly changing. In the my_lib root directory, I used:

opam install . --working-dir

But when I update my_lib, my_exec is not aware of the changes until I do:

opam update my_lib

followed by:

opam upgrade my_lib

But the “opam upgrade” command is slow compared to every other command I use. It is slow enough that I do not want to include it in my build process.

Another option is to include my_lib under the my_exec project’s lib directory. I would break out my_lib from my_exec only when my_lib stabilizes.

Is there a way to keep my_lib as a separate project from my_exec without incurring the cost of “opam upgrade” after every change to my_lib? Is there a standard way of developing a library along with an executable simultaneously?

Thank you for any advice you can give me (i.e., please tell me if I’m doing something stupid)!

You should be able to just do a dune install: it should install without going through opam. You’ll have to run the install every time you update your library, of course, and you can later do a dune uninstall to clean up.

A different idea: I often have multiple packages in development that I need to update, so I use opam “pin add” and “pin remove” liberally.

(1) go into my_lib, do “opam pin add .”; then “opam install .”

(2) go into my_bin, do the same

(3) later, when you want to update my_lib, go in there and do “opam remove . ; opam pin remove .”. You can do the same for my_bin.

(4) Go back to #1 later; rinse, repeat.

I do this a ton with 5-7 libraries, and it seems to work fine. Though, to be fair, it’s still slower than just “make install”, “make uninstall”, etc.

I would say it’s typical to keep the library and the executable component in the same project. Do you have a specific reason why you can’t do that?

4 Likes

I would also suggest keeping the executable and the library in the same dune-project, possibly in different packages if you don’t want the library to install the executable. It’s a fairly well-supported usecase.

On the other hand, I would suggest forgetting that dune install exists.

1 Like

Thanks for all the responses!

I didn’t try dune install but I did try opam pin add . and opam pin remove .. But I’m experiencing the same performance issues with those commands. It appears the less I call opam, the better it will be for my use case.

With other programming languages I’ve used, I’ve never kept an independent library within the confines of an executable project that uses that library. If the library is inside the project, how do you handle version control for the executable and library separately? One git repository for bin/ and one git repository for each subdirectory library in lib/?

How would I keep the executable and library in the same dune project but in different packages? I tried searching for examples but so far have not found anything. Can you give me a pointer to information or to a dune project that is set up this way?

1 Like

It’s fair to say that opam pin yaddayadda is slower than dune install/uninstall. I don’t use dune (Makefiles all the way) but I do “make install/uninstall” etc. all the time during development: I’m constantly adding code to package Z that depends on packages X and Y, and finding I need to add code to X and Y to support those changes in Z. “make install” is my friend in those situations.

OK, I had gotten the impression that the executable is closely coupled with the library itself and it would make sense to keep them in a single monorepo. If not, you can still keep them in separate repos but check out the library repo locally as a subdirectory of the executable repo. Obviously, you would ‘git ignore’ the library subdirectory in the executable’s repo because you want to keep them separate in git.

Remember, dune is a ‘composable’ build system. You can quite literally ‘vendor’ libraries as subdirectories of your project and it will ‘just work’.

See The joys of Dune vendoring | Notes from the Windows corner (a bit old) and vendored_dirs - Dune documentation
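To make the vendoring setup concrete, here is a minimal sketch (directory and project names are assumptions matching this thread, not something prescribed by dune):

```
; dune file at the root of the my_exec project.
; Everything under vendor/ is built as part of the project,
; but dune relaxes warnings and skips some checks for it,
; treating it as third-party code.
(vendored_dirs vendor)
```

With my_lib checked out (or symlinked) under vendor/my_lib, the executable’s (libraries my_lib ...) field resolves against the vendored copy with no opam or dune install step in between.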

2 Likes

Old, but apparently still relevant :grin::old_man:

1 Like

Looks like we could use a “how to” in the dune docs for this. But here is an example repo that does this: GitHub - mbarbin/dunolint: A linter for build files in dune projects (OCaml) – this is extremely common practice.

If you are using dune to generate your opam files, you just define multiple package stanzas in your dune-project file. Or you can skip that added layer of complexity and just write one opam file per package.
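A minimal sketch of what that dune-project might look like, using the my_lib/my_exec names from this thread (the synopsis fields and dune language version are illustrative):

```
; dune-project
(lang dune 3.0)
(generate_opam_files true)

(package
 (name my_lib)
 (synopsis "The reusable library"))

(package
 (name my_exec)
 (synopsis "The executable that uses my_lib")
 (depends my_lib))
```

With this, dune generates a my_lib.opam and a my_exec.opam, and each library or executable stanza in the project picks its package via its public_name or an explicit (package ...) field.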

Thank you for all your responses!

I think I am starting to understand Dune a little better. Thanks for the vendored_dirs suggestion and the dunolint example was great.

Final question: I’m planning on copying the dunolint example and having a vendor directory which will hold vendored libraries. I tried a small example project where inside the vendor directory I have a symbolic link to another directory that contains a Dune library. It appears to work. Is there any reason that I should not use symbolic links like this?

1 Like

Sure. As an example, yojson does this: the benchmark executable yojson-bench is assigned to the yojson-bench package, while the regular library belongs to the yojson package.

Basically: Assign (package ...) in your executable stanza.
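Concretely, that might look like this in bin/dune and lib/dune (a sketch using the my_exec/my_lib names from earlier; the module name main is an assumption):

```
; bin/dune
(executable
 (name main)
 (public_name my_exec)
 ; attach this executable to the my_exec package so that
 ; installing my_lib alone does not install the executable
 (package my_exec)
 (libraries my_lib))

; lib/dune
(library
 (name my_lib)
 (public_name my_lib))
```

Here the library’s public_name ties it to the my_lib package, and the explicit (package my_exec) field keeps the executable out of that package.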

That’s completely fine, this is also how I often handle these things.

It might not work on Windows (but this should be verified, which is pretty easy to do nowadays thanks to the setup-ocaml GitHub Action).

From what you’ve described, I don’t see why you’d need to bother with the added complexity of vendoring. In any case, that is orthogonal to structuring your project so that it is composed of several packages that have some (non-cyclical) dependencies.

I haven’t used this myself, but opam-monorepo may also come in handy: GitHub - tarides/opam-monorepo: Assemble dune workspaces to build your project and its dependencies as a whole

I misunderstood what “vendoring” means and I agree that what I am doing is not vendoring. I was just excited that the vendored_dirs approach worked without thinking about whether it made sense or not.

I now have a symlink to the independent library directory (my_lib) from within the executable directory (my_exec) with no vendored_dirs and that works as well.

In terms of structuring a project, the freedom of choice that Dune provides is confusing to me coming from other programming languages, but I feel like I’m making some progress.

Thanks again for all the replies!

2 Likes