I’m new to OCaml, but so far I’ve been impressed with the tooling and especially the overall speed of development.
I’m starting to develop a project in OCaml and part of the project is a self-contained library that will be used in future projects. I created a separate dune library with:
dune init proj --kind library my_lib
and then I created an executable dune project with:
dune init proj my_exec
And in the executable project, the bin/dune file refers to my_lib with:
(executable
 ...
 (libraries my_lib ...))
In order for my_exec to find my_lib, I need to “opam install” my_lib. But my_lib will be developed simultaneously with my_exec so it will be constantly changing. In the my_lib root directory, I used:
opam install . --working-dir
But when I update my_lib, my_exec is not aware of the changes until I do:
opam update my_lib
followed by:
opam upgrade my_lib
But the “opam upgrade” command is slow compared to every other command I use. It is slow enough that I do not want to include it in my build process.
Another option is to include my_lib under the my_exec project’s lib directory. I would break out my_lib from my_exec only when my_lib stabilizes.
Is there a way to keep my_lib as a separate project from my_exec without incurring the cost of “opam upgrade” after every change to my_lib? Is there a standard way of developing a library along with an executable simultaneously?
Thank you for any advice you can give me (i.e., please tell me if I’m doing something stupid)!
You should be able to just do a dune install: it should install without going through opam. You’ll have to do the install every time you update your library, of course. And you can later do a dune uninstall to clean up.
I would also suggest keeping the executable and the library in the same dune-project, possibly in different packages if you don’t want installing the library to also install the executable. It’s a fairly well-supported use case.
On the other hand, I would suggest forgetting that dune install exists.
I didn’t try dune install, but I did try “opam pin add .” and “opam pin remove .”. I’m experiencing the same performance issues with those commands. It appears the less I call opam, the better it will be for my use case.
With other programming languages I’ve used, I’ve never kept an independent library within the confines of an executable project that uses that library. If the library is inside the project, how do you handle version control for the executable and library separately? One git repository for bin/ and one git repository for each subdirectory library in lib/?
How would I keep the executable and library in the same dune project but in different packages? I tried searching for examples but so far have not found anything. Can you give me a pointer to information or to a dune project that is set up this way?
It’s fair to say that opam pin yaddayadda is slower than dune install/uninstall. I don’t use dune (Makefiles all the way) but I do “make install/uninstall” etc. all the time during development: I’m constantly adding code to package Z that depends on packages X and Y, then finding I need to add code to X and Y to support those changes in Z. “make install” is my friend in those situations.
OK, I had gotten the impression that the executable is closely coupled with the library itself and it would make sense to keep them in a single monorepo. If not, you can still keep them in separate repos but check out the library repo locally as a subdirectory of the executable repo. Obviously, you would ‘git ignore’ the library subdirectory in the executable’s repo because you want to keep them separate in git.
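The workflow above can be sketched in a self-contained way (temporary directories stand in for the real repositories; all paths here are illustrative):

```shell
set -eu
work=$(mktemp -d)

# Stand-in for my_lib's own remote repository (hypothetical).
git init -q "$work/my_lib"
git -C "$work/my_lib" -c user.email=dev@example.com -c user.name=dev \
    commit -q --allow-empty -m "init my_lib"

# The executable project's repository.
git init -q "$work/my_exec"

# Check the library out as a subdirectory of the executable project...
git clone -q "$work/my_lib" "$work/my_exec/my_lib"

# ...and git-ignore it so the two histories stay separate.
echo "/my_lib/" >> "$work/my_exec/.gitignore"

# my_exec's repo now sees only its own files, not the library checkout.
git -C "$work/my_exec" status --porcelain
```

The status output lists only the untracked .gitignore; the my_lib checkout stays invisible to my_exec’s history while remaining a normal clone you can commit and push from.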
Remember, dune is a ‘composeable’ build system. You can quite literally ‘vendor’ libraries as subdirectories of your project and it will ‘just work’.
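Concretely, the vendoring hook is a single stanza (directory name is illustrative) in a dune file at the project root:

```
; dune (at the project root)
(vendored_dirs vendor)
```

Any dune library placed under vendor/ (e.g. vendor/my_lib) then becomes visible to the rest of the project, and dune suppresses warnings from vendored code when building it.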
If you are using dune to generate your opam files, you just define multiple package stanzas in your dune-project file. Or you can skip that added layer of complexity and just write one opam file per package.
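As a sketch of the multi-package layout (names are illustrative, and this assumes dune generates the opam files):

```
; dune-project
(lang dune 3.0)
(generate_opam_files true)

; Two packages in one project: installing my_lib
; does not pull in the executable.
(package
 (name my_lib)
 (synopsis "The reusable library"))

(package
 (name my_exec)
 (synopsis "The executable that uses my_lib")
 (depends my_lib))
```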
I think I am starting to understand Dune a little better. Thanks for the vendored_dirs suggestion and the dunolint example was great.
Final question: I’m planning on copying the dunolint example and having a vendor directory which will hold vendored libraries. I tried a small example project where inside the vendor directory I have a symbolic link to another directory that contains a Dune library. It appears to work. Is there any reason that I should not use symbolic links like this?
Sure. As for an example, yojson does this: the executable yojson-bench is assigned to the yojson-bench package, while the library itself belongs to the yojson package.
Basically: Assign (package ...) in your executable stanza.
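In dune terms that looks something like this (a sketch; my_exec and my_lib are the names from this thread, and the name/public_name values are illustrative):

```
; bin/dune
(executable
 (name main)
 (public_name my_exec)
 ; attach the executable to its own package, separate from the library
 (package my_exec)
 (libraries my_lib))
```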
From what you’ve described, I don’t see why you’d need to bother with the added complexity of vendoring. In any case, that is orthogonal to structuring your project so that it is composed of several packages that have some (non-cyclical) dependencies.
I misunderstood what “vendoring” means and I agree that what I am doing is not vendoring. I was just excited that the vendored_dirs approach worked without thinking about whether it made sense or not.
I now have a symlink to the independent library directory (my_lib) from within the executable directory (my_exec) with no vendored_dirs and that works as well.
In terms of structuring a project, the freedom of choice that Dune provides is confusing to me coming from other programming languages but I feel like I’m making some progress.