[ANN] release 0.3.0 of drom, the OCaml project creator

Hi,

We are pleased to release version 0.3.0 of drom, the OCaml project creator.

drom was born from a simple observation: every time you create a new OCaml project, you spend time searching for files in other projects and copy-pasting them, adapting them to the new one. drom does that for you: it comes with a set of predefined project skeletons that you can easily configure and adapt to your goal.

It’s as easy as:

$ drom new
  # check the list of skeletons
$ drom new PROJECT_NAME --skeleton SKELETON_NAME
$ cd PROJECT_NAME
$ emacs drom.toml
   # ... edit basic description, dependencies, etc. ...
$ drom project
$ drom build

Thanks to contributors (Maxime Levillain and David Declerck), the list of project skeletons for drom 0.3.0 has grown:

  • OCaml projects: library menhir mini_lib mini_prg ppx_deriver ppx_rewriter program
  • C Bindings: c_binding ctypes_foreign ctypes_stubs
  • JavaScript projects: js_lib js_prg vue wasm_binding

and you can easily contribute your own: for example, specifying gh:USER/SKELETON will trigger the download of the USER/SKELETON project from GitHub as a template for your new project.
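For instance, assuming you have published a skeleton in a GitHub repository (USER/SKELETON below is a placeholder, as is the project name), a minimal invocation would look like:

$ drom new my_project --skeleton gh:USER/SKELETON
   # drom downloads USER/SKELETON from GitHub and uses it
   # as the template for the new my_project directory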

drom is available from opam: opam update && opam install drom.0.3.0

Enjoy!

The drom team and OCamlPro

18 Likes

I was looking into drom recently (it looks to have a nice and rapidly evolving feature set!), but I balked when I saw it introduced yet another level of config files to the stack (and in a format I particularly don’t like, but that’s just a matter of taste). I wonder if you could speak to the motivation for adding another layer of configuration, when it seems like everything should already be there in the dune-project file.

1 Like

First of all, you don’t need these configuration files if your goal is just to generate the project and never use drom again afterwards. You can delete these files and keep using dune and opam. We should probably give a few hints in the documentation, since there are a few files to modify (for example, to tell dune that it should now generate the opam files).
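As a rough sketch of the kind of change involved (the package name and fields below are illustrative, and your dune-project may already contain part of this), the stanza that asks dune to generate the opam files looks like:

  ; in dune-project
  (generate_opam_files true)
  (package
   (name my_project)                 ; hypothetical package name
   (synopsis "A short description")
   (depends ocaml dune))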

As for the reason to introduce them, it is quite simple: we want to have all the project configuration in one place, and all other files generated from it. Such a unique configuration file does not yet exist, so we had to create it.
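To give an idea of what that looks like, here is an illustrative sketch of such a file; the section and field names below are only indicative, the authoritative schema is the one described in drom’s documentation:

  # drom.toml (sketch only)
  [project]
  name = "my_project"
  version = "0.1.0"
  synopsis = "A short description"

  [dependencies]
  lib = "1.1.0"     # a dependency and its version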

As for the format, we chose TOML because it was already used by Rust and Cargo. Since our goal is to make onboarding to OCaml simple, we preferred a format already used in nearby language ecosystems over a format used only in the OCaml community.

1 Like

Thanks for the motivation and context, @lefessan. I find the additional info helpful, especially pointers on how I might use the tool without committing to another level of configuration.

When I’d first skimmed the docs, it looked to me like the new config didn’t add anything that wasn’t already in dune-project files (or supported by them), but after reading further (thanks to your reply), I see it also configures and manages other things, like a Sphinx configuration and CI configuration.

re:

we want to have all the project configuration in one place, and all other files generated from it. Such a unique configuration file does not yet exist, so we had to create it.

I suppose this depends on what you consider to be the bounds of a project configuration! :slight_smile: IIRC, Cargo also doesn’t configure GitHub-centric CI or an external documentation engine, and I believe the configuration available via dune-project is roughly on par with what Cargo files offer. So, it seems to me, the need for a new configuration format follows from expanding what is considered part of the project configuration. Personally, I’m not sure whether I want those things in my package configuration (or, if I did, I’d probably opt for a more total solution, like Nix); however, I can definitely see the value added here!

While I’m still unsure whether drom is well suited for my workflow, I definitely appreciate the work, and I look forward to following its development :slight_smile:

I thought I might share some notions I’ve had about the success of Cargo. I believe a lot of Cargo’s success follows from these two features:

  1. The total integration of package management and build system (i.e., between deployment and delivery), which comes from using a single tool for both. I’m not sure the tight coupling is necessarily a good thing overall in the long run, but it definitely has some advantages. Obviously, one of drom’s aims is to help fill this gap.
  2. Cargo’s very narrow focus in terms of project management, and the simplicity of the tool that this entails. E.g., Cargo has no support for different templates, and an RFC to add support for them seems to have resulted in a resolution to improve the documentation instead. This simplicity lowers the learning curve and gives devs a small, digestible common basis to work from. My concern with elaborate configs and project initialization is that it might end up front-loading technical debt onto new projects (or generally encourage a boilerplate-centric development style, as is common in JS and other webdev communities). However, this is likely particular to my POV and proclivities.
1 Like

Indeed, the idea behind drom is that a project is not just a set of source files; it has to come with CI configuration, documentation, tests, etc. Gathering such files from other projects is usually time-consuming and should be avoided when the goal is to start a new project and just focus on the code itself.

Actually, when we started opam, we were already working on a tool that combined the build system (ocp-build at the time) and the package manager. The reason we switched to “package manager only” is that many libraries already existed using different build systems (ocamlfind, make, omake), and it would have been a huge effort to translate them to ocp-build to get enough packages into the first repository. So we decided to make the package manager agnostic to the build system. Note that we also hoped that opam would be used for other languages, and maybe even be able to host a single repository for multiple languages with dependencies between them… it looks like that is not yet ready to happen though :slight_smile:

3 Likes

Slightly off topic: do you think the upgrade policy for packages implemented in opam has proven itself? I believe a package frequently has to update the version constraints of its dependencies when the dependencies move on, even when the package itself didn’t change. As soon as a package is no longer actively maintained, its opam description gets outdated relative to the progressing dependencies and the package no longer compiles.

I don’t think this problem is related to the way opam handles dependency versioning; it is rather the lack of support for versioning in the OCaml language that causes it. Very few OCaml libraries correctly implement semantic versioning, i.e. change their major version on breaking changes. It requires some work, because simple changes that would not break packages in other languages (adding a function to a module, for example) can break OCaml packages because of signature checking.

In some of our recent libraries (ez_cmdliner and ez_subst, for example), we provide the interface under a V1 module, and we will try to keep this module as backward-compatible as possible, switching to a new V2 module only for breaking changes. We will see whether such a practice can improve long-term compatibility for packages in opam; if it works, it would be nice for it to become a convention for all libraries in the ecosystem.
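As a minimal sketch of the idea (the module contents below are made up for illustration; they are not the actual ez_* interfaces):

  (* mylib.ml: versioned-module convention, sketch only *)
  module V1 = struct
    (* stable interface: only backward-compatible additions go here *)
    let parse (s : string) : int option = int_of_string_opt s
  end

  (* a breaking change goes into a new module, leaving V1 untouched *)
  module V2 = struct
    let parse (s : string) : (int, string) result =
      match int_of_string_opt s with
      | Some n -> Ok n
      | None -> Error "not an integer"
  end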

4 Likes

A hack would be to check out the version of opam-repository from the time the program (at the version you are interested in) was introduced. This way, you get an opam-repository snapshot in which this program should compile. This is ugly, yes, and the non-reproducibility of opam builds is a terrible thing. I think @c-cube uses such a hack at work.
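A rough sketch of that hack, assuming opam 2.x (the switch name, compiler version, commit hash and package name are placeholders, and the exact commands may vary with your setup):

$ opam switch create old-project 4.10.0
   # point the default repository at an old opam-repository snapshot
$ opam repository set-url default git+https://github.com/ocaml/opam-repository#COMMIT_HASH
$ opam update
$ opam install the_program.1.2.3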

Opam supports lockfiles nowadays, which set dependencies in stone and so should give good reproducibility in practice. (Development using lockfiles also has downsides; in particular, you have to remember to update them from time to time, and it’s easy to forget.)
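A minimal sketch of the lockfile workflow, assuming a recent opam and a project with a local .opam file:

$ opam lock .
   # records the exact versions from the current switch into <pkg>.opam.locked
$ opam install . --locked --deps-only
   # later, or on another machine: reinstall exactly those versions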

2 Likes

For a while now, opam has been testing all reverse dependencies, so these are kept up to date even for old packages. I agree that this could be a problem if you get the package from its unmaintained master branch (and if there is no lockfile, as mentioned by @gasche), but it should be less of a problem in opam nowadays.

2 Likes

That’s a great service: automating the necessary work. I’m still not convinced that the current defaults are the best, though. Basically, the semantics of an opam file changes over time, since you don’t get the same result as you once did.

The main problem is that people only put lower bounds on versions, not upper bounds. So, when a new breaking version arrives, opam picks the most recent versions of all packages satisfying all the constraints, and the package gets built against a version it is not compatible with. The only solution I see would be to always define an upper bound too, i.e. semantic versioning.

drom currently does that. If you specify a dependency lib = "1.1.0", drom translates it into the opam constraint ( >= "1.1.0" & < "2.0.0~" ). So, if the dependency correctly implements semantic versioning, i.e. moves to version 2.0 on a breaking change, the package specification will not need to be updated. The opam team at OCamlPro is currently implementing a semantic-versioning operator natively in opam, so it will soon just be ( ~ "1.1.0" ) or something like that.
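Concretely, assuming the dependency is declared in drom.toml’s dependencies table, the generated .opam file would contain a depends entry along these lines:

  # in drom.toml
  [dependencies]
  lib = "1.1.0"

  # generated constraint in the .opam file
  depends: [
    "lib" { >= "1.1.0" & < "2.0.0~" }
  ]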

7 Likes

It is nice to encourage semantic versioning and make it easier. But first one needs a notion of backwards-compatibility that is tractable, so I do not think that adding a function to a module should count as a breaking change. Instead, it is probably better if programmers are taught the dangers of using module type of with interfaces that they do not own.
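A small sketch of the danger being described (Lib and its contents are made up for illustration): a client that constrains one of its own modules with module type of applied to a third-party interface turns any addition to that interface into a build failure:

  (* version 1.0 of a library exposes: *)
  module Lib = struct
    let of_string = int_of_string_opt
  end

  (* a client pins itself to Lib's full signature: *)
  module My_impl : module type of Lib = struct
    let of_string = int_of_string_opt
  end

  (* if version 1.1 merely ADDS a function to Lib, My_impl no longer
     satisfies "module type of Lib" and stops compiling, even though
     existing callers of Lib are unaffected *)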

2 Likes

Builds should be reproducible out of the box, without requiring the user to play with obscure options.

2 Likes

What’s the reasoning behind splitting a program into two packages, app and app_lib?

In my experience, you often end up wanting to take features from one program and use them in another. Having them available as a library makes that very easy. Sometimes you also want features from a program that you have not written, so it is even better if everybody does it. Since drom tries to enforce some good practices and conventions, we made it the default.

It’s also nice for testing. For example, drom's own tests depend on drom_lib.
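As a rough sketch of the resulting layout (directory names and stanzas below are illustrative, not drom’s exact output):

  ; src/app_lib/dune -- all the features live in the library
  (library
   (name app_lib)
   (public_name app_lib))

  ; src/app/dune -- the executable is a thin wrapper over the library
  (executable
   (name main)
   (public_name app)
   (libraries app_lib))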

1 Like

Dune can load libraries into utop using the dune utop command, but it can’t load executables.
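For example, assuming a library defined under a lib/ directory:

$ dune utop lib
   # builds the libraries defined under lib/ and opens them in utop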

1 Like

From drom, you can use drom top, which does the same as dune utop but installs the dependencies with opam first.