People build the repo with the repo's build system, which has to fetch its set of warnings from somewhere. You've mentioned that it's painful to keep the set of warnings consistent across projects – and I agree. How do you then deduplicate these warnings across your projects? I assumed that you'd found a way to avoid this, since you said:
Adding that line in 40 repos is a maintenance burden I don’t want to have. It just doesn’t scale.
For now I simply use OCaml's default set, which, in contrast to dune's, is sensible and non-irritating – but then, if your system is really compositional, solving this problem is just a matter of being able to import configuration defaults from a library.
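To make the duplication concrete: as far as I know, the per-repository way to override dune's default flags is an `env` stanza in a `dune` file at the workspace root, which is exactly the line that has to be repeated in every repo. The warning numbers below are purely illustrative, not a recommended set:

```
; dune (at the workspace root) -- illustrative only; pick your own warning set
(env
 (dev
  (flags (:standard -w +a-4-9-44-70))))
```

Being able to pull that stanza's contents in from a shared library, instead of copying it, is precisely the compositionality being asked for.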
Except, IIRC, for useless-record-with, which doesn't make sense from a code-evolution perspective, but I got over it. ↩︎
To reinforce this point: it's not about intentions. No matter how irenic and welcoming the Dune devs are, they are finite. Presumably they do not work for free; at least, most of them don't. What happens when the economy takes a nosedive and their budget gets cut? Or when their company simply changes priorities, and Dune devs get transferred to teams addressing the core business?
I have no problems with experimenting with a cargo-style unified CLI tool for OCaml. But I am extremely concerned that existing tools that we know and love could disappear, or be neglected, or lose their CLIs as they are turned into libraries usable only from the unified CLI. This could happen because of limited resources being shifted to the development of the unified tool, or as a way to force adoption of the unified CLI.
I'd really like the roadmap to commit, in the strongest terms, to keeping essential tools like the core compilers, OPAM, interfaces with code editors, documentation generators, etc., usable independently of the unified CLI tool, with their own CLIs to support existing and future alternate workflows. (For the core compilers I can vouch for that, but not for the other tools.)
I don't think it's critical to keep the different CLI tools – it's fine if dune is the frontend to all major CLI commands. What I think is important is to keep the internal data interchange format between the different functionalities of dune textual. This is a pain from an efficiency perspective: using in-memory data structures is much easier, type-safe, etc. But a textual format allows anyone to hook into exactly the functionality they need for their particular tooling. And since the tools currently communicate textually (for the most part), there's no loss relative to today – there's just less of a gain.
What would be the point of maintaining an interchange format between the internal libraries of a single unified command line driver if there were no other independent tools in active maintenance that use that format?
Indeed I agree with @xavierleroy that the more modular the setup, the better; however, I'd like to remark that there are some big wins from tool integration that often outweigh the loss of CLI modularity. Two examples:
A large amount of time in Coq is spent unmarshalling .vo files; fcc, a tool that integrates the coqc compiler with a build system similar to Dune, can share the .vo de-marshalling across the compilation of a whole Coq theory. The gains in time and memory are large.
Something similar can be observed with, for example, opam and build caches: it is not easy for opam to improve sharing and build caching without basically reimplementing its own build system, or creating a new protocol, which is hard to design and maintain.
I've observed that, in general, users much prefer integration, but that's an HCI question that should be answered formally, IMHO.
I thought this was another update but it’s from May!
Thanks for making OCaml so easy to use, @tmattio @sabine (and others that I don't recognize – Dune folks). As a lurker and experimenter, all the improvements in the tooling have made it feel much more like Go, Rust, or npm, where I can spin up a script in seconds. I keep coming back to OCaml for… idk, a multitude of reasons, despite always testing the waters with F#, Scala, and Haskell. Not that I share any open code to help others, because it's all too specific and garbage.
I agree that an integrated tool can have better performance than a collection of standalone tools used via their CLIs. But this is not a valid reason to kill the standalone tools and their CLIs! It’s for end-users to choose between the fast integrated tool with its fixed workflow and the perhaps slower but more flexible standalone tools that support the users’ preferred workflow.
Taking your Coq+Dune example and the way I use Coq in my CompCert project: compilation times are not an issue, the whole system (incl. extraction and OCaml compilation) builds in 1min15s on a good workstation, and de-marshaling accounts for 1.9% of this time. On the other hand, the build procedure is not trivial, with some Coq sources being generated by preprocessing, and Coq producing OCaml files (by extraction) which are then compiled. So, your Coq+Dune tool would not be a good deal for me: in exchange for negligible performance gains, I would have to redo my build procedure to use a system (Dune) that doesn’t support mixed-language projects well. Please leave me alone with my “slow” builds and my carefully honed build procedures.
I fear I may be partly responsible for the 'killing the CLI' part of this, as I first demonstrated a really early integrated prototype back in an Oxford OCaml Workshop presentation. Allow me to be really clear about my position here today: any CLI that is released as part of OCaml Platform tooling and has users is one we try really hard to maintain, as that CLI is very often already integrated into build scripts (so removing it would break some opam packages that are already released, and we do try very hard to keep those building over time without upper bounds).
Back when I started prototyping the integrated CLI in 2017, OCaml was possibly at its lowest point in terms of Platform tooling, since almost no industrial users actually used the publicly released tools! Jane Street had Jenga, Coq had Makefiles, Xen still used omake, the OCaml compiler itself had backed away from using ocamlbuild, and every project I talked to avoided them, citing slow performance and difficult debuggability with the then-recommended stack of Oasis/ocamlbuild/ocamlfind. How did this happen? A fateful decision back in 2012 resulted in Oasis wrapping the ocamlbuild CLI, which in turn had a special mode that wrapped ocamlfind, and every single compiler invocation went through 5 forks before it ever got to ocamlopt.opt. If Oasis had instead linked to ocamlbuild as a library, we might have avoided this, but we'll never find out. And I'm not criticising the authors of Oasis for their decision either – it was a very pragmatic one to get us past having to write direct ocamlbuild _tags files.
What I underestimated with the integrated CLI is the sheer amount of time any migration takes for downstream projects, and also what Xavier points out above about multi-language builds and the flexibility of Makefiles. So my own thinking has evolved on it too: what we need from our tools is an OCaml library interface, with the CLIs being as thin as possible. And by and large, that's mostly how the active tools in the Platform operate today. We have a number of CLI tools that interoperate via opam-libs or the more lightweight opam-file-format. Dune itself is vendoring in big chunks of opam for its own integration, which means that it can be upgraded with the same core logic as used in future versions of the opam CLI. Dune also has a library reimplementation of ocamlfind, so that it doesn't need to shell out to it but still retains strong compatibility.
This also suggests a good top-level metric for the OCaml Platform: what proportion of the community is using the tools that we recommend? This proportion is clearly increasing (opam, dune, merlin, lsp-server and odoc are now widely adopted both in open source and in monolithic codebases that use OCaml), but I think we're less clear on others like ocamlformat, dune-release vs opam-publish, and mdx. Suggestions for improvements on this metric, and for ways to measure it more systematically, are welcome.
A suggestion for ocamlformat in particular: measure how many people use a fixed version of it. Because it's not backward compatible, I've settled on 0.24.1 so that I don't have to reinstall a different version every time I change projects. I suspect I'm not the only one.
(dune for example is backward compatible and it makes a huge difference).
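One mitigation worth noting: ocamlformat reads the `version` field from the project's `.ocamlformat` file and refuses to run if the installed binary doesn't match, so pinning the version there at least turns silent reformatting churn into an explicit error. The values below are just an example configuration, not a recommendation:

```
version = 0.24.1
profile = default
```

This doesn't remove the need to install multiple ocamlformat versions when hopping between projects, but it does make each project's expectation explicit and reproducible.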
Actually, Dune's support for the CompCert multi-language workflow is not perfect, but it is pretty reasonable, and it has many advantages over the current Makefile support – but that's off-topic.
Of course, removing CLIs for the sake of removing CLIs is not good, and I don't think anyone here defends that; on the other hand, not all the features tools may want are expressible in terms of CLIs. In the end, when developers want to implement a feature, they must choose between different trade-offs, and often Unix-style CLIs don't offer the best value, at least with the current interface / bash.
I picked a performance benefit of an integrated tool, but note that there are many other advantages to integration, such as usability and ergonomics, lower impedance, etc.
I can fully understand people being wary of a lack of modularity, and they should be, as it is an important concern; but on the other hand – and this was the goal of my message – I want to showcase some of the motivations for having a single entry point to the toolchain.
IMVHO, we should be wary of dogmatic approaches to user/developer-facing components, and instead try to use HCI methods to better understand what the pain points and use cases are.
In fact, years ago I would have strongly rejected an integrated setup, but experience with tooling has taught me otherwise, and I've found plenty of examples. For example, in jsCoq, making the package manager part of the document manager allowed us to remove a huge amount of redundant code and, above all, to get rid of a kind of CLI protocol that was pretty complex to design and prove correct.
Adds a workflow to integrate with other build systems (W16)
Adds a workflow to compile to WebAssembly from Dune (W14)
Adds a workflow to compile MirageOS unikernels from Dune (W12)
Updates Literate Programming workflow to use odoc and mention interactive code blocks (W24)
Updates Generate Documentation workflow to list features planned as part of the odoc roadmap (W25)
Updates Package Publication to remove the mention that users need not commit an opam file (W26)
A big thank you to everyone who participated in the discussion! Building a community-driven roadmap is no trivial exercise, but after witnessing so many people driven to share constructive feedback, there's no doubt that it's absolutely worth it.
Much of the feedback, both here and in other Discuss threads (e.g. The OCaml Platform - a vehement dissent), points out the roadmap’s pronounced Dune-focus. I want to be very clear that while the focus of the coming years highlighted in the roadmap is indeed to consolidate the integration of the Platform tools to create a cohesive experience – a top request in the OCaml surveys – there’s no intention to make Dune the singular solution for OCaml tooling. The OCaml Platform is a collection of independent tools that can be used without Dune, and this will remain the case.
We’ve worked on formalising the Guiding Principles before sharing the roadmap. One key principle we’ve asserted is that while Platform tools operate under a single frontend (the editor or Dune at the moment), they remain independent. @xavierleroy rightly suggested that we commit to this in stronger terms, so we’ve updated the relevant Principle to make this more explicit.
In addition to clarifying the Guiding Principles, and seeing that the majority of the feedback we received was asking for better support for non-Dune users, two of the workflows added in this last update (W15 and W16) aim to bring better support for third-party (i.e. non-Platform) tools.
Let us know what you think of these new workflows and the other updates! As usual, let's allow an additional 2 weeks for feedback on the roadmap. Barring significant concerns, we'll then make a final revision based on all the discussions before adopting the first version.
PS: thank you @shonfeder for pointing out that the issues were deactivated on the repository. I’ve activated them, so don’t hesitate to open issues there now.
Thanks for the update, @tmattio. I was wondering if tooling to ease the work of researchers in Software Engineering and Machine Learning would be in scope for the Platform?
Any common development workflow is potentially in scope! It’s worth noting that there’s no requirement for a tool to be in the roadmap for it to be incubated: if a new tool fills a gap in the development experience and has users, it can be proposed for incubation in the Platform.
I am way more active in the Coq community, but in our case, there is a lot of interest from research groups in having a library / theory platform that can be used as a dataset for both ML and SE experiments.
There are quite a few challenges in the case of Coq which I think may apply to OCaml as well, as the two indeed share a lot of history w.r.t. tooling; I'd recommend this paper for an overview of the Coq case: https://drops.dagstuhl.de/storage/00lipics/lipics-vol268-itp2023/LIPIcs.ITP.2023.26/LIPIcs.ITP.2023.26.pdf. But basically, researchers need uniform tooling with a concrete API for experimentation, which mostly involves: a) changing code, b) compiling code very fast, and c) getting typing or other kinds of information in a usable format.