Packaging is about software deployment, which is to say making a software system available for use. Building is about constructing runnable programs from source code (together with whatever dependencies they need). These two things are related, but different. For example, interpreted languages like Python have package managers, but (arguably) there is no “building”: you just install new source code and run it. At the other extreme, binary package managers install pre-built artifacts, so while building is required to create the package, the package manager itself doesn’t involve a build phase. In source-based package managers, the border is easily blurred, since deploying some software means getting its source code onto the target machine and building it there. It is common for language-specific tooling to combine source-based package management and the build system into one (or at least to wrap them into a single tool for end users), but they are logically distinct areas of concern.
If you look in your `.opam` files, you will see an entry like `build: ["dune" "build" ...]`. This tells opam how to build the source of the package, and opam uses this instruction as part of its installation process. You could replace this with another build tool, like `ocamlbuild`, or with just an invocation of `ocamlc` directly! But standard practice is now generally to use `dune`. (You could also package up OCaml code using a different package manager, like Nix or pacman, in which case you might use dune but not opam.)
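For concreteness, the build instruction in a dune-based package’s `.opam` file typically looks something like this (the exact arguments vary between projects):

```
build: [
  ["dune" "build" "-p" name "-j" jobs]
]
```

Here `name` and `jobs` are opam variables that opam substitutes when it runs the build.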
I suspect this isn’t true: if you remove the `.opam` file in question, dune will probably still build fine, assuming all the dependencies are already installed. What opam files are good for, however, is providing a portable definition of all the OCaml dependencies needed by a package. This lets you pull the source code down onto a new machine (whether real or virtual) and just do `opam install .` to have all the dependencies installed and the package built. (This is just a special case of “configuration as code”.)
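As a sketch of that workflow on a fresh machine (the repository URL is just a placeholder):

```
git clone https://example.com/my-project.git
cd my-project
opam install . --deps-only    # install only the dependencies declared in the .opam file(s)
opam install .                # ...or install the dependencies and the package itself
```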
See https://dune.readthedocs.io/en/stable/dune-files.html#generate-opam-files and https://dune.readthedocs.io/en/stable/opam.html#generating-opam-files for how to configure dune to generate opam files. Let me know if you have any follow-up questions about making use of that configuration.
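In case it helps, here is a rough sketch of what that configuration looks like in a `dune-project` file; all of the names and version constraints below are just placeholders:

```
(lang dune 3.0)
(name myproject)
(generate_opam_files true)

(authors "Your Name")
(maintainers "you@example.com")
(license MIT)

(package
 (name myproject)
 (synopsis "A short description of the package")
 (depends
  (ocaml (>= 4.14))
  cmdliner))
```

With this in place, running `dune build` regenerates `myproject.opam`, so the opam metadata stays in sync with the project definition.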
I’m not familiar with Java land, so I’m afraid I can’t help with a comparison. But I can explain how these three files relate:
`.opam` specifies a package for `opam`. A package includes some metadata about the project (like its name, the author, etc.) as well as a list of other opam packages required as dependencies. It also includes instructions on how to build and install the package.
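A hand-written `.opam` file expressing that metadata and those dependencies might look roughly like this (names and constraints are illustrative):

```
opam-version: "2.0"
synopsis: "A short description of the package"
maintainer: "you@example.com"
authors: ["Your Name"]
depends: [
  "ocaml" {>= "4.14"}
  "dune" {>= "3.0"}
  "cmdliner"
]
```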
`dune-project` specifies a software project that will be built with dune. A project can consist of one or more software components (libraries, tests, executables, data). It only supports a few pieces of metadata, and was recently extended to also enable generation of `.opam` files. If you enable this (as described above), you won’t need to think too much about the `.opam` file any more. See https://dune.readthedocs.io/en/stable/dune-files.html#dune-project for documentation on this file.
`dune` files specify components that can be built by dune. They describe the libraries that a component relies on, any preprocessors it uses, what executable or library it builds to, etc. See https://dune.readthedocs.io/en/stable/dune-files.html#dune for documentation on these files.
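For example, a `dune` file for a single library component might look something like this (the library and dependency names are placeholders):

```
(library
 (name my_lib)
 (libraries lwt yojson)
 (preprocess (pps ppx_deriving.show)))
```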
The reason for the distinction between opam dependencies and library dependencies in a `dune` file can be understood by considering this kind of example: an executable and a library might both be defined in the same package, but they don’t necessarily share the same dependencies. The executable may need a CLI-parsing library, but the internal library in the package won’t require it, so the CLI-parsing library ends up in the `.opam` file, but it only shows up in the `dune` file for the executable, not for the internal library.
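Here is a minimal sketch of that situation, with placeholder names and paths; the package’s single `.opam` file would list both `yojson` and `cmdliner` under `depends`, while the `dune` files split them between the two components:

```
; lib/dune — the internal library doesn't need CLI parsing
(library
 (name my_lib)
 (libraries yojson))

; bin/dune — only the executable depends on the CLI-parsing library
(executable
 (name main)
 (public_name mytool)
 (libraries my_lib cmdliner))
```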
If we started over from scratch, we’d probably have fewer moving parts to this whole system, but thankfully they are all converging into a nicely integrated set of tools, and things are improving all the time.