Serious OPAM package quality issues



Oh, C++, my bad … or is it? Mr. Weisenheimer is trying to be cute with me :wink: The
libraries might be written in C++, but even if it has been a while for me, I’m pretty
sure the FFI bindings in Python are still in C :smiley: SCNR

So, here I might really be jumping the gun, but that doesn’t sound good either. IMHO, a tool like opam, which really
everyone can or must agree on, is vital for a language ecosystem.

Actually, according to the GitHub page of opam-depext, Arch Linux is supported.


I’m sure I am, since by your own words you tell us that you are not able to articulate it.

Constructive criticism and feedback is always welcome.


Yeah, it looks like Arch is ‘supported’, but it doesn’t really work. See here, here and here.

So all that remains is to add Arch to the CI, and… well, I don’t know if anything but bugging the Arch people can realistically work here.

BTW, referring back to my original post above, the friend I talked about uses Arch, and since aspcud failed, he used the internal solver, often getting stack overflows in the process. I skipped over these details when describing his woes.


Hmm, I recall that the community set up a remote solver[0] for situations like this. But it seems the documentation for this (or something related to it) is out of date[1] – and contains an old link which should be replaced with some reference to [0]? It’s unclear, so hopefully I’m jogging someone’s memory on the history here.

[0] :
[1] :


It seems there are serious issues going on at the moment on Arch; please see this issue.

Regarding the internal solver I’m more and more convinced it should be removed altogether as it is no longer able to operate in any meaningful way on the current state of the opam-repository. I have opened an issue upstream.


Yeah, things are getting out of hand on the Internet fast. You are right, absolutely right,
and I apologize.

And @avsm, as a precaution I apologize to you also; that C/C++ thing was me
trying to be funny.

Hmm! Constructive? That’s going to be difficult, I see now, as you so
accurately point out the flaw in my logic. My words are going to be inadequate.

Once again, I apologize for offending the community.


This thread has derailed from the original conversation way too much, hence I am closing it. Please open a new thread if you want to discuss something.




I am sorry. I may have closed the thread too early, since people started to talk against each other rather than sticking to the original topic. I should ideally have asked to keep the conversation to the point.

I have re-opened the topic. Please keep your comments constructive and actionable.


I am really sorry you had to hit all these problems. I assure you that feedback on opam has been, overall, very good, and that this is not the normal user experience.

Regarding the above: this is a known non-working configuration. We could probably have acted on it more quickly, but it’s not that opam “doesn’t like” the combination you describe, it’s really that Arch did ship a broken package that returns broken results. Since the package solver is at the core of the system, it’s no wonder the whole experience is ruined when it starts behaving in unpredictable ways.


Sorry! My mistake. That simply was my impression after sifting through bug reports and workarounds.

So, I’ll leave it at that. Anything further I had to say would only fuel an already heated discussion.


Just updating to say that 2.0.0~beta4 now has a built-in solver (a proper one, not some half-working heuristics), and should therefore work nicely out of the box, without the runtime dependency on aspcud. There are precompiled, statically linked binaries here (please report any issue with those, or any missing platform you’d be interested in).


Is this topic still under discussion? If so, here are some general thoughts:

  • I am impressed by the automated tests that run on pull requests for the opam-repository. In principle they guarantee quite a high standard.

  • The problems are created by stale packages rather than by updates, which are heavily tested. Packages become stale because their dependencies move on, or a new compiler is released.

  • Could the problem be contained by annotating packages with data gathered from automatic builds rather than relying on maintainers alone? Test results can prove the metadata in an opam file wrong, and I believe we should incorporate this feedback. We could record sets of dependencies that tests have found to be working and sets that are known to be broken. Maybe improved version constraints could be deduced from them, but just having the information available could already be helpful for maintainers.
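As a sketch of how such CI feedback could be recorded: opam 2 reserves `x-`-prefixed extension fields for third-party metadata, so build results could live right in the opam file. The field names, package names, and versions below are invented for illustration, not an existing scheme:

```
# Hypothetical extension fields in an opam file, recording dependency
# combinations that automated builds have actually tried.
# (The "x-ci-*" field names and the versions are made up.)
x-ci-working-deps: [ [ "lwt" "3.1.0" ] [ "lwt" "3.0.0" ] ]
x-ci-broken-deps:  [ [ "lwt" "2.7.0" ] ]
```

A maintainer, or an automated tool, could then tighten the `depends:` constraints based on these recorded observations instead of guessing.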


I agree 100%. This is the biggest problem right now. The OPAM devs have done spectacular work in other areas, including taking care of the solver issue. What remains is continuous integration of old packages. OcamlPro had a server for this task at some point, but it seems like it may be gone. As you said though, it’s not enough just to label the packages that don’t build, or to publish a matrix on a web site. We need to act based on this information.

I think packages that don’t build should enter ‘unmaintained’ state. OPAM should allow you to try and build them, but when you do, for each unmaintained package it should say “I’m sorry – package X hasn’t been updated for OCaml 4.05. Please request the author to update the package at, and post an issue at”

I honestly think that this is the only way to incentivize package authors to keep their packages up-to-date. Many package authors don’t feel the need because they don’t feel like anyone uses their packages – they’re providing a free service, after all. Human communication with users should help nudge them towards maintaining their packages, or at least moving them to github where others can help.

This, btw, is why I don’t think we should make it too easy to publish on OPAM. Publishing is one thing; maintaining is a whole different story, taking a much greater level of commitment. Newbies shouldn’t be encouraged to publish. It’s equivalent to emphasizing quantity over quality.


Perhaps another way of handling the situation is lockfiles, a la Ruby Gems/yarn/and now npm? If opam respects a package’s locked dependency versions, then there’s no risk of getting stale on older versions or breaking on newer versions.

Upgrades would be manual, of course. But reproducible builds would be automatic.
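To illustrate, a lock-style file would essentially be an opam `depends:` field with every dependency pinned to an exact, known-good version, rather than a range (the package names and versions here are made up):

```
# Sketch of a locked dependency field: exact pins instead of ranges,
# so a later solver run reproduces the original working selection.
depends: [
  "ocaml"    {= "4.04.2"}
  "lwt"      {= "3.1.0"}
  "cmdliner" {= "1.0.2"}
]
```

With exact pins the solver has no freedom to drift onto newer, untested versions, which is precisely what makes the build reproducible.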


I don’t know that upgrading per se is a problem – the solver is generally capable enough to know when you can’t upgrade. The problem seems to be getting a rarely used package to install in the first place.


Yes, perhaps because its dependencies have moved on and the rarely-used package has thus suffered from code rot? What if we captured its exact dependency version requirements and installed only those so that we could reliably install it?


That would be useful, but quite a difficult task. We’d basically have to ignore packages’ stated dependency requirements and test combinations of dependency versions on our own. Currently, we’re barely able to test just the latest version of packages.

Even if we did this, though, we’d still have the issue of being stuck with old versions of OCaml to make packages work – not only is it a pain and far from what the user desires, eventually you wind up with no version of OCaml that satisfies all the packages you want to install. At least I think that since OPAM 2 treats compilers like packages, it should be able to do the constraint solving for the user without the user having to try one compiler version after another.

The other issue is that two packages may have a common dependency, but be unable to agree on a common version for said dependency. I don’t know if OPAM has a solution for this.

At the end of the day, while OPAM can preserve dependency information about packages and try to improve on their stated constraints, there’s no replacement for package maintainers doing the job of keeping their packages up to date. Releasing a package into OPAM without the will to maintain it may satisfy the hunger of growing our ecosystem, but it’s empty calories.


When a package is updated, all (or the most popular subset) of its reverse dependencies could be built. Those that break lack a version constraint that limits the version of the package to the old version. This information could be added to the reverse dependency.

Over time, an unmaintained package would become so constrained that it basically never installs anymore.
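Concretely, if CI discovered that a reverse dependency stops building once a dependency releases version 2.0, the repository could amend that package’s metadata with an upper bound (the package name and versions are hypothetical):

```
# Before the breakage, the dependency was unconstrained:
#   depends: [ "foo" ]
# After CI observes a build failure against foo 2.0, an upper
# bound is added to the reverse dependency's opam file:
depends: [
  "foo" {< "2.0"}
]
```

Each such automatic edit narrows the set of installable configurations, which is how an unmaintained package would gradually constrain itself out of existence.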