I’m setting up a private OPAM repository on an authenticated GitLab instance. I’ve set up the Git repository and added it to my local switches; so far, so good. I’m planning on having GitLab CI build the tarballs, so everything should run smoothly.
However, every time OPAM needs to interact with the GitLab repo, it prompts me for my credentials. Is there a way to store them somewhere?
The following post seems similar, but no definitive answer was given.
I solved it by creating a custom download command that uses a token to retrieve tarballs from GitHub, which is how the OPAM repository cache is built.
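For reference, opam can be pointed at such a command through the OPAMFETCH variable (or the download-command field of the global config); opam substitutes %{out}% and %{url}% when it invokes it. A minimal sketch, where the wrapper path is just a placeholder:

```sh
# Point opam at a custom fetch wrapper (path is a placeholder);
# opam fills in %{out}% and %{url}% when it runs the command.
export OPAMFETCH="$HOME/bin/opam-private-fetch %{out}% %{url}%"
```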
The repo that people add to their development machines is on an HTTP server that is only accessible within our network, but without authentication.
This looks very promising. It seems to work for repository updates; I still have to give it a try for package installs.
What’s more, this is working great with GitLab, since it supports per-user private tokens for authentication. That way, I don’t have to write my actual password in ~/.netrc, and I have control over what the token gives access to.
Maybe this script will help you a little. It covers the steps necessary to fetch private tarballs from GitHub (it is used as the OPAMFETCH command); adapting it for GitLab should be feasible.
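For the record, here is a rough sketch of what such a wrapper might look like for GitHub; the GITHUB_TOKEN variable name is an assumption, and a real script would likely need more error handling:

```sh
#!/bin/sh
# Hypothetical OPAMFETCH wrapper: opam-github-fetch <out> <url>
# Fetches private GitHub tarballs with a token taken from the environment.
set -e
out=$1
url=$2
: "${GITHUB_TOKEN:?set GITHUB_TOKEN to a token with read access to the repo}"
# Release assets fetched through the API may also need
# --header 'Accept: application/octet-stream'.
exec curl --fail --location --silent --show-error \
     --header "Authorization: token $GITHUB_TOKEN" \
     --output "$out" "$url"
```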
That sounds totally doable. The script could look at the host of the URL and, if it matches my GitLab instance, call a custom curl command with my token read from the environment (or a configuration file).
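Something along these lines, where gitlab.example.com and GITLAB_TOKEN are placeholders for my actual instance and token:

```sh
#!/bin/sh
# Hypothetical OPAMFETCH wrapper: opam-fetch <out> <url>
# Only adds credentials for URLs pointing at our own GitLab instance.
set -e
out=$1
url=$2
case $url in
  https://gitlab.example.com/*)
    # Authenticate with a personal access token from the environment.
    : "${GITLAB_TOKEN:?set GITLAB_TOKEN to a read-only personal access token}"
    exec curl --fail --location --silent --show-error \
         --header "PRIVATE-TOKEN: $GITLAB_TOKEN" \
         --output "$out" "$url"
    ;;
  *)
    # Anything else (e.g. the default opam repository) is fetched as usual.
    exec curl --fail --location --silent --show-error \
         --output "$out" "$url"
    ;;
esac
```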
Welp, 4 years later, I decided to give that whole private OPAM repository thing another shot, and I ended up with something I’m quite proud of: a private OPAM repository, with private packages uploaded to GitLab’s package registries!