PGOCaml with OPAM

Hi all,

I have a repo with some modules requiring PG’OCaml.

If I build it manually from its directory with dune build @install, then everything goes smoothly.

But if I try opam install . --yes, then the PG'OCaml PPX can't connect to the database. The exact error is the following:

Error: PG'OCaml PPX error: Could not make the connection host=, port=5432, user=m, password=*****, database=sdb, error: PGOCaml_generic.Make(Thread).Error("PGOCaml: Could not connect to database

The DB connection parameters in both cases are provided by environment variables and look correct.

I’d be grateful for any suggestions where to look :slight_smile:


opam builds are sandboxed and the build phase can't access the network. The sandbox can be disabled with opam init --disable-sandboxing; I'm not sure there's a more fine-grained setting.
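For an existing opam root, re-initializing without the sandbox looks like this (opam 2.x; note that this weakens build isolation for every package you install afterwards):

```shell
# Re-initialize the current opam root without the bubblewrap sandbox.
# This removes the sandbox wrapper hooks from the opam configuration,
# so all subsequent builds run unsandboxed.
opam init --reinit --disable-sandboxing
```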


While TCP sockets are not available in the sandbox, Unix domain sockets are. So, if your PostgreSQL database is also listening on one, you could use it instead.
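A rough sketch of what that could look like with the usual PG* environment variables. The socket path is distro-dependent (Debian/Ubuntu commonly use /var/run/postgresql), and you should check PG'OCaml's documentation for exactly how it selects a Unix socket; treat every value below as illustrative:

```shell
# Illustrative only: a host that is a directory path conventionally
# means "connect over the Unix domain socket in that directory"
# rather than over TCP.
export PGHOST=/var/run/postgresql
export PGDATABASE=sdb
export PGUSER=m
opam install . --yes
```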


Thanks, guys. Will try both approaches :slight_smile:

Accessing the database at compile time doesn’t do much and creates a lot of problems unfortunately. I hope we’ll soon have a typed sql library that doesn’t do this.

ppx_rapper can be seen as a typed SQL library. We have annotated queries like:

SELECT @string{link} FROM rss WHERE link = %string{link}

@string indicates a string output. And %string, a string input.

Then ppx_rapper creates a Caqti query. The query is used with labeled arguments (here ~link) and can output tuples or records with adequate types (a third option is outputting data by calling a function with labelled arguments).
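To make the shape concrete, here is a hand-written sketch in plain OCaml (no ppx, no database; all names are hypothetical) of the kind of interface such an annotation yields: one labelled argument per %string{...} input, and a typed result per @string{...} output column:

```ocaml
(* Hypothetical stand-in for the function a ppx_rapper-style annotation
   "SELECT @string{link} FROM rss WHERE link = %string{link}" would
   produce. A real version would execute a Caqti query; this one
   consults an in-memory table so the example is self-contained. *)
let rss_table = ["https://example.org/feed"; "https://ocaml.org/feed"]

(* One labelled argument per %string{link} input; the result carries
   one string per @string{link} output column. *)
let find_link ~link : string option =
  List.find_opt (fun l -> l = link) rss_table

let () =
  match find_link ~link:"https://ocaml.org/feed" with
  | Some l -> print_endline l
  | None -> print_endline "not found"
```

The point of the labelled arguments is that swapping two parameters of the same SQL type becomes a compile error instead of a silent bug.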



I’m really curious how you propose to check type correctness at compile time if the only source of the appropriate information is the DB structure.

Somehow, I don’t see other ways yet.

P.S.: you can parse an SQL file with the DDL, yes, but it’s basically the same approach, only worse.

I don’t get why it would be worse?

With annotated queries like the one above in ppx_rapper, the risk of error is limited, since the types sit right next to the arguments, and with records we can avoid some errors with big tuples. It is an improvement.

I know it is not perfect. Perhaps the best way would be to have a converter between the database schema and Sequoia table descriptions. But I haven’t practiced Sequoia. All I know is that table/type descriptions are localized, and queries are then built upon them. (see GitHub - andrenth/sequoia: OCaml type-safe query builder with syntax tree extension)

Sorry, Frederic, it was a question for Grimson :slight_smile:

Look, the current PG’OCaml approach lets me simultaneously verify that my SQL queries are executable against the current DB structure AND that my code is properly typed to handle the results of those queries.

AFAIU, ppx_rapper lets me do only the latter.

And I really can’t see any other approach that would let me do those two things in one step.


Yes, ppx_rapper only works well if the types match the database; there is no verification. I didn’t know that PG’OCaml checks/infers its types against the database.

I’m really curious, why worse?

From my point of view:

- My local database is not in sync with git, so how can I guarantee the code I’m compiling is valid for the current schema?
- Do I have a way to quickly update/downgrade my DB schema when I switch to a different commit, to ensure consistency?
- Why do I need a local database if I want to compile and exercise a portion of the code that does not access the database (e.g. running some unit tests on a different part of the project)?

A DDL file makes all of this disappear, or easy to address. And I probably have the DDL already, to create & migrate the database as necessary.

What do the “live” checks really buy us?
Once compiled in the CI/CD pipeline, the binary will be pushed to a different machine and run against a different instance of PostgreSQL.
That instance may be in sync or not; we may have missed a migration script. We will see what happens at runtime. :man_shrugging:


Compile time is indeed not the right moment to check conformity with the database. We want to check that at runtime.

Another approach is to have your DDL directly embedded in OCaml. That’s what rel does, which I still haven’t had the time to polish for release. There’s a bit of documentation on how to manage your schema here.

So when you connect to the database, you query the meta tables, reconstruct a runtime representation of the schema, and check it against the one you have. The stuff is fully typed, thanks to a form of runtime types for records. No ppx nonsense around.

You can also ask for the changes between the schemas, which is a less imperative way of thinking about schema changes. Basically, rather than thinking about how to bring your old schema to the new one via a sequence of imperative DDL statements, you devise the new schema you want to work with, then ask for the diff with the old schema, which you can translate into SQL DDL statements. That output still has to be tweaked to devise data migrations, though.

If you do an opam pin add rel, it installs a rel tool which you can use on sqlite3 databases (or sqlite3 .sql files) to see how the library views them, compute diffs between arbitrary sqlite databases, or regenerate their schema (including in OCaml syntax) as seen by rel.
