How fit is OCaml after 3 decades?

Hello everyone :slight_smile:

I (a novice, inexperienced) would like to ask the makers of and contributors to OCaml for their take on rewriting it:

  • Why doesn't this happen?
  • If you were to do it from scratch, how would you go about it, and what would you do differently? (I have read glimpses of that everywhere, but only in scattered form.)

My question takes its question mark from two accumulative patterns that I see as universal plagues:

  1. Entropy
  2. Infinitesimal calculus (integration of diffs -> continuity; summing near-zeroes -> divergence)

Has OCaml decayed or aged well? Has entropy won out and settled into the compiler?
Do infinitesimal (local) decisions preserve the status quo (some invariant), good or bad, in the long run?
How is OCaml immune to these torments?

I guess I'd like to know how resilient OCaml has been to the dents of time and to its childhood mistakes. And what do you think of integrating design errors (if at all) over infinite open time intervals (say OCaml lasts 300 more years)? It's getting pretty existential now, but is OCaml designed with immortality in mind?

I recommend that you focus on a single question and clarify your expectations.

1 Like

It already happened. OCaml is descended from at least two complete rewrites. See Caml - Wikipedia

2 Likes

And it is happening one more time, with the Multicore OCaml work progressing.

3 Likes

There is one question in the title; the ten or so synonymous phrasings that follow are meant to open a themed perspective.

  • Caml Light was apparently an optimization step, heck, a bare-metal rewrite.
  • Caml Special Light is the kind of rewrite I referred to: a construct inclusion.
  • Objective Caml, another structure-inclusion rewrite.
  • Multicore might be an optimization rewrite. I hadn't thought about it! I had the language alone in mind, away from the architecture lowerings.

I think of the 1st and 4th as canonical lowerings (virtual -> metal), and the 2nd and 3rd as eloquence evolutions. I remain unable to find (virtual -> virtual) embeddings; what I have in mind is integrating higher-kinded types into OCaml's fabric.
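
To make the higher-kinded part concrete: OCaml has no type variables that range over type constructors, so code that wants to abstract over "any 'a t with a map" is usually written against a module signature and instantiated through a functor. Below is a minimal sketch of that encoding; the names FUNCTOR, ListF, OptionF and DoubleAll are purely illustrative, not anything from the standard library (the defunctionalization used by the higher library is another known route).

```ocaml
(* Emulating abstraction over a type constructor with a module signature.
   Everything here is illustrative; only List.map and Option.map come from
   the standard library. *)
module type FUNCTOR = sig
  type 'a t
  val map : ('a -> 'b) -> 'a t -> 'b t
end

module ListF : FUNCTOR with type 'a t = 'a list = struct
  type 'a t = 'a list
  let map = List.map
end

module OptionF : FUNCTOR with type 'a t = 'a option = struct
  type 'a t = 'a option
  let map = Option.map
end

(* Code generic over the constructor 'a F.t is written as a functor,
   instead of quantifying over the constructor directly. *)
module DoubleAll (F : FUNCTOR) = struct
  let double_all c = F.map (fun x -> x * 2) c
end

let () =
  let module D = DoubleAll (ListF) in
  assert (D.double_all [ 1; 2; 3 ] = [ 2; 4; 6 ]);
  let module D = DoubleAll (OptionF) in
  assert (D.double_all (Some 21) = Some 42)
```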

I posted this question after roughly 90 hours in a row of reading issues, PRs, and discussions. I started to pick up meta-patterns (my bad!) and vibes.

But that led me to ponder this for a while. I came to the conclusion that I am a naive newbie, for one! (Introspecting the project led me to personal introspection, aha, reflexivity kicking in.)

I am happy one can coin the idiom “infinitesimal reasoning/logics”.

I observed an (inflation phase ~> semantic compression phase) cycle and a kind of error correction that is always online. That is, many incremental steps followed by a fold/factorization step. All this minor/major tidying up reminds me of simulated annealing (the multi-resolution bit).

That raises this question: does implementing OCaml's major features in a different order (a permutation) change its final form? So far it has been done in historical order. If you fast-forward one century and ask how to add 100 features, you don't just randomly plug and patch core ML lightly, I think.

Anyway, this is too meta and beyond my concern at this stage (I can't do anything about it today). I have hopefully understood that OCaml is not its grammar; rather, it is the embodiment of a community, which kind of answers all of my questions about its evolution. Therefore, OCaml is alive, conscious, and has feelings.

1 Like