Dear OCaml Community, following our previous decision to conclude the Owl project, we have been touched by the supportive and encouraging feedback from all of you. After thorough discussion, Liang @ryanrhymes and I believe it is in the best interest of the OCaml community to continue maintaining a solid numerical computing library. Consequently, I, Jianxin, will assume the role of project leader to ensure Owl remains maintained. Our goal is to keep Owl stable and up to date, given the very limited resources we have, as explained in our previous announcement. At a minimum, we aim to keep Owl compatible with the latest stable version of OCaml.
As mentioned previously, our availability to dedicate time to Owl is limited. Achieving our objectives will require collective effort. Thus, I am looking to assemble a team of contributors eager to support both development and maintenance tasks. For details on our plans and how you can contribute, please refer to the project’s README file.
If you’re interested in joining the Owl team, taking on a specific part of the codebase, or if you have any questions, do not hesitate to contact me here or via email.
The Owl tutorial book has also been redesigned to make it more maintainable. Due to a change of template, some parts may not be well formatted yet. You are welcome to take a look and help improve it if you are interested.
Great to hear that. I am sure Owl will catch up to the Python scientific ecosystem soon.
A few things I would like to say:
I think of Owl as the NumPy of OCaml combined with automatic differentiation, which is great for differentiable array programming, and I think most of the work should go into making it much faster and more stable.

Although NumPy is great, it lacks some features needed for deep learning. Even if the goal is to run on a commodity CPU rather than a GPU, there is still no support for model compression such as quantization, and to my understanding the underlying computational kernels such as OpenBLAS don't support dtypes like bfloat16, float8, or int4. (There is a JAX-related project called ml_dtypes which provides support for bfloat16, float8, etc., but it doesn't support automatic higher-precision accumulation by default.) That's why Meta and Google came up with their own computational kernels supporting reduced-precision operators, such as FBGEMM and XNNPACK.

So I think Owl should focus only on providing NumPy + SciPy-like support for OCaml, and for deep learning we should start a new project, separate from Owl and built from the ground up as an owlbarn project (I feel that augmenting Owl's ndarray would make it less clean and less stable). There is still a lot of functionality that needs to be added to Owl itself, but keeping a deep learning library as a separate project would be better.
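To illustrate the "NumPy + automatic differentiation" combination mentioned above, here is a minimal sketch using Owl's documented Algodiff module (assuming Owl is installed via `opam install owl`; the numeric value in the comment is just the analytic derivative, not a claim about formatting):

```ocaml
(* Minimal sketch of Owl's autodiff over its numerical core.
   f(x) = x^2 + sin x, so f'(x) = 2x + cos x. *)
open Owl

let f x = Algodiff.D.Maths.(x * x + sin x)

let () =
  let open Algodiff.D in
  let x = F 0.5 in
  let dfdx = diff f x in
  (* f'(0.5) = 1.0 +. cos 0.5, i.e. roughly 1.8776 *)
  Printf.printf "f'(0.5) = %f\n" (unpack_flt dfdx)
```

This is exactly the "differentiable array programming" style the post refers to: `diff` lifts an ordinary OCaml function over `Algodiff` values into its derivative, with the ndarray core doing the numerical work underneath.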
I would really like to contribute to both Owl and owlbarn.
First and foremost, I want to express my sincere appreciation for Jianxin’s willingness to take the helm and drive this project forward. Having worked closely with him for years, I have full confidence that he will lead Owl to new heights, paving the way for a more sustainable future.
Secondly, I urge everyone interested to actively participate in the project. Whether it’s contributing features, refining code, fixing typos, sharing ideas, or enhancing documentation—every contribution, big or small, will be immensely valuable to the entire community. Let’s work together to ensure Owl thrives and benefits us all.
I can't make a firm commitment at present, but one thing I can do is port the latest pytorch/examples to Owl if possible. I have been doing this at a slow pace for ocaml-torch as well. Usually I don't like to announce plans in case I can't finish them, but perhaps it's appropriate to mention it here.
Thank you all so much for the very constructive feedback. @mtk2xfugaku has a great idea about starting a deep learning project based on Owl; @arbipher offered to help build up more examples. I think these are great ideas. Please feel free to open an issue on GitHub for discussion, submit a PR, or start a new project in owlbarn.
@lambda_foo thanks for mentioning this issue. I'm currently setting up a test procedure for installing Owl, and I'll look into this issue as soon as that's done.
If it's officially dead, is there an alternative out there? I'm writing a library that depends heavily on Owl, but I'm starting to worry that it's abandonware and that the correctness issues I'm facing may never be solved.
It wouldn't be surprising if it's no longer maintained. As far as I know, no one is being paid to work on it. If there is real interest, of course, someone (or some company) could step up, but it would also take time for a new OSS maintainer to build trust and credibility, especially for a project as important as Owl.
Thanks for your attention to the project's progress. I'm glad to tell you that we are still maintaining it. As you can see from the Owl repo's commit history, we have kept revising the Owl code base, and just a week ago we fully revised the existing example code. However, due to the lack of manpower, we can only focus on certain tasks at a time and reply to issues on a best-effort basis.
Since you opened Issue 674 just an hour ago, I'll first take a look and see what I can do. As for Issue 671, unfortunately we currently support only the four Bigarray kinds, and adding support for the Int type would require some non-trivial coding effort. I really appreciate the temporary workaround you proposed, though!
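For readers following along, the "four types" in question are the four Bigarray kinds that Owl's dense ndarray modules are specialized for: single and double precision real and complex numbers. A small sketch (assuming Owl is installed; `G.zeros` and `G.numel` are part of the documented Generic API):

```ocaml
(* The four Bigarray kinds supported by Owl.Dense.Ndarray.Generic:
   float32, float64, complex32, complex64. Int-kind ndarrays are not
   covered by these modules, hence the workaround discussed below. *)
module G = Owl.Dense.Ndarray.Generic

let () =
  let s = G.zeros Bigarray.Float32   [| 2; 2 |] in   (* single precision  *)
  let d = G.zeros Bigarray.Float64   [| 2; 2 |] in   (* double precision  *)
  let c = G.zeros Bigarray.Complex32 [| 2; 2 |] in   (* complex single    *)
  let z = G.zeros Bigarray.Complex64 [| 2; 2 |] in   (* complex double    *)
  Printf.printf "%d %d %d %d\n"
    (G.numel s) (G.numel d) (G.numel c) (G.numel z)
```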
You are right, I didn't look at the commit history; I just browsed through the open issues, and the lack of replies gave the impression that it wasn't being maintained. Regarding the lack of manpower, I'd be happy to help with fixing trivial issues, but I doubt I'd be much use since I've only been using OCaml for four months and know very little about its internals.
module Ndarray = Owl.Dense.Ndarray.Generic
module Anyarray = Owl.Dense.Ndarray.Any

(* Workaround: transpose a Generic ndarray whose kind the Generic
   transpose does not handle, by copying it element-wise into a boxed
   Any array, transposing there, and copying the result back. *)
let transpose ?axis x =
  let shape = Ndarray.shape x in
  (* copy x into an Any array of the same shape *)
  let y = Anyarray.init_nd shape @@ fun c -> Ndarray.get x c in
  let y' = Anyarray.transpose ?axis y in
  (* copy the transposed result back into a Generic ndarray of x's kind *)
  Ndarray.init_nd (Ndarray.kind x) (Anyarray.shape y') @@ fun c -> Anyarray.get y' c
I suppose this workaround can be generalized to any of the Generic functions that are limited to the four Bigarray kinds?
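One way to generalize the idea (a hypothetical helper, not part of Owl's API, using only the same `init_nd`/`get`/`shape`/`kind` calls as the workaround above) is to factor out the round trip through `Any`:

```ocaml
module Ndarray = Owl.Dense.Ndarray.Generic
module Anyarray = Owl.Dense.Ndarray.Any

(* Hypothetical generalization of the transpose workaround: lift any
   Anyarray -> Anyarray function to operate on a Generic ndarray by
   copying element-wise into Any, applying f, and copying back.
   Costs an O(n) boxed copy in each direction. *)
let via_any f x =
  let y = Anyarray.init_nd (Ndarray.shape x) (fun c -> Ndarray.get x c) in
  let y' = f y in
  Ndarray.init_nd (Ndarray.kind x) (Anyarray.shape y') (fun c -> Anyarray.get y' c)

(* The original workaround then becomes a one-liner: *)
let transpose ?axis x = via_any (Anyarray.transpose ?axis) x
```

The same `via_any` would work for any other `Any`-side operation whose result shape differs from the input, since the output ndarray is rebuilt from `Anyarray.shape y'`.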
I'm a bit surprised that the .Any array representation in Owl does not suffer from the same deficiencies as .Generic. Are the implementations of the functions completely different? Is there a performance difference between the two array types?