Lwt is a library for concurrent programming in OCaml. It is developed by the community, and is one of OCaml’s most-used libraries.
Lwt is based around the familiar concept of promises. When you make an I/O request with Lwt, the library gives you a promise. It then runs the request in the background, without blocking the rest of your program. When the request completes, Lwt fulfills the promise, running any handlers you’ve attached. Lwt can process huge numbers of such requests concurrently.
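For instance, here is a minimal sketch of the promise model described above (assuming the lwt.unix package is installed; the exact I/O call is just illustrative). `Lwt_io.read_line` returns a promise immediately, and the handler attached with `Lwt.bind` runs once the line has actually been read:

```ocaml
let () =
  Lwt_main.run
    (Lwt.bind (Lwt_io.read_line Lwt_io.stdin) (fun line ->
         (* This handler runs when the promise is fulfilled. *)
         Lwt_io.printlf "You typed: %s" line))
```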
In the meantime, your code (e.g. the handlers) runs in one thread. This eliminates the need to worry about race conditions, set up locking, or deal with other such complications. You can still spawn threads on an opt-in basis, however.
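As a hedged sketch of that opt-in threading (assuming lwt.unix with preemptive-thread support, and a hypothetical blocking function `expensive`), `Lwt_preemptive.detach` runs the function in a worker thread and hands back an ordinary promise, so the rest of the program keeps running in the main thread:

```ocaml
(* Stand-in for a blocking computation. *)
let expensive () =
  Unix.sleep 1;
  42

let () =
  Lwt_main.run
    (Lwt.bind (Lwt_preemptive.detach expensive ()) (fun n ->
         Lwt_io.printlf "result: %d" n))
```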
Lwt is routinely used in high-concurrency, high-load applications.
Lwt resources and community
Lwt welcomes users and contributors of all levels of experience!
Please ask any questions you may have. You'd be surprised how often simple questions help other users and educate the maintainers. And if you want to take the plunge and help Lwt and the broader OCaml community, we'd love to gently help you get started as a contributor.
To reach us:
- Reply to this very topic, or open another one on discuss.ocaml.org!
- Open issues in the Lwt repo.
- The Lwt Gitter has many frequent contributors lurking.
- Ask on Stack Overflow.
- The #ocaml channel on freenode is a good place to find experienced Lwt users.
Documentation:
- The manual can be found here. Sorry about its state – we are currently writing a new one!
- Lwt tutorial from Mirage.
- Concurrent Programming with Lwt by @dkim is a nice collection of examples, especially useful for those reading about concurrency in Real World OCaml.
- Simple Lwt server by @dmbaturin.
- A comparison of asynchronous programming in Lwt and Python by @talex5.
Announcements:
- Here, in the Lwt category of discuss.ocaml.org.
- In the Lwt announcements issue, which you can subscribe to.
- On the OCaml mailing list.
- In /r/ocaml.
If you’d like to follow development more closely, watch the repo on GitHub.
And if you are looking for something helpful but not too challenging to get started contributing with, we maintain a list of easy issues.
Happy concurrent programming!