Context: I’m writing some serialization/deserialization routines for OCaml / Rust interop.
I am looking for a container over bytes where I can efficiently:
- read a u8/i8/u16/i16/u32/i32/f32
- write a u8/i8/u16/i16/u32/i32/f32
- everything is little endian
- also needs to work under jsoo
The closest I have found so far is: Buffer (base.Base.Buffer)
but it’s lacking the read/write ops.
Suggestions?
I think Buffer and Bytes provide this, but they only allow writing a char (that is, a u8). It is perfectly possible to write two of them to get a u16, but it might not be efficient enough.
I see, so basically I should expect to roll my own conversions for:
(u8, u8) <-> i16, u16
(u8, u8, u8, u8) <-> i32, u32, f32
? (I’m surprised these are not builtins.)
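For reference, the hand-rolled conversions being asked about can be sketched like this. This is only an illustration (the function names `get_uint16_le`, `set_uint16_le`, and `get_f32_le` are mine, not from any library); it uses only `Bytes.get`/`Bytes.set` and `Char.code`, so it works on any OCaml version:

```ocaml
(* Little-endian u16 from two raw bytes. *)
let get_uint16_le b i =
  Char.code (Bytes.get b i) lor (Char.code (Bytes.get b (i + 1)) lsl 8)

(* Sign-extend the unsigned 16-bit value into OCaml's native int. *)
let get_int16_le b i =
  let u = get_uint16_le b i in
  if u land 0x8000 <> 0 then u - 0x10000 else u

let set_uint16_le b i v =
  Bytes.set b i (Char.chr (v land 0xff));
  Bytes.set b (i + 1) (Char.chr ((v lsr 8) land 0xff))

(* f32: assemble the 32 bits into an int32, then reinterpret the bits
   as a single-precision float. *)
let get_f32_le b i =
  let byte k = Int32.of_int (Char.code (Bytes.get b (i + k))) in
  let bits =
    Int32.logor (byte 0)
      (Int32.logor
         (Int32.shift_left (byte 1) 8)
         (Int32.logor
            (Int32.shift_left (byte 2) 16)
            (Int32.shift_left (byte 3) 24)))
  in
  Int32.float_of_bits bits
```

As the replies below note, recent stdlibs ship these already, so the manual versions are mainly useful on older compilers.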
The stdlib’s Buffer and Bytes do have efficient read/write for sized ints since 4.08.
This is where you want to look up the docs.
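For the write side, a minimal sketch using the `Buffer.add_*` sized-int functions from OCaml >= 4.08 (f32 goes through `Int32.bits_of_float`, since there is no dedicated float32 writer):

```ocaml
(* Encode a u8, an i16, and an f32, all little-endian, into a byte
   sequence via Buffer. *)
let encoded =
  let buf = Buffer.create 16 in
  Buffer.add_uint8 buf 0x2A;
  Buffer.add_int16_le buf (-1000);
  Buffer.add_int32_le buf (Int32.bits_of_float 3.25);
  Buffer.to_bytes buf

(* Read the values back with the corresponding Bytes accessors. *)
let () =
  assert (Bytes.get_uint8 encoded 0 = 0x2A);
  assert (Bytes.get_int16_le encoded 1 = -1000);
  assert (Int32.float_of_bits (Bytes.get_int32_le encoded 3) = 3.25)
```

Note 3.25 is exactly representable in single precision, so the f32 roundtrip compares equal.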
I see, are
val get_uint8 : bytes -> int -> int
get_uint8 b i is b's unsigned 8-bit integer starting at byte index i.
Since 4.08
val get_int8 : bytes -> int -> int
get_int8 b i is b's signed 8-bit integer starting at byte index i.
Since 4.08
val get_uint16_ne : bytes -> int -> int
get_uint16_ne b i is b's native-endian unsigned 16-bit integer starting at byte index i.
Since 4.08
val get_uint16_be : bytes -> int -> int
(and the corresponding set_*)
the functions you are referring to ?
Yes. Your other option is linear bigarrays of bytes (I don’t think you need a tutorial, the API docs seem sufficient to me, follow the types, it’s just arrays). But the integer primitives are not exposed on those.
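Putting those accessors together, here is a small roundtrip sketch on a pre-allocated Bytes buffer; these functions are plain stdlib code over bytes, so they should also work under js_of_ocaml:

```ocaml
(* Write u8 / i16 / i32 / f32 at fixed little-endian offsets
   (OCaml >= 4.08). f32 is reinterpreted via Int32.bits_of_float. *)
let b =
  let b = Bytes.create 11 in
  Bytes.set_uint8 b 0 0xAB;
  Bytes.set_int16_le b 1 (-2);
  Bytes.set_int32_le b 3 123_456_789l;
  Bytes.set_int32_le b 7 (Int32.bits_of_float 1.5);
  b

(* Read everything back. *)
let () =
  assert (Bytes.get_uint8 b 0 = 0xAB);
  assert (Bytes.get_int16_le b 1 = -2);
  assert (Bytes.get_int32_le b 3 = 123_456_789l);
  assert (Int32.float_of_bits (Bytes.get_int32_le b 7) = 1.5)
```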
You’ll find quite a few Big{bytes,string} modules lying around; here’s one you could get started from. I think a Bigbytes module should be added to the stdlib, but I’m waiting to see what happens with bytes in OCaml 5 before proposing one (if we can get non-movable bytes, then a Bigbytes module would become redundant on 64-bit platforms).
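A hypothetical Bigbytes-style starting point could look like the sketch below: a one-dimensional bigarray of chars with hand-rolled little-endian accessors, since, as noted above, the stdlib's sized-int primitives are not exposed on bigarrays. (The names `bigbytes`, `create`, and the accessors are mine; under jsoo, bigarrays are, as far as I know, backed by JavaScript typed arrays.)

```ocaml
(* A 1-D bigarray of bytes. *)
type bigbytes =
  (char, Bigarray.int8_unsigned_elt, Bigarray.c_layout) Bigarray.Array1.t

let create n : bigbytes =
  Bigarray.Array1.create Bigarray.char Bigarray.c_layout n

(* Manual little-endian u16 accessors, composed from single chars. *)
let get_uint16_le (b : bigbytes) i =
  Char.code b.{i} lor (Char.code b.{i + 1} lsl 8)

let set_uint16_le (b : bigbytes) i v =
  b.{i} <- Char.chr (v land 0xff);
  b.{i + 1} <- Char.chr ((v lsr 8) land 0xff)
```

The wider accessors (i32, f32) would follow the same pattern, widening through Int32 as needed.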