Structuring FRP (specifically Note) applications

Hi, I’ve recently been playing around with Note and Brr_note on a toy project of moderate complexity. I’ve run into issues with structuring my signals and events, and have found myself questioning whether the structure I’ve chosen is right.

The root of the application is an S.fix node:

    S.fix initial_state (fun state -> 
      (* ... *)
      state, state
    )

The signal state captures the main state of the application.

Now, within the fix, based on the state, I construct various UI elements, which may themselves emit events:

let new_project_elt, (new_project_evt : [> `NewProject] event) =
           button "New project" `NewProject in
...
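(where button is a small helper along these lines, built on Brr_note’s Evr.on_el - a sketch, the exact markup differs:)

let button label tag =
  let el = El.button [ El.txt' label ] in
  el, Evr.on_el Ev.click (fun _ -> tag) el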

In order to update the state, rather than modifying the data directly, the components emit events that capture the information needed to enact the desired transformation at a later point.

Then, finally, right at the end of the fix, I merge all the emitted events using a select:

let all_evt = E.select [new_project_evt; ... ] in

Sometimes one event may trigger another - for example, a LoadProject emitted by a component may require asynchronously loading data from some separate service. In this case, we can bind events:

let load_data_evt =
  all_evt
  |> E.filter_map (function `LoadProject url -> Some url | _ -> None)
  |> Fun.flip E.bind load_data_from_url in
(* merge it back into the all_evt stream: *)
let all_evt = E.select [
    load_data_evt;
    E.filter (function `LoadProject _ -> false | _ -> true) all_evt
  ] in

Then, in order to update the state of the application based on this event, I sample the state signal, calculate the new state, and swap the signal with a constant one that always produces the updated value:

S.sample state ~on:all_evt (fun state ev -> 
      match ev with  
      | `NewProject -> set_new_project state 
      | ... )
|> E.map (fun v -> S.const v)
|> S.swap state

So far, this works pretty well. However, one limitation is that events must be resolved within one “loop” of the S.fix and can’t influence subsequent iterations. This means I have to be very careful about the order in which events are handled, or I end up with subtle bugs, which seems to suggest that I’m doing it wrong.

As an example of the kind of problem this causes, suppose that rather than just loading projects, we also want to ask the user whether they want to save their current project before loading a new one - in this case, we now have to handle events in a particular order:

(* first wrap load project if state is dirty *)
let load_project_evt =
  S.sample ~on:load_project_evt state (fun state ev ->
    if state_is_dirty state
    then `AskSaveThen ev
    else ev) in

(* now evaluate load project *)
let load_data_evt =
  load_project_evt
  |> E.filter_map (function
       | `LoadProject url -> Some url
       | _ -> None)
  |> Fun.flip E.bind load_data_from_url in
let other_evts =
  load_project_evt
  |> E.filter (function `LoadProject _ -> false | _ -> true) in

(* merge back into the main event stream *)
let all_evts = E.select [load_data_evt; other_evts] in

The problem is that now my events have a strict dependence on the order in which they must be run, and if I were to swap them around, the program would be subtly incorrect (managing this would be slightly less of a hassle if events were allowed to flow across loops).

For just these two cases, the structure and reasoning are simple, but as the number of dependencies increases, tracking them becomes a hassle and makes bugs more likely, which suggests that my structure is incorrect.

Has anyone else run into this problem? Or do you use a different structure for such projects?


I think that your state should include a status telling whether the application is Loading or Running:

type t =
  { status : [`Loading | `Running ]
  ; data : …
  }

As long as you do not store functional values inside the state, you can add as many elements as you want (if you do want such elements in the state, you have to provide your own compare function in order to avoid errors at runtime).
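For example, a rough sketch (the field names are purely illustrative):

type t =
  { status : [ `Loading | `Running ]
  ; on_done : unit -> unit  (* a functional value *)
  }

(* structural comparison of closures raises at runtime, so only compare the
   non-functional parts and pass this wherever the state gets compared, e.g.
   the ?eq argument of the [run] helper below or of S.accum *)
let eq a b = a.status = b.status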

Then, here is how I run my event loop: each event is handled inside its own module, and I avoid a monolithic pattern match:

(** The Make module builds the main application loop.

    The function [run] updates the state on each event and returns the new
    state as a signal. Each event must follow the [event] type, which pairs a
    value of type [t] with a module providing an [update] function.

    This example creates an application whose state contains a simple counter.
    Events which increment or decrement this counter are created and can be
    used to update the state.


    {[
        type state = { value : int }

        (** Increment the state. *)
        module Incr = struct
            type t = unit

            let update () state = { value = state.value + 1 }
        end

        (** Decrement the state. *)
        module Decr = struct
            type t = unit

            let update () state = { value = state.value - 1 }
        end

        module App = Make(struct type t = state end)

        (* Create the events; [incr_clicked] and [decr_clicked] are assumed to
           be [unit Note.event]s, e.g. button clicks *)
        let incr_event = E.map (fun () -> App.E ((), (module Incr:App.Event with type t = Incr.t))) incr_clicked
        let decr_event = E.map (fun () -> App.E ((), (module Decr:App.Event with type t = Decr.t))) decr_clicked

        let init = { value = 0 } in

        (* Run the main loop *)
        let state = App.run
          init
          (E.select
            [ incr_event
            ; decr_event ] ) in …
    ]}

*)
module Make(S:sig type t end) = struct
  module type Event = sig

    type t

    val update: t -> S.t -> S.t

  end

  type event = E : 'a * (module Event with type t = 'a) -> event

  (** Simple helper for the main event loop *)
  let run
    : ?eq:(S.t -> S.t -> bool) -> S.t -> event Note.E.t -> S.t Note.S.t
    = fun ?eq init event ->
      let action = Note.E.map (fun (E (t, (module Event))) st -> Event.update t st) event in
      Note.S.accum ?eq init action
end
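To plug this into Brr_note, a DOM interaction just has to be turned into a Note event carrying one of these [event] values; a rough, illustrative sketch of where the incr_clicked event in the example above could come from (incr_button stands for some El.t button):

let incr_clicked : unit Note.event =
  Evr.on_el Ev.click (fun _ -> ()) incr_button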

I have applied this pattern in different applications, and it works fine. I found the inspiration in this post: The shape design problem - #26 by kantian

Hope this helps 🙂


I see - yes, this is a pretty nice way of avoiding the match statement, although I’m not sure it’ll work for my case, in particular because the events in question seem to be static (or at least not dependent on prior events):

In the above code, the event being accum-ed over is not dependent on the state.

Also it doesn’t seem like you could have an event that then triggers another event in this structure.

I guess with a loading field like this you can manually capture the loading of a future, but I wonder if there’s a more ergonomic way to combine it with the events - it seems a shame to do the polling manually when Brr_note has functions like Futr.to_event.

I didn’t have the time to go through the details of your question, but I suspect you are trying to handle side effects via effectful events/signals, which is a bad idea that React (the FRP library) taught me. Keep your FRP graph free from any side effect and use Note loggers to perform your side effects.

In particular, what is the shame here?

This is the mechanism you should use to handle interactions with asynchronous computations.

As far as I am aware, I’m not making use of any effectful signals - at least there are no imperative parts in my code.

To be specific, the shame was in reference to the structure proposed by @Chimrod, which seemed to suggest using a specific loading field and then manually polling the future.

Currently, in my design, I do use Futr.to_event, and this is the specific cause of my issues, because events can trigger other events:

E.filter_map (function `LoadingProject -> Some () | _ -> None) ev
|> Fun.flip E.bind (load_data_from_url : unit -> [> `Data of _ ] event)

At the root of my project, I have a single S.fix which provides the signal that represents the state of my application - based on this (functional/pure) state, I create components and events, and then, as the last operation in the fix, I merge all the generated events and do a single update of the state signal.

The problem is that because the chain of influence for an event must be contained within the fix (i.e. every event must be merged before the final update of the state signal), I end up with code that depends on the order in which signals are handled (by order, I don’t mean the evaluation order (i.e. due to some hidden mutable state), but rather their order in the signal graph).

I don’t understand what you are saying here or what order you are talking about. There is no such thing as an “order in which signals are handled”; there’s no “event resolution” or “handling” or “running of events”. In an update cycle everything happens at the same time (the synchrony hypothesis). You may want to go through React’s basics.

If you can make a conceptual, self-contained example of what you are trying to do, I could have a look.

Right, it could be that my choice of wording was poor here - maybe it’s easier to explain with an example, as you suggested.

Here’s a small example of the structure I’m working with:

open Brr
open Brr_note
open Note

let load_data = function
  | `Incr ->
    (* pretend this does some asynchronous computation *)
    Fut.return (`Data 1) |> Futr.to_event

type state = {data_count: int}
let initial_state = {data_count=0}

let build_ui_component (count: int signal) tag =
  let button = El.button [] in
  let ev = Evr.on_el Ev.click (fun _ -> tag) button in
  El.div [ button ], ev
  
let update state = function
  | `Data v -> {data_count=state.data_count + v}
  | `Reset -> {data_count=0}
  | _ -> state

let application =
  S.fix initial_state (fun state ->
    let count = S.map ~eq:Int.equal
                  (fun v -> v.data_count) state in

    let incr, ev1  = build_ui_component count `Incr in
    let reset, ev2  = build_ui_component count `Reset in

    let all_events = E.select [ev1;ev2] in

    let all_events =
      (* in order to evaluate incr events, we must call load_data *)
      (* first split out incr events *)
      let incr, other_than_incr =
        all_events 
        |> E.filter_map (function `Incr -> Some `Incr | _ -> None),
        all_events |> E.filter_map (function `Incr -> None | v -> Some v) in
      (* whenever incr occurs,  it triggers a subsequent load data event  *)
      let data = E.bind incr load_data in
      (* merge events together  *)
      E.select [data; other_than_incr] in

    (* final state update *)
    let state =
      S.sample ~on:all_events state update
      |> E.map (S.const)
      |> S.swap state in

    state, (state, [incr;reset])
  )

The key snippet here is:

    let all_events =
      (* in order to evaluate incr events, we must call load_data *)
      (* first split out incr events *)
      let incr, other_than_incr =
        all_events 
        |> E.filter_map (function `Incr -> Some `Incr | _ -> None),
        all_events |> E.filter_map (function `Incr -> None | v -> Some v) in
      (* whenever incr occurs,  it triggers a subsequent load data event  *)
      let data = E.bind incr load_data in
      (* merge events together  *)
      E.select [data; other_than_incr] in

Here, some events (like Incr) can’t be handled immediately; instead I need to spawn a future to evaluate them, so I split out the event in question, do the bind, and merge the results back together. But now this imposes an “order” (maybe that’s the wrong choice of words) on the dependencies of my signals, because any event that produces Incr must be defined before the above snippet.

I guess you mean dependency order. So yes, if you have an event/signal that depends on another one, you have to define the latter first, like values in general in OCaml.

In your particular case I suggest handling the triggering of the asynchronous computation in the update function and feeding the result back into the FRP system by using a primitive event.

This leads to the following code:

open Brr
open Brr_note
open Note

let data_loader : unit -> [> `Data of int] E.t * (unit -> unit) =
fun () ->
  let ev, send = E.create () in
  let load () = let fut = Fut.return (`Data 1) in Fut.await fut send in
  ev, load

type state = { data_count : int }
let initial_state = { data_count = 0 }
let update load_data action state = match action with
| `Data v -> { data_count = state.data_count + v }
| `Reset -> { data_count = 0 }
| `Incr -> load_data (); state
| _ -> state

let build_ui_component (count: int signal) tag =
  let button = El.button [] in
  let ev = Evr.on_el Ev.click (fun _ -> tag) button in
  El.div [ button ], ev

let application =
  let data_loaded, load_data = data_loader () in
  let def state =
    let count = S.map ~eq:Int.equal (fun v -> v.data_count) state in
    let incr, ev1 = build_ui_component count `Incr in
    let reset, ev2 = build_ui_component count `Reset in
    let action = E.select [ev1; ev2; data_loaded] in
    let update = E.map (update load_data) action in
    let state' = S.accum (S.value state) update in
    state', (state', [incr; reset])
  in
  S.fix initial_state def

Ah, I see, thanks, that makes sense. Yes, this solves the fragile dependency order issue I was running into.

I’m not sure what was “fragile”. One of the curses and virtues of FRP is that it makes all your data dependencies explicit. But you could also likely have expressed the dependency directly on the Incr event, rather than performing this elaborate multiplexing and demultiplexing event surgery.
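Something along these lines, reusing the names of your example (an untested sketch):

let incr_el, incr_clicked = build_ui_component count `Incr in
let reset_el, reset_clicked = build_ui_component count `Reset in
(* the asynchronous load is bound directly to the button's event... *)
let data_loaded = E.bind incr_clicked load_data in
(* ...and only the resulting events get merged *)
let all_events = E.select [data_loaded; reset_clicked] in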

It just felt cleaner to express the load as an external component of your fix point.

Note that in general one should think about FRP programs more in terms of the denotational semantics of events and signals (that is, pure values that vary over time) rather than in terms of “event handling”, “loop”, update cycle or dependency graph. But it clearly requires a bit of mind twisting; data flow programming is really a different kind of programming (which is why I think most people give up on it).


Okay, I have a set of questions here, as my experience with React plus my interpretation of the ideals of pure FRP contradict some of the things happening in this code example.

  1. In your suggested change to the code, you update a primitive event from within the FRP graph - isn’t this breaking the guarantees of FRP? … Though I know why it works.
  2. You use S.value - is this now safe in Note?
  3. I know that full-program recursion is a need for many programs - especially GUIs - but isn’t pure FRP in essence incompatible with full-program recursion? (Else you just end up writing the whole program unlifted from the FRP monad - which is just a game loop.)
  4. Isn’t it pretty inefficient to construct the full event graph on each S.fix iteration, making it kind of like shooting oneself in the foot - unless you don’t need high performance at all? Also, one loses the possibility of folding over events, unless you save these inside the initial state.

This should be double-checked, but I don’t think it does. The fact that the call leads to a Fut.await to send the primitive event should ensure that the primitive occurrence only occurs, at the earliest, in the next JavaScript event loop iteration. So while it may seem so, this does not directly recurse into the FRP graph.

However, that’s still performing a side effect from the FRP graph; maybe it would have been slightly cleaner to use a Note logger to do it (with likely further recursive complications :–)

Yes. That’s one of the advantages.

I didn’t understand that comment.

I’m not sure what you meant by that.

The event structure and DOM element composition are purely static in this example. The def function only gets called once to define the fix point; it’s not called at each “S.fix iteration”.

I’d like to replace these fix point definitions with an infinitesimal delay combinator and lazy definitions at some point; I think they would make things clearer. (And this is one of the reasons Note is still not formally released.)


I’m curious about how this would work. I had just been using the Logr module for logging events for debugging, and for holding references to ensure that events/signals internal to a component would be updated, and I thought that these were the only applications. How can they be used in this situation?

I don’t know, @rand’s comments just made me think about that. I would have to try, but I’m busy with other things now.

In general a Note logger is the mechanism to use to observe data from signals/events and use it as input for other non-FRP idioms and, in particular, for performing side effects.

This allows keeping the FRP system semantically clean. React, under the influence of FrTime, advocated the use of effectful signals and events, but depending on your effects, that breaks equational reasoning and doesn’t blend well with the synchrony hypothesis (everything in an update step happens simultaneously).

For example, here in Brr_note they are used to effect signal/event changes on the imperative DOM.
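Roughly (untested, and the SaveProject event is only for illustration), using E.log and Logr.hold:

let log_saves (save_requested : [> `SaveProject of string ] event) =
  E.log save_requested (function
  | `SaveProject name -> Console.(log [str "saving project"; str name])
  | _ -> ())

(* E.log returns a Logr.t option; hold it for as long as the effect should
   keep happening, e.g. Option.iter Logr.hold (log_saves save_evt) *)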


This was what I was thinking about. FRP gives you a limited but useful amount of expressivity, while at the same time guaranteeing some amount of safety. If you then communicate via effects to the FRP graph from within the graph, I guess one could think of the safety guarantees of FRP as lost? Though the pragmatic argument against this is: “I know what I’m doing”. Another thing you lose is the ability to translate the FRP graph directly to e.g. a pure language.

What I meant here with “full program recursion” is e.g. when you have a model that is updated by your GUI, and your GUI depends on your model. Do you know if this has been solved with pure FRP (i.e. without effects) - e.g. in Haskell?

Ah - a longtime misunderstanding of S.fix on my part. I never used it in my granular video synth codebase, but now I think I have a specific use case for it.

Using OCaml’s Lazy.t? It would be interesting to see some pseudocode for this interface.

As long as your effects

  1. Do not trigger new instants (update cycle) from within an instant (i.e. do not update a primitive event/signal directly).
  2. Do not need to be ordered relative to each other (i.e. do not rely on the order of updates in the update cycle).
  3. Do not need to have their lifetime tightly controlled (i.e. you don’t mind if a few more effects occur before GC).

That’s kind of OK. The problem is that with React, people got in trouble with exactly all of these points :–)

So I thought it was better to provide a formal structure to interface with the outputs of the FRP system; in general you need 3. anyway, because your effects deal with resources.

So the idea is just to use FRP to express and solve complex logical temporal problems and have your effects clearly segregated from it.

I think that, in the abstract, fix points (i.e. your signal depends on the value it had an infinitesimal amount of time before) solve this. But usually you eventually need to interface with the world and have an effect (e.g. to render).

Something like

val S.delay : 'a -> 'a t Lazy.t -> 'a t

let rec state =
  let next = lazy (S.accum (S.value state) update) in
  S.delay initial_state next

In fact that looks wrong, it won’t pass the statically constructive test :–)

Ah yes - if one defines a representation of the UI in the render phase that is returned in S.fix, then one can map the raw user-input events to find which UI element was interacted with, which then maps to model-update events.

Is the reason that the lazy ( … ) form doesn’t pass the statically constructive test that the lazy keyword needs to wrap the final expression?

I like the aesthetics of using recursive values and lazy - but on the other hand, the signature of S.delay doesn’t communicate how to use it. Though one could say that this was already the case with S.fix, as one had to do a manual check to avoid divergence.

Yes, or guarded by some constructor.

It will be difficult. In any case the idea is to have what fix gives you, i.e. make a signal depend on the value it itself had an infinitesimal amount of time ago (which is the signal you receive as an argument in S.fix).

I don’t have these things in my head, but this revolves around either providing a prev combinator that gives you the value of its argument at t-dt, or a delay combinator that says this will be the value of the signal at t+dt.
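Very roughly, and purely hypothetically (none of this exists in Note today), the signatures would be along these lines:

val prev : 'a -> 'a signal -> 'a signal
(* [prev i s] is [i] at creation time and, afterwards, the value [s] had an
   infinitesimal amount of time (dt) before *)

val delay : 'a -> 'a signal Lazy.t -> 'a signal
(* [delay i s] is [i] until the first instant and then the value of
   [Lazy.force s] delayed by dt; the laziness is what would allow recursive
   definitions *)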