Any lib like Async.Deferred.List.iteri with `Parallel in the C++ world?

I need to develop a simple performance-testing tool for internal use. What I need is to send a bunch of HTTP/TCP requests at fixed intervals, without waiting for the previous request's response.

Since most of my colleagues write C++, I developed the prototype using C++ (pseudocode):

for (int i = 0; i < BIG_NUM; i++) {
    sleep(TINY_TIME);
    int ret = send_req();  // blocks until the response arrives
    log(ret);
}

With this approach, every iteration of the for loop has to wait for the response from the target before it can send the next request.

So I developed another prototype using OCaml & Async, and it works as expected:

open Core
open Async
open Cohttp
open Cohttp_async

module Log  = Async.Log

let mock_lst = [111; 222; 333; 444; 555; 666]

let logger : Log.t = Log.create ~level:`Debug
    ~output:[(Log.Output.file `Text ~filename:"log.txt")] ~on_error:`Raise

let seq_send (url:string) (param:string) : unit Deferred.t  =
  Log.debug logger "%s\n" (param^" start!");
  let uri = Uri.of_string (url^param) in
  let%bind _, body = Cohttp_async.Client.get uri in
  let%bind body = Body.to_string body in
  Log.debug logger "%s\n" body; return ()

let ord_send_iter (lst:int list) : unit Deferred.t =
  Deferred.List.iteri ~how:`Parallel lst
    ~f:(fun idx ele ->
        after (sec (float_of_int idx)) >>= fun () ->
        (seq_send "http://127.0.0.1:9001/?id=" (string_of_int ele)))

let () =
  Command.async
    ~summary:"test"
    Command.Spec.(
      empty
    )
    (fun () -> ord_send_iter mock_lst)
  |> Command.run

I was planning to post this question in some C++ forum, but I am not sure they would be familiar with both OCaml and Async.

I was wondering whether anybody who knows both C++ and OCaml would like to give me some advice?


After doing some research on the future library, I think I've got the answer:

// g++ t.cpp -o t -std=c++11
#include <iostream>
#include <vector>
#include <thread>
#include <random>
#include <fstream>
#include <chrono>
#include <future>
#include <ctime>
#include <cstdlib>
#include <mutex>

using namespace std;

ofstream fout("log.txt");
mutex io_mutex;  // a single mutex guards every write to fout

int send_req(int m) {
    this_thread::sleep_for(chrono::seconds(m));
    std::time_t result = time(nullptr);
    {   // scope the lock so it is released before the blocking curl call;
        // otherwise every thread would serialize on the request itself
        lock_guard<mutex> lock(io_mutex);
        fout << m << " sent at " << asctime(std::localtime(&result)) << endl;
    }
    int ret = system("curl \"http://localhost:9001/?id=1\"");
    (void)ret;
    std::time_t result2 = time(nullptr);
    {
        lock_guard<mutex> lock(io_mutex);
        fout << m << " done with " << asctime(std::localtime(&result2)) << endl;
    }
    return m;
}

int main(int argc, char** argv) {
    vector<future<int> > futs;

    for (int i=0; i<10; i++) {
        futs.push_back(async(launch::async, send_req, i));  // force a new thread per request
    }

    for (auto &e : futs) {
        e.get();
    }

    return 0;
}

Along with the testing web server I found on another site:

import tornado.ioloop
import tornado.web
import random
import tornado.gen

class DefaultHandler(tornado.web.RequestHandler):
    @tornado.web.asynchronous
    @tornado.gen.engine
    def get(self):
        id = self.get_query_argument("id", "1")
        sleepy = 2.0 * (random.random())
        self.write(id + " delay.... " + str(sleepy))
        yield tornado.gen.sleep(sleepy)
        self.finish()


def make_app():
    return tornado.web.Application([
        (r"/", DefaultHandler),
    ])

if __name__ == "__main__":
    app = make_app()
    app.listen(9001)
    tornado.ioloop.IOLoop.current().start()