
ecorm commented on August 26, 2024

boost::future might not be viable because of the issues raised in crossbario/autobahn-cpp#42 and crossbario/autobahn-cpp#43. In particular, continuations don't run in the main thread.

from cppwamp.

taion commented on August 26, 2024

IMO there's a fundamental problem with using futures: in principle, one could attach multiple continuations to the same future (I don't mean chaining), in which case the right thing to do might actually be to run each continuation in a separate thread, for example.

It's not like the neat cases with callbacks and coroutines where there is exactly one unique continuation. You could of course add a special future class that essentially just offers sugar for attaching additional callbacks, but it will likely break (or at least be inconsistent) down the road when/if Asio adds proper support for then-able futures.

FWIW, this only applies to the experimental then-able futures that AutobahnCpp uses. Plain vanilla futures don't have these problems. They're annoying to use with Asio, though: http://www.boost.org/doc/libs/1_58_0/doc/html/boost_asio/example/cpp11/futures/daytime_client.cpp


ecorm commented on August 26, 2024

The experimental nature of boost::future continuations is one of the primary motivators that led me to write my own library based on callbacks.

There's a boost::future::then overload that accepts a generic object called an Executor. I'm considering writing my own Executor that would execute continuations via boost::asio::io_service::post. Hopefully, that would solve the issues you raised for Autobahn|Cpp. This custom executor would be part of a FutuSession mixin that would be layered on top of Session. This future-based API would be considered experimental while continuations and executors retain their experimental status in Boost.
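The idea behind such a posting executor can be approximated without Boost. Below is a minimal sketch (MainLoopExecutor and its method names are illustrative, not cppwamp's actual API): submit() only enqueues the continuation, and the owning thread drains the queue, the same way io_service::post and poll would.

```cpp
#include <cstddef>
#include <deque>
#include <functional>
#include <mutex>

// Sketch of a posting executor (names are illustrative, not cppwamp's API).
// submit() merely enqueues the continuation; nothing runs until the owning
// thread drains the queue, mirroring boost::asio::io_service::post and poll.
class MainLoopExecutor {
public:
    // Queue a continuation instead of running it inline.
    void submit(std::function<void()> work) {
        std::lock_guard<std::mutex> lock(mutex_);
        queue_.push_back(std::move(work));
    }

    // Drain pending continuations on the calling ("main") thread.
    // Returns the number of continuations executed.
    std::size_t poll() {
        std::size_t count = 0;
        for (;;) {
            std::function<void()> work;
            {
                std::lock_guard<std::mutex> lock(mutex_);
                if (queue_.empty())
                    return count;
                work = std::move(queue_.front());
                queue_.pop_front();
            }
            work();  // Runs on the draining thread, not the submitter's.
            ++count;
        }
    }

private:
    std::mutex mutex_;
    std::deque<std::function<void()>> queue_;
};
```

A continuation submitted from any thread then executes only when the main thread calls poll(), which is the single-threaded behavior the issues above ask for.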


taion commented on August 26, 2024

I think that's the most straightforward way to do it with the Boost continuations. It's painful to use, though. You always have to pass in the executor, and then you're going through multiple thread boundaries... essentially running the continuation on a new thread, just to post something into a queue for the main thread to run.

Might be worth explaining the implications in documentation at least. It'd certainly be nice to have a working future-based API, but it will likely be the wrong thing to use unless the user has a very strong preference for futures over callbacks or coroutines.

This was my primary motivation to stop using AutobahnCpp, BTW. I could have made my own executor or else just called io_service::post in the continuation, but I couldn't figure out a way to use futures that didn't involve using extra threads for no good reason.


ecorm commented on August 26, 2024

but I couldn't figure out a way to use futures that didn't involve using extra threads for no good reason.

I'm currently hitting the same roadblock while experimenting with futures. It seems that continuations can only be "triggered" by future::get or future::wait, which ends up blocking the main thread. I can no longer call io_service::poll, which deadlocks the entire single-threaded app.

You always have to pass in the executor, and then you're going through multiple thread boundaries... essentially running the continuation on a new thread, just to post something into a queue for the main thread to run.

It wouldn't be necessary to spawn a new thread for each continuation. I think it should be workable with only two threads. The main thread would basically just do:

  1. Spawn a worker thread that does io_service.run().
  2. Issue the first async connect operation, and chain in the continuations.
  3. Call future::wait on the "last" continuation.
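The three steps above can be sketched in standard C++ (a stand-in for the Boost.Asio version, with no Boost dependency; the promise plays the role of the async connect completing on the worker thread):

```cpp
#include <future>
#include <string>
#include <thread>

// Sketch of the two-thread scheme: the worker thread stands in for
// io_service.run(), and the main thread blocks on the last link of the
// chain -- no thread-per-continuation needed.
std::string runTwoThreadDemo() {
    std::promise<std::string> connected;
    std::future<std::string> connectFuture = connected.get_future();

    // Step 1: worker thread plays the role of io_service.run().
    std::thread worker([&connected] {
        // Step 2's async operation "completes" here.
        connected.set_value("connected");
    });

    // Step 3: main thread waits on the "last" continuation in the chain.
    std::string result = connectFuture.get() + "/subscribed";
    worker.join();
    return result;
}
```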

You're right about the thread boundary crossings, though.

It'd certainly be nice to have a working future-based API, but it will likely be the wrong thing to use unless the user has a very strong preference for futures over callbacks or coroutines.

Yeah, I currently don't see what the appeal is for futures. I'm having difficulty imagining how you're supposed to implement non-sequential branching logic with them. On the other hand, with coroutines, it's almost like writing regular procedural-style code!


taion commented on August 26, 2024

I'm not sure I understand what you're saying. If you do future::then with launch::deferred, then your continuations won't run until you try to wait on or get them, but otherwise they should run asynchronously right away. This is how all of the AutobahnCpp examples work.
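std::future lacks then(), but the launch-policy distinction described here is the same in standard C++ and can be checked directly: a deferred task runs only inside wait()/get(), while an async-launched one starts right away on another thread. A sketch:

```cpp
#include <cassert>
#include <future>

// A deferred task does not run until get()/wait() is called, and then it
// runs on the calling thread.
int runDeferred() {
    bool ran = false;
    auto f = std::async(std::launch::deferred, [&ran] { ran = true; return 1; });
    assert(!ran);      // nothing has executed yet
    int v = f.get();   // the task fires here, inside get()
    assert(ran);
    return v;
}

// With launch::async, the task is scheduled on its own thread immediately,
// regardless of when (or whether) anyone waits on the future.
int runAsync() {
    auto f = std::async(std::launch::async, [] { return 2; });
    return f.get();
}
```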

As for futures in general, there are use cases like having multiple continuations (not chained, potentially in parallel) hanging off the same future, and things like when_all and when_any that aren't as easy to express with callbacks or coroutines. Futures offer composability and let you avoid the pain of nesting callbacks, while still being more explicit than coroutines. I do think coroutines are the nicest way to write async code, but futures aren't too much worse.
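To show the shape of that composability, here is a hypothetical miniature of when_all-style aggregation (boost::when_all itself is still experimental; sumWhenAll is an illustrative name, not a library function): gather several independent futures and combine their results once all are ready.

```cpp
#include <future>
#include <vector>

// Hypothetical miniature of when_all-style composition: block until every
// future in the group is ready, then combine the results. Expressing this
// with nested callbacks would require manual bookkeeping of completion counts.
int sumWhenAll(std::vector<std::future<int>> futures) {
    int sum = 0;
    for (auto& f : futures)
        sum += f.get();  // get() blocks until each individual result is ready
    return sum;
}
```

Usage would look like pushing several std::async results into the vector and calling sumWhenAll once, instead of wiring a callback into each task.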


ecorm commented on August 26, 2024

This future stuff is new to me, so it seems I still have some misconceptions about them.


ecorm commented on August 26, 2024

It turns out I was forgetting to keep a returned future object alive, which seems like an easy mistake to make. This bug in my test program gave me the wrong impression of futures' behavior.
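For what it's worth, standard C++ makes the same mistake visible in a different way (an assumption for illustration; boost::future's destructor semantics differ in detail): the future returned by std::async must be stored, because a discarded temporary's destructor blocks until the task finishes, silently serializing the call. A minimal sketch of the correct pattern:

```cpp
#include <future>

// Keep the returned future alive in a named variable. Writing
// std::async(...) as a discarded temporary would block immediately in the
// temporary future's destructor, defeating the point of launching async.
int keepFutureAlive() {
    auto f = std::async(std::launch::async, [] { return 7; });
    // ...other work could proceed here while the task runs...
    return f.get();
}
```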

I managed to get a small FutuSession test program running, using a custom executor that posts every work task to io_service. Even though I use that custom executor in every continuation, my program still ends up spawning a separate worker thread. Strange...

When I get the chance, I'll create an experimental branch for the new FutuSession API, for anyone interested in taking a look.


ecorm commented on August 26, 2024

I get that same extra worker thread when I run coroutine examples. It turns out that it's created by ip::tcp::resolver::async_resolve on first use, to emulate asynchronous host resolution on Linux: http://www.boost.org/doc/libs/release/doc/html/boost_asio/overview/implementation.html#boost_asio.overview.implementation.linux_kernel_2_6


ecorm commented on August 26, 2024

This Boost.Thread bug makes it impossible to return futures from within continuations that were run with an executor. Not being able to return a future from within a continuation limits the usefulness of futures.


ecorm commented on August 26, 2024

Released first cut of the future-based API in this experimental branch: https://github.com/ecorm/cppwamp/tree/issue003-future-api


taion commented on August 26, 2024

That's quite cool. I didn't realize the executor's submit method only got invoked once the future result was ready. I must have misunderstood what was going on.


ecorm commented on August 26, 2024

I tried putting std::cout statements in AsioExecutor's methods and (except for the constructor/destructor) saw that only submit ever gets called. The other methods aren't really needed, but I left them there for the sake of completeness.


ecorm commented on August 26, 2024

If anyone really wants to use the FutuSession API, and is willing to put up with the experimental status of boost::future, please let me know. I can merge the FutuSession branch and possibly add a tutorial page.


taion commented on August 26, 2024

I suppose someone would have spoken up already if so. Maybe add a small note in the docs that experimental support is available?

I do think that given the way futures end up working under the hood right now, it would generally be a poor decision to use the future-based API rather than the coroutine-based one, but I already said I wasn't interested in using a future-based API (:


ecorm commented on August 26, 2024

I just read over in the Autobahn|Cpp issues that boost::asio::io_service will be made into an Executor. Perhaps I should wait until then.


ecorm commented on August 26, 2024

I'm holding off completing the Future-based API until Asio 1.11.0 is released as part of Boost.

Asio 1.11.0 overhauls io_service to make use of executors: http://think-async.com/asio/asio-1.11.0/doc/asio/history.html#asio.history.asio_1_11_0

I'll also be waiting for this Boost.Thread bug to be fixed: https://svn.boost.org/trac/boost/ticket/11192


jpetso commented on August 26, 2024

Yay, they fixed https://svn.boost.org/trac/boost/ticket/11192 — to be released with Boost 1.60.0 :)


ecorm commented on August 26, 2024

Asio was updated in the Boost 1.66 release. There's mention of executors in this new documentation page: http://www.boost.org/doc/libs/1_66_0/doc/html/boost_asio/net_ts.html

I'm too busy with work/life right now to investigate any deeper. Coroutines and the traditional callback interface have been working well for us so far at work; we don't use futures.


ecorm commented on August 26, 2024

Closing this in favor of #128, which will feature a unified API in Session that supports any Boost.Asio completion token.

