I am working my way into the C++11 features, have come across move semantics, and am trying to apply them to every function that handles containers or other "bigger" objects. Now I have found some tasks that I would like to run in parallel, so I would use std::future, but these tasks handle containers (in my case they return a container). So I have this pseudo code:
std::future<container&&> c = std::async([]()->container&&{ /* stuff return a local container object */ });
And now I ask myself: how is the lifetime of a container rvalue reference controlled? If I understand correctly, when the task finishes before I call c.get(), the result is stored. Will the stored value still refer to a usable object?
Does this ensure its lifetime?
std::future<container> c = std::async([]()->container&&{ /* same stuff -- ^ -- */ });
container cc = std::move(c.get());
It looks as though you're doing move semantics wrong.
You should return by value, not by rvalue reference, and let the move constructor ensure that returning by value is efficient. Otherwise you risk returning a dangling reference to an object that no longer exists. The point of move semantics is to make passing objects by value cheap; rvalue references are just the language feature that enables that. The goal should not be to use rvalue references for their own sake.
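To illustrate (a minimal sketch; make_data and the use of std::vector<int> as the container type are placeholders, not from the question): returning by value lets the move constructor (or copy elision) transfer the data cheaply and safely, whereas returning a reference to the local object would dangle.
#include <vector>

std::vector<int> make_data()
{
    std::vector<int> local{1, 2, 3};  // local object owning its buffer
    return local;                     // moved (or elided) into the caller's value
}
// Returning container&& here would refer to `local`, which is destroyed
// when make_data() returns.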
In other words, you want to move the data from the lambda body to the lambda's return value, and from there to the future's stored value. That moves the data. You don't want to pass around a reference; that doesn't move anything (and you could already pass things by reference in C++03 using lvalue references!)
Your lambda should return by value, and the future should store by value:
std::future<container> c = std::async([]()->container{ /* stuff */ });
And you don't need to use std::move; unique futures return the stored value as an rvalue, so you can move from it automatically, without using std::move to cast it to an rvalue:
container cc = c.get(); // cc will be move constructed
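Putting it together, here is a self-contained sketch, assuming for illustration that container is std::vector<int>:
#include <future>
#include <vector>

int main()
{
    using container = std::vector<int>;

    std::future<container> c = std::async([]() -> container {
        container local(1000, 42);  // build the result locally
        return local;               // moved (or elided) into the future's shared state
    });

    container cc = c.get();         // get() returns an rvalue; cc is move-constructed
}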