
RxJava. Sequential execution

In my Android app I have a presenter which handles user interactions, contains a kind of request manager and, if needed, passes user input on to that request manager.

The request manager itself contains the server API and handles server requests using RxJava.

I have code which sends a request to the server every time the user enters a message and shows the response from the server:

private Observable<List<Answer>> sendRequest(String input) {
    MyRequest request = new MyRequest();
    request.setInput(input);
    return Observable.fromCallable(() -> serverApi.process(request))
            .doOnNext(myResponse -> {
                // store some data
            })
            .map(MyResponse::getAnswers)
            .subscribeOn(Schedulers.newThread())
            .observeOn(AndroidSchedulers.mainThread());
}

However, now I need some kind of queue. The user may send a new message before the server has responded. Each message from the queue should be processed sequentially, i.e. the second message is sent only after we've got the response to the first message, and so on.

In case an error occurs, no further requests should be handled.

I also need to display the answers within a RecyclerView.

I have no idea how to change the code above to achieve the handling described.

I see one particular problem: on the one hand the queue can be updated by the user at any time, on the other hand a message should be removed from the queue whenever the server sends a response.

Maybe there is an RxJava operator or a special approach I just missed.

I saw a similar answer here, however the "queue" there is constant: Making N sequential api calls using RxJava and Retrofit

I'll be very thankful for any solution or link.

Asked Nov 13 '17 by Tima


4 Answers

I didn't find any elegant native RxJava solution, so I'll use a custom Subscriber to do the work.

For your 3 points:

  1. For sequential execution, we create a single-threaded scheduler:

    Scheduler sequential = Schedulers.from(Executors.newFixedThreadPool(1));

  2. To stop all requests when an error occurs, we should subscribe to all requests together instead of creating a new Flowable every time. So we define the following functions (here the request is an Integer and the response a String):

    void sendRequest(Integer request)

    Flowable<String> reciveResponse()

    and define a field that connects the request and response flows:

    FlowableProcessor<Integer> requestQueue = UnicastProcessor.create();

  3. To re-run the requests that haven't been sent yet, we define a rerun function:

    void rerun()

Then we can use it:

reciveResponse().subscribe(/**your subscriber**/)

Now let us implement them.

When sending a request, we simply push it into requestQueue:

public void sendRequest(Integer request) {
  requestQueue.onNext(request);
}

First, to process the requests sequentially, we schedule the work onto sequential:

requestQueue
  .observeOn(sequential)
  .map(i -> mockLongTimeRequest(i)) // mock for your serverApi.process
  .observeOn(AndroidSchedulers.mainThread());

Second, to stop requests when an error occurs: this is the default behavior. If we do nothing, an error will break the subscription and no further items will be emitted.
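
As a quick standalone illustration of that default (not part of the answer's pipeline): once onError fires, the subscription terminates and later items are never delivered.

Flowable.just(1, 2, 3, 4)
        .map(i -> {
            if (i == 3) throw new RuntimeException("boom"); // error in the middle of the stream
            return i;
        })
        .subscribe(
                i -> System.out.println("got " + i),     // prints 1 and 2
                e -> System.out.println("error: " + e)); // then the error; 4 is never emitted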

Third, to re-run the requests that haven't been sent yet. The problem is that the native operators cancel the stream on error, as MapSubscriber does (RxJava-2.1.0-FlowableMap#63):

try {
    v = ObjectHelper.requireNonNull(mapper.apply(t), "The mapper function returned a null value.");
} catch (Throwable ex) {
    fail(ex);// fail will call cancel
    return;
}

We should therefore wrap the error. Here I use my Try class to wrap the possible exception; you can use any other implementation that wraps the exception instead of throwing it:

    .map(i -> Try.to(() -> mockLongTimeRequest(i)))
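
The Try class itself isn't reproduced in the answer. A minimal sketch of such a wrapper (a hypothetical stand-in; the author's actual class on GitHub may differ, and any Either/Result-style type that captures the exception works) could look like this:

import java.util.concurrent.Callable;

final class Try<T> {
    private final T value;
    private final Throwable error;

    private Try(T value, Throwable error) { this.value = value; this.error = error; }

    // run the callable and capture a possible exception instead of throwing it
    static <T> Try<T> to(Callable<T> callable) {
        try {
            return new Try<>(callable.call(), null);  // success
        } catch (Throwable e) {
            return new Try<>(null, e);                // failure, exception is stored
        }
    }

    boolean isFailure() { return error != null; }
    T get()             { return value; }
    Throwable error()   { return error; }
}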

Then comes the custom OnErrorStopSubscriber, which implements Subscriber<Try<T>> and Subscription.

It requests and emits items normally. When an error occurs (in fact, when a failed Try is emitted), it stops there and won't request or emit anything, even if downstream requests more. After the rerun method is called, it goes back to the running state and emits normally again. The class is about 80 lines; you can see the code on my GitHub.
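
As a rough idea only, a much-simplified, hypothetical sketch of a subscriber with that behavior (not the author's actual ~80-line class, which also implements Subscription; the Consumer callback and the Try accessors come from the stand-in sketched above):

import java.util.function.Consumer;

import org.reactivestreams.Subscriber;
import org.reactivestreams.Subscription;

final class OnErrorStopSubscriber<T> implements Subscriber<Try<T>> {

    private final Consumer<T> onSuccess;   // what to do with a successful response
    private volatile Subscription upstream;
    private volatile boolean stopped;

    OnErrorStopSubscriber(Consumer<T> onSuccess) {
        this.onSuccess = onSuccess;
    }

    @Override
    public void onSubscribe(Subscription s) {
        upstream = s;
        s.request(1); // pull the requests one at a time
    }

    @Override
    public void onNext(Try<T> item) {
        if (item.isFailure()) {
            stopped = true; // stop here, but keep the subscription alive
            System.out.println("error happen: " + item.error());
            return;         // do not request more until rerun() is called
        }
        onSuccess.accept(item.get());
        upstream.request(1);
    }

    @Override
    public void onError(Throwable t) {
        t.printStackTrace();
    }

    @Override
    public void onComplete() {
        // nothing to do
    }

    // continue with the not-yet-sent requests after an error
    public void rerun() {
        if (stopped) {
            stopped = false;
            upstream.request(1);
        }
    }
}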

Now we can test our code:

public static void main(String[] args) throws InterruptedException {
  Q47264933 q = new Q47264933();
  IntStream.range(1, 10).forEach(i -> q.sendRequest(i));// emit requests 1 to 9
  q.reciveResponse().subscribe(e -> System.out.println("\tdo for: " + e));
  Thread.sleep(10000);
  q.rerun(); // re-run after 10s
  Thread.sleep(10000);// wait for completion because the worker thread is a daemon thread
}

private String mockLongTimeRequest(int i) throws InterruptedException {
  Thread.sleep((long) (1000 * Math.random()));
  if (i == 5) {
    throw new RuntimeException(); // error occurs for request 5
  }
  }
  return Integer.toString(i);
}

and output:

1 start at:129
1 done  at:948
2 start at:950
    do for: 1
2 done  at:1383
3 start at:1383
    do for: 2
3 done  at:1778
4 start at:1778
    do for: 3
4 done  at:2397
5 start at:2397
    do for: 4
error happen: java.lang.RuntimeException
6 start at:10129
6 done  at:10253
7 start at:10253
    do for: 6
7 done  at:10415
8 start at:10415
    do for: 7
8 done  at:10874
9 start at:10874
    do for: 8
9 done  at:11544
    do for: 9

You can see it runs sequentially and stops when the error occurs. After the rerun method is called, it continues handling the remaining requests that haven't been sent yet.

For the complete code, see my GitHub.

Answered Nov 03 '22 by Dean Xu


For this kind of behaviour I use Flowable's backpressure support. Create an outer stream that is the parent of your API request stream, flatMap the API request with maxConcurrency = 1, and use some sort of buffer strategy so your Flowable doesn't throw an exception.

Flowable.create(emitter -> {/* user input stream */}, BackpressureStrategy.BUFFER)
        .onBackpressureBuffer(127, // buffer size
                () -> {/* overflow action */},
                BackpressureOverflowStrategy.DROP_LATEST) // what to do when the buffer exceeds 127
        .flatMap(request -> sendRequest(request), 1) // maxConcurrency = 1, the very important parameter
        .subscribe(results -> {
            // work with results
        }, error -> {
            // work with errors
        });

It will buffer user input up to the given threshold and then drop it (if you don't do this it will throw an exception, but it is highly unlikely that the user will exceed such a buffer), and it will execute the requests sequentially, one by one, like a queue. Don't try to implement this behaviour yourself when there are operators for exactly this in the library itself.

Oh, I forgot to mention: your sendRequest() method must return a Flowable, or you can convert it to a Flowable.
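
For example, assuming sendRequest() is kept as in the question (returning Observable<List<Answer>>), the conversion can happen right inside the flatMap:

.flatMap(request -> sendRequest(request).toFlowable(BackpressureStrategy.BUFFER), 1)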

Hope this helps!

Answered Nov 03 '22 by Tuby


My solution would be as follows (I did something similar in Swift before):

  1. You will need a wrapper interface (let's call it "Event") for both requests and responses.
  2. You will need a state object (let's make it class "State") that contains the request queue and the latest server response, plus a method that accepts an "Event" as a parameter and returns 'this'.
  3. Your main processing chain will look like Observable<State> state = Observable.merge(serverResponsesMappedToEventObservable, requestsMappedToEventObservable).scan(new State(), (state, event) -> state.apply(event)).
  4. Both parameters of the .merge() method will probably be Subjects.
  5. Queue processing happens in the single method of the "State" object (pick and send a request from the queue on any event, add to the queue on a request event, update the latest response on a response event); see the sketch after this list.
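
A minimal sketch of that design, assuming RxJava 2; every name here (Event, RequestEvent, ResponseEvent, State, StateMachine, the commented-out sendToServer call) is made up for illustration, and the actual server call is left out:

import java.util.ArrayDeque;
import java.util.Deque;

import io.reactivex.Observable;
import io.reactivex.subjects.PublishSubject;

interface Event {}

final class RequestEvent implements Event {
    final String message;
    RequestEvent(String message) { this.message = message; }
}

final class ResponseEvent implements Event {
    final String answer;
    ResponseEvent(String answer) { this.answer = answer; }
}

final class State {
    final Deque<String> queue = new ArrayDeque<>(); // pending user messages
    String latestAnswer;                            // latest server response
    boolean requestInFlight;                        // is a request currently on the wire?

    State apply(Event event) {
        if (event instanceof RequestEvent) {
            queue.addLast(((RequestEvent) event).message); // add to queue on request event
        } else if (event instanceof ResponseEvent) {
            latestAnswer = ((ResponseEvent) event).answer; // update latest response
            requestInFlight = false;
            queue.pollFirst();                             // answered message leaves the queue
        }
        if (!requestInFlight && !queue.isEmpty()) {
            requestInFlight = true;
            // sendToServer(queue.peekFirst()); // hypothetical: kick off the next request here
        }
        return this;
    }
}

class StateMachine {
    final PublishSubject<Event> requests = PublishSubject.create();  // user input mapped to events
    final PublishSubject<Event> responses = PublishSubject.create(); // server responses mapped to events

    Observable<State> state() {
        return Observable.merge(requests, responses)
                .scan(new State(), (state, event) -> state.apply(event));
    }
}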

Answered Nov 03 '22 by Maxim Volgin


I suggest creating asynchronous observable methods; here is a sample:

public Observable<Integer> sendRequest(int x){
    return Observable.defer(() -> {
        System.out.println("Sending Request : you get Here X ");
        return storeYourData(x);
    });
}

public Observable<Integer> storeYourData(int x){
    return Observable.defer(() -> {
        System.out.println("X Stored : "+x);
        return readAnswers(x);
    }).doOnError(this::handlingStoreErrors);
}

public Observable<Integer> readAnswers(int h){
    return Observable.just(h);
}

public void handlingStoreErrors(Throwable throwable){
        //Handle Your Exception.
}

The first observable sends the request; when it gets the response, it proceeds to the second one, and you can keep chaining. You can customize each method to handle errors or success. This sample behaves like a queue.

Here is the result of the execution:

for (int i = 0; i < 1000; i++) {
    rx.sendRequest(i).subscribe(integer -> System.out.println(integer));
}
Sending Request : you get Here X 
X Stored : 0
0
Sending Request : you get Here X 
X Stored : 1
1
Sending Request : you get Here X 
X Stored : 2
2
Sending Request : you get Here X 
X Stored : 3
3
.
.
.
Sending Request : you get Here X 
X Stored : 996
996
Sending Request : you get Here X 
X Stored : 997
997
Sending Request : you get Here X 
X Stored : 998
998
Sending Request : you get Here X 
X Stored : 999
999

Answered Nov 03 '22 by Elyes