 

Angular HttpInterceptor for caching parallel request by sharing the observable

I want to cache parallel HTTP requests by sharing the observable, and also cache the response in a Map object.

demo online

caching-interceptor.service.ts

import { HttpEvent, HttpHandler, HttpInterceptor, HttpRequest, HttpResponse } from '@angular/common/http';
import { Injectable } from '@angular/core';
import { Observable, of } from 'rxjs';
import { tap, finalize, share } from 'rxjs/operators';


@Injectable()
export class CachingInterceptorService implements HttpInterceptor {

  public readonly store = new Map<string, HttpResponse<any>>();
  public readonly queue = new Map<string, Observable<HttpEvent<any>>>();

  constructor() {}

  intercept(req: HttpRequest<any>, next: HttpHandler): Observable<HttpEvent<any>> {

    // Don't cache if it's not cacheable
    if ( req.method !== 'GET' ) {
      return next.handle(req);
    }

    // Check whether there is a pending observable for this request
    const cachedObservable: Observable<HttpEvent<any>> = this.queue.get(req.urlWithParams);
    if ( cachedObservable ) {
      console.info('Observable cached');
      return cachedObservable;
    }

    // Check whether there is a cached response for this request
    const cachedResponse: HttpResponse<any> = this.store.get(req.urlWithParams);
    if (cachedResponse) {
      console.info('Response cached');
      return of(cachedResponse.clone());
    }

    // If the request is going through for the first time,
    // let it proceed and cache the response
    console.info('Request execute');
    const shared = next.handle(req).pipe(
      tap(event => {
        if (event instanceof HttpResponse) {
          console.info('Response reached');
          this.store.set(req.urlWithParams, event.clone());
        }
      }),
      finalize(() => {
        // delete pending request
        this.queue.delete(req.urlWithParams);
      }),
      share()
    );

    // add the pending request to the queue so parallel requests share it
    this.queue.set(req.urlWithParams, shared);

    return shared;
  }
}

Is this implementation of observable caching correct?

I have one doubt: what happens if the observable is deleted from the queue in the finalize of the request while someone is still subscribed to it?

Side note: this is just an example and doesn't implement cache expiry/invalidation.
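The same parallel-request deduplication can be sketched outside Angular, using Promises in place of observables (all names here are hypothetical, for illustration only). It also shows why deleting the pending entry on completion is safe: callers that already hold the shared Promise (or, in the RxJS version, an active subscription to the shared observable) keep it; removing the Map entry only stops later callers from joining the in-flight request.

```typescript
// Hypothetical sketch of in-flight request deduplication plus a response cache.
type Fetcher = (url: string) => Promise<string>;

class RequestCache {
  private readonly store = new Map<string, string>();            // completed responses
  private readonly pending = new Map<string, Promise<string>>(); // in-flight requests

  constructor(private readonly fetcher: Fetcher) {}

  get(url: string): Promise<string> {
    const cached = this.store.get(url);
    if (cached !== undefined) {
      return Promise.resolve(cached); // response already cached
    }
    const inFlight = this.pending.get(url);
    if (inFlight) {
      return inFlight; // parallel caller shares the pending request
    }
    const request = this.fetcher(url)
      .then((res) => {
        this.store.set(url, res); // cache the response
        return res;
      })
      .finally(() => {
        // like finalize(): drop the pending entry; existing holders
        // of `request` are unaffected
        this.pending.delete(url);
      });
    this.pending.set(url, request);
    return request;
  }
}
```

With this structure, two synchronous calls to `get('/a')` trigger the underlying fetcher only once, and any call after completion is served from `store`.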

Simone Nigro asked Aug 07 '20 14:08


1 Answer

Your implementation works fine, but I think it can be made much simpler.

import { HttpEvent, HttpHandler, HttpInterceptor, HttpRequest, HttpResponse } from '@angular/common/http';
import { Injectable } from '@angular/core';
import { Observable } from 'rxjs';
import { shareReplay, first, filter } from 'rxjs/operators';


@Injectable()
export class CachingInterceptorService implements HttpInterceptor {

  public readonly store: Record<string, Observable<HttpEvent<any>>> = {};

  constructor() {}

  intercept(req: HttpRequest<any>, next: HttpHandler): Observable<HttpEvent<any>> {

    // Don't cache if it's not cacheable
    if ( req.method !== 'GET' ) {
      return next.handle(req);
    }

    // Check if observable is in cache, otherwise call next.handle
    const cachedObservable = this.store[req.urlWithParams] ||
      ( this.store[req.urlWithParams] = next.handle(req).pipe(
          // Filter since we are interested in caching the response only, not progress events
          filter((res) => res instanceof HttpResponse ),
          // Share replay will cache the response
          shareReplay(1),
      ));
    // Pipe first() so the observable completes after it emits the response.
    // This mimics the behaviour of observables returned by Angular's httpClient.get()
    // and also makes toPromise() work, since toPromise waits until the observable completes.
    return cachedObservable.pipe(first());
  }
}
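For completeness, either version of the interceptor still has to be registered against Angular's `HTTP_INTERCEPTORS` token, typically in the root module. A minimal sketch (the module name and file path are assumptions):

```typescript
import { NgModule } from '@angular/core';
import { HttpClientModule, HTTP_INTERCEPTORS } from '@angular/common/http';

import { CachingInterceptorService } from './caching-interceptor.service';

@NgModule({
  imports: [HttpClientModule],
  providers: [
    // multi: true appends this interceptor to the chain
    // instead of replacing any interceptors already registered
    { provide: HTTP_INTERCEPTORS, useClass: CachingInterceptorService, multi: true },
  ],
})
export class AppModule {}
```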

Check my Stackblitz fork

Sherif Elmetainy answered Oct 28 '22 09:10