
How to use Promise.all to handle thousands of requests


In my React app I have a component that sends requests to an online service that can handle at most 50 requests at a time. I now need to look up 7000 MAC addresses.

function App() {
    const [data, setData] = useState([]);

    useEffect(() => {
        const fetchData = async () => {
            await axios.all([
                axios.get("/ipdn/<MAC ADDRESS>", { timeout: 10000 }),
                axios.get("/ipdn/<MAC ADDRESS>", { timeout: 10000 })
                // Adding all the MAC addresses .......
            ]).then((responseArr) => {
                setData(responseArr);
            });
        };
        fetchData();
    }, []);
    // ...
}

I would like to extend the fetchData function so that it sends only 50 requests at a time and waits until that batch is complete.

Once a batch is complete, the next 50 should be executed.

Thank you

asked Aug 27 '20 by angus


2 Answers

Without a library, you could use this function:

function poolPromises(iterPromises, poolSize) {
    return new Promise((resolve, reject) => {
        let promises = [];
        function nextPromise() {
            let { value, done } = iterPromises.next();
            if (done) {
                resolve(Promise.all(promises));
            } else {
                promises.push(value); // value is a promise
                value.then(nextPromise, reject);
            }
            return !done;
        }
        
        while (promises.length < poolSize && nextPromise()) { }
    });
}

This function takes promises from an iterator, up to the pool size. Whenever one of those promises resolves, it pulls the next promise from the iterator so the pool is full again. The pool therefore does not have to drain completely before the next chunk of promises is created: as soon as a spot frees up, it is used again.

It is important that the iterator only creates a next promise when one is pulled from it via the next() method.
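To illustrate that point, here is a sketch of my own (not part of the original answer), assuming axios is imported and a macAddresses array exists as in the question. The generator starts each request lazily, whereas an eagerly built array would fire all requests before the pool could limit anything:

import axios from "axios";

// Lazy: each axios call is made only when the pool pulls it via next().
function* lazyRequests(macAddresses) {
    for (const mac of macAddresses) {
        yield axios.get("/ipdn/" + mac, { timeout: 10000 });
    }
}

// Eager (defeats the pool): all requests would already be in flight here.
// const eagerRequests = macAddresses.map(mac => axios.get("/ipdn/" + mac));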

In your use case, you can call it as follows:

const fetchData = async () => {
    function * iterRequests() {
        for (let macAddress of macAddresses) {
            yield axios.get("/ipdn/" + macAddress, { timeout: 10000 });
        }
    }
    return poolPromises(iterRequests(), 50).then(setData);
}    

Note: fetchData does not have to be declared async, since there is no await in there.
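One caveat (my addition, not from the answer): because poolPromises forwards the first rejection and resolves via Promise.all, a single failed request rejects the whole thing. A minimal sketch of catching that at the call site, assuming fetchData is defined inside the effect as in the question:

useEffect(() => {
    fetchData().catch((error) => {
        // One failed request rejects the entire pool; handle it here,
        // e.g. log it or store it in an error state of your choosing.
        console.error("Fetching MAC data failed:", error);
    });
}, []);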

answered Sep 28 '22 by trincot


Here is how you can do it without any external libraries:

const ips = [
  /* List of MAC addresses. */
];

useEffect(() => {
  const fetchData = async () => {
    let loadedData = []; // must be let, it is reassigned below
    
    // Iterate over the slices of array until all the ips have been processed.
    for (const sliceIps of sliceGenerator(ips)) {
      const gettingData = sliceIps.map(getDataFromIp);
      const sliceLoadedData = await axios.all(gettingData);
      loadedData = loadedData.concat(sliceLoadedData);
    }
    setData(loadedData);
  };
  fetchData();
}, []);

const getDataFromIp = (ip) =>
  axios.get("/ipdn/" + ip, { timeout: 10000 });

// Generates a slice of an array, here the slice has a size of 50 max.
function* sliceGenerator(arr) {
  const sliceSize = 50;
  let i = 0;
  while (i < arr.length) {
    yield arr.slice(i, i + sliceSize);
    i += sliceSize;
  }
}

I'm using a generator, function* sliceGenerator, to produce the slices of the ips array. This way you process them in batches of 50.

I'm also using a for (... of ...) loop. It's very convenient because you can use the await keyword inside.
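As a quick illustration (my own example, not from the answer), here is the same generator with the slice size exposed as a parameter so it can be run on a small array:

function* sliceGenerator(arr, sliceSize = 2) {
  let i = 0;
  while (i < arr.length) {
    // Copy the next sliceSize elements without mutating the input array.
    yield arr.slice(i, i + sliceSize);
    i += sliceSize;
  }
}

console.log([...sliceGenerator(["a", "b", "c", "d", "e"])]);
// => [ [ "a", "b" ], [ "c", "d" ], [ "e" ] ]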

answered Sep 28 '22 by Baboo