
How to handle streaming data using fetch?

I have used for await...of with great success in handling output streams from processes in node.js, but I'm struggling to get something I was hoping could "just work" with the browser fetch API.

This works great to async'ly handle chunks of output streaming from a process:

for await (const out of proc.child.stdout) {
  ...
}

(in an async function context here of course)

I tried to do something similar in a browser where I want to gain access to the data while it is being sent to me from the server.

for await (const chunk of (await fetch('/data.jsonl')).body) {
  console.log('got', chunk);
}

This does not work in Chrome (Uncaught TypeError: (intermediate value).body is not async iterable).

For my use case this is not necessary, so I am simply using let data = await (await fetch(datapath)).text(); in my client code for now. This is analogous to the typical use of .json() instead of .text() on the awaited fetch, so no processing can begin until the entire response is received by the browser. This is not ideal for obvious reasons.

I was looking at Oboe.js (I think the relevant impl is somewhere near here) which pretty much deals with this but its internals are fairly ugly so it looks like that might be the only way to do this for now?

If async iteration isn't implemented (meaning for await...of cannot be used yet), isn't there another way to consume the ReadableStream in a practical way?

asked May 31 '20 by Steven Lu

2 Answers

Unfortunately async iterable support is not yet implemented, despite being in the spec. Instead you can manually iterate, as shown in this example from the spec. (I'll convert examples to async/await for you in this answer.)

// assuming `response` is the result of an awaited fetch()
const reader = response.body.getReader();
const { value, done } = await reader.read();

if (done) {
  console.log("The stream was already closed!");
} else {
  console.log(value);
}

You can use recursion or a loop to do this repeatedly, as in this other example:

async function readAllChunks(readableStream) {
  const reader = readableStream.getReader();
  const chunks = [];

  while (true) {
    const { value, done } = await reader.read();
    if (done) {
      return chunks;
    }
    chunks.push(value);
  }
}

console.log(await readAllChunks(response.body));
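To recover the for await...of syntax from the question, the manual read loop can be wrapped in an async generator. This is a sketch rather than anything from the spec examples; streamAsyncIterator is just an illustrative name:

```javascript
// Sketch: wrap the manual getReader()/read() loop in an async generator so
// any ReadableStream can be consumed with for await, even without native
// async-iterable support. (`streamAsyncIterator` is an invented name.)
async function* streamAsyncIterator(stream) {
  const reader = stream.getReader();
  try {
    while (true) {
      const { value, done } = await reader.read();
      if (done) return;
      yield value;
    }
  } finally {
    // Release the lock so the stream can be used again if iteration
    // stops early (e.g. via break or a thrown error).
    reader.releaseLock();
  }
}
```

With that, the original question's loop becomes `for await (const chunk of streamAsyncIterator(response.body)) { ... }`.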
answered Oct 13 '22 by Domenic


According to the spec, a ReadableStream such as the fetch API's Response.body does have a getIterator method. For some reason it's not async-iterable itself; you explicitly have to call that method:

const response = await fetch('/data.json');
if (!response.ok)
    throw new Error(await response.text());
for await (const chunk of response.body.getIterator()) {
    console.log('got', chunk);
}
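Since the question fetches /data.jsonl, the raw Uint8Array chunks still need to be decoded and split into lines; TextDecoder with { stream: true } handles multi-byte characters that span chunk boundaries. This is an illustrative sketch, not part of either answer, and readLines is an invented helper name:

```javascript
// Sketch: incrementally decode a byte stream and emit complete lines,
// suitable for JSONL responses. Uses the manual reader loop so it works
// without async-iterable support.
async function readLines(stream, onLine) {
  const decoder = new TextDecoder();
  const reader = stream.getReader();
  let buffered = '';
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    // stream: true keeps partial multi-byte sequences buffered in the decoder
    buffered += decoder.decode(value, { stream: true });
    let newline;
    while ((newline = buffered.indexOf('\n')) !== -1) {
      onLine(buffered.slice(0, newline));
      buffered = buffered.slice(newline + 1);
    }
  }
  buffered += decoder.decode(); // flush any trailing bytes
  if (buffered) onLine(buffered); // final line without a trailing newline
}
```

Each line could then be handed to JSON.parse as it arrives, so processing starts before the whole response is downloaded.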
answered Oct 13 '22 by Bergi