
Upload progress indicators for fetch?



Streams are starting to land in the web platform (https://jakearchibald.com/2016/streams-ftw/) but it's still early days.

Soon you'll be able to provide a stream as the body of a request, but the open question is whether the consumption of that stream relates to bytes uploaded.

Particular redirects can result in data being retransmitted to the new location, but streams cannot "restart". We can fix this by turning the body into a callback which can be called multiple times, but we need to be sure that exposing the number of redirects isn't a security leak, since it'd be the first time on the platform JS could detect that.

Some are questioning whether it even makes sense to link stream consumption to bytes uploaded.

Long story short: this isn't possible yet, but in future this will be handled either by streams, or some kind of higher-level callback passed into fetch().


My solution is to use axios, which supports this pretty well:

axios.request({
  method: "post",
  url: "/aaa",
  data: myData,
  onUploadProgress: (p) => {
    console.log(p);
    // e.g. in a React component:
    // this.setState({ fileprogress: p.loaded / p.total })
  }
}).then(data => {
  // upload finished:
  // this.setState({ fileprogress: 1.0 })
})

I have an example of using this in React on GitHub.
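
For a more typical file upload, the same option works with FormData. A minimal sketch (the "/upload" URL, the "file" field name, and the fileInput/progressBar elements are assumptions, not part of the answer above):

const form = new FormData();
form.append("file", fileInput.files[0]);

axios.post("/upload", form, {
  onUploadProgress: (p) => {
    // p.total can be unknown for some requests, so guard against it
    progressBar.value = p.total ? p.loaded / p.total : 0;
  }
}).then(() => {
  progressBar.value = 1; // <progress> max defaults to 1
});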


Update: as the accepted answer says, it's not possible yet, but the code below handled our problem for some time. I should add that it tracks download progress rather than upload progress, and that in the end we had to switch to a library based on XMLHttpRequest.

const response = await fetch(url);
const total = Number(response.headers.get('content-length'));

const reader = response.body.getReader();
let bytesReceived = 0;
while (true) {
    const result = await reader.read();
    if (result.done) {
        console.log('Fetch complete');
        break;
    }
    bytesReceived += result.value.length;
    console.log('Received', bytesReceived, 'bytes of data so far');
}

thanks to this link: https://jakearchibald.com/2016/streams-ftw/
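
The same idea can be wrapped in a reusable helper that reassembles the body while reporting progress. A sketch (the helper name is mine, not from the answer; it reports a 0..1 fraction only when content-length is known):

async function fetchWithDownloadProgress(url, onProgress) {
  const response = await fetch(url);
  const total = Number(response.headers.get('content-length')) || 0;
  const reader = response.body.getReader();
  const chunks = [];
  let received = 0;
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    chunks.push(value);
    received += value.length;
    if (total) onProgress(received / total);
  }
  // Reassemble the chunks into a single Blob
  return new Blob(chunks);
}

// Usage:
// const blob = await fetchWithDownloadProgress(url, p => console.log(p));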


fetch: not possible yet

It sounds like upload progress will eventually be possible with fetch once it supports a ReadableStream as the body. This is currently not implemented, but it's in progress. I think the code will look something like this:

warning: this code does not work yet, still waiting on browsers to support it

async function main() {
  const blob = new Blob([new Uint8Array(10 * 1024 * 1024)]); // any Blob, including a File
  const progressBar = document.getElementById("progress");

  const totalBytes = blob.size;
  let bytesUploaded = 0;

  const blobReader = blob.stream().getReader();
  const progressTrackingStream = new ReadableStream({
    async pull(controller) {
      const result = await blobReader.read();
      if (result.done) {
        console.log("completed stream");
        controller.close();
        return;
      }
      controller.enqueue(result.value);
      bytesUploaded += result.value.byteLength;
      console.log("upload progress:", bytesUploaded / totalBytes);
      progressBar.value = bytesUploaded / totalBytes;
    },
  });
  const response = await fetch("https://httpbin.org/put", {
    method: "PUT",
    headers: {
      "Content-Type": "application/octet-stream"
    },
    body: progressTrackingStream,
  });
  console.log("success:", response.ok);
}
main().catch(console.error);
upload: <progress id="progress" />
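
Update: Chromium-based browsers have since shipped streaming request bodies. To make the fetch() call above actually stream, you additionally have to pass duplex: "half" in the options (and the connection generally needs HTTP/2 or newer); other engines may still lack support:

const response = await fetch("https://httpbin.org/put", {
  method: "PUT",
  headers: { "Content-Type": "application/octet-stream" },
  body: progressTrackingStream,
  duplex: "half", // required whenever the request body is a ReadableStream
});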

workaround: good ol' XMLHttpRequest

Instead of fetch(), it's possible to use XMLHttpRequest to track upload progress — the xhr.upload object emits a progress event.

async function main() {
  const blob = new Blob([new Uint8Array(10 * 1024 * 1024)]); // any Blob, including a File
  const uploadProgress = document.getElementById("upload-progress");
  const downloadProgress = document.getElementById("download-progress");

  const xhr = new XMLHttpRequest();
  const success = await new Promise((resolve) => {
    xhr.upload.addEventListener("progress", (event) => {
      if (event.lengthComputable) {
        console.log("upload progress:", event.loaded / event.total);
        uploadProgress.value = event.loaded / event.total;
      }
    });
    xhr.addEventListener("progress", (event) => {
      if (event.lengthComputable) {
        console.log("download progress:", event.loaded / event.total);
        downloadProgress.value = event.loaded / event.total;
      }
    });
    xhr.addEventListener("loadend", () => {
      resolve(xhr.readyState === 4 && xhr.status === 200);
    });
    xhr.open("PUT", "https://httpbin.org/put", true);
    xhr.setRequestHeader("Content-Type", "application/octet-stream");
    xhr.send(blob);
  });
  console.log("success:", success);
}
main().catch(console.error);
upload: <progress id="upload-progress"></progress><br/>
download: <progress id="download-progress"></progress>

I don't think it's possible. The draft states:

it is currently lacking [in comparison to XHR] when it comes to request progression


(old answer):
The first example in the Fetch API chapter gives some insight into how to do this:

If you want to receive the body data progressively:

function consume(reader) {
  var total = 0
  return new Promise((resolve, reject) => {
    function pump() {
      reader.read().then(({done, value}) => {
        if (done) {
          resolve()
          return
        }
        total += value.byteLength
        console.log(`received ${value.byteLength} bytes (${total} bytes in total)`)
        pump()
      }).catch(reject)
    }
    pump()
  })
}

fetch("/music/pk/altes-kamuffel.flac")
  .then(res => consume(res.body.getReader()))
  .then(() => console.log("consumed the entire body without keeping the whole thing in memory!"))
  .catch(e => console.log("something went wrong: " + e))

Apart from their use of the Promise constructor antipattern, you can see that response.body is a Stream from which you can read chunk by chunk using a Reader, and you can fire an event or do whatever you like (e.g. log the progress) for each chunk.
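
For comparison, the same consume() can be written without that antipattern using async/await:

// The same consumption loop, rewritten with async/await instead of
// the Promise constructor:
async function consume(reader) {
  let total = 0
  while (true) {
    const { done, value } = await reader.read()
    if (done) return
    total += value.byteLength
    console.log(`received ${value.byteLength} bytes (${total} bytes in total)`)
  }
}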

However, the Streams spec doesn't appear to be quite finished, and I have no idea whether this already works in any fetch implementation.


Since none of the answers solve the problem, here is a workaround for implementation's sake: you can measure the upload speed with a small initial chunk of known size, and then estimate the total upload time as content-length / upload-speed. You can use this estimate to drive an approximate progress indicator.
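
A minimal sketch of that idea. The "/probe" endpoint is a hypothetical server route that accepts and discards a small POST; adjust it to your backend:

async function estimateUploadSeconds(totalBytes, probeBytes = 64 * 1024) {
  const probe = new Blob([new Uint8Array(probeBytes)]);
  const start = performance.now();
  await fetch("/probe", { method: "POST", body: probe });
  const elapsed = (performance.now() - start) / 1000;
  const bytesPerSecond = probeBytes / elapsed;
  return totalBytes / bytesPerSecond; // rough estimate only
}

// Usage: drive a timer-based progress bar against the estimate.
// const eta = await estimateUploadSeconds(file.size);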


A possible workaround would be to utilize the new Request() constructor, then check the Request.bodyUsed Boolean attribute

The bodyUsed attribute’s getter must return true if disturbed, and false otherwise.

to determine whether the stream has been disturbed

An object implementing the Body mixin is said to be disturbed if body is non-null and its stream is disturbed.

The fetch() Promise is then returned from within .then() chained to a recursive .read() call on a ReadableStream, once Request.bodyUsed is equal to true.

Note, the approach does not read the bytes of the Request.body as the bytes are streamed to the endpoint. Also, the upload could complete well before any response is returned in full to the browser.

const [input, progress, label] = [
  document.querySelector("input")
  , document.querySelector("progress")
  , document.querySelector("label")
];

const url = "/path/to/server/";

input.onmousedown = () => {
  label.innerHTML = "";
  progress.value = "0"
};

input.onchange = (event) => {

  const file = event.target.files[0];
  const filename = file.name;
  progress.max = file.size;

  const request = new Request(url, {
    method: "POST",
    body: file,
    cache: "no-store"
  });

  const upload = settings => fetch(settings);

  const uploadProgress = new ReadableStream({
    start(controller) {
        console.log("starting upload, request.bodyUsed:", request.bodyUsed);
        controller.enqueue(request.bodyUsed);
    },
    pull(controller) {
      if (request.bodyUsed) {
        // the body has been consumed; enqueueing after close() would throw
        controller.close();
        return;
      }
      controller.enqueue(request.bodyUsed);
      console.log("pull, request.bodyUsed:", request.bodyUsed);
    },
    cancel(reason) {
      console.log(reason);
    }
  });

  const reader = uploadProgress.getReader();
  const fileUpload = upload(request)
    .catch(e => {
      reader.cancel();
      throw e;
    });

  const processUploadRequest = ({value, done}) => {
    if (value || done) {
      console.log("upload complete, request.bodyUsed:", request.bodyUsed);
      // set `progress.value` to `progress.max` here 
      // if not awaiting server response
      // progress.value = progress.max;
      return reader.closed.then(() => fileUpload);
    }
    console.log("upload progress:", value);
    progress.value = +progress.value + 1;
    return reader.read().then(result => processUploadRequest(result));
  };

  reader.read().then(({value, done}) => processUploadRequest({value,done}))
  .then(response => response.text())
  .then(text => {
    console.log("response:", text);
    progress.value = progress.max;
    input.value = "";
  })
  .catch(err => console.log("upload error:", err));

}
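
The markup the script expects (not included in the answer) would look something like:

<!-- Markup assumed by the script above -->
<input type="file">
<progress value="0"></progress>
<label></label>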