Maximum IPC message size exceeded

Currently, I'm developing the tool running on a web browser.

In this project, I am using IndexedDB with Dexie.js ver 2.x.

On Google Chrome, I encountered the error "Maximum IPC message size exceeded". It is probably caused by putting very large data into IndexedDB. My ad-hoc fix was to convert the arrays to strings with JSON.stringify, which worked around the problem for a while. However, the problem has now happened again.

  • In my application, an aggregate called Project is defined.
  • Each Project has up to 500 Input texts.
  • Each Input text holds one string of up to 50k characters.
  • Each Project also has an Analysis, but it contains only analysis parameters.
  • Each Analysis has many Results.
  • Each Result contains a string of about 20k characters.
  • Input texts and Results are retrieved with IndexedDB's getAll() method, filtered by projectId or analysisId.

My questions are:

  1. How can I avoid this error?
  2. I'd like to rescue the data already stored in IndexedDB. I know where the data are stored in the local file system. If that is possible, how can I do it?
asked Jan 02 '23 by kaorun343

2 Answers

If you look at what causes this error in the Chrome implementation (for example https://chromium.googlesource.com/chromium/src.git/+/master/content/browser/indexed_db/indexed_db_database.cc), it occurs when the message size is too large. Here the message size refers to the amount of data sent from C++ (the browser binary) to JavaScript as the result of a request.

To avoid sending too much data, there are a few things you could do:

  • make sure you never call getAll on a lot of data
  • use a limit when calling getAll on a potentially large amount of data
  • use openCursor instead of getAll
  • store smaller objects
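The "limit" option can be sketched as follows, assuming a plain IDBObjectStore (or IDBIndex) called store; getAllPaged and asPromise are illustrative names, not part of any library. Each getAll() call returns at most pageSize records, so no single IPC message carries the whole table:

```javascript
// Wrap an IDBRequest in a Promise (helper; not a library function).
function asPromise(request) {
  return new Promise((resolve, reject) => {
    request.onsuccess = () => resolve(request.result)
    request.onerror = () => reject(request.error)
  })
}

// Page through an object store with getAll() plus a key range, keeping
// each IPC message under the limit.
async function getAllPaged(store, pageSize = 100) {
  const results = []
  let range = null // null means "start from the first key"
  for (;;) {
    // Fetch this page's keys alongside the values so we know where the
    // next page should start.
    const keys = await asPromise(store.getAllKeys(range, pageSize))
    if (keys.length === 0) break
    results.push(...await asPromise(store.getAll(range, pageSize)))
    if (keys.length < pageSize) break // last (partial) page
    // Continue strictly after the last key we have seen.
    range = IDBKeyRange.lowerBound(keys[keys.length - 1], true)
  }
  return results
}
```

Note that an IndexedDB transaction can auto-commit between awaits, so depending on the browser you may want to open a fresh transaction for each page.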

I think your best bet is to try switching to openCursor. This will retrieve your items one at a time (per request). This way you will avoid ever running into this error. You lose a tiny bit of speed using a cursor, but you gain scalability.

Getting a cursor to behave like getAll is simple: declare an empty array, open the cursor, and on each iteration push the cursor's current item into the array. When the iteration ends, you have assembled, one piece at a time, the same array that getAll would have returned.
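As a sketch of those steps (store is assumed to be an IDBObjectStore or IDBIndex from an open transaction; getAllViaCursor is an illustrative name):

```javascript
// Assemble the same array that getAll() would return, one record per
// request, using openCursor().
function getAllViaCursor(store) {
  return new Promise((resolve, reject) => {
    const items = []
    const request = store.openCursor()
    request.onerror = () => reject(request.error)
    request.onsuccess = () => {
      const cursor = request.result
      if (cursor) {
        items.push(cursor.value) // one record per IPC message
        cursor.continue()        // fires onsuccess again for the next record
      } else {
        resolve(items)           // cursor exhausted: same result as getAll()
      }
    }
  })
}
```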

answered Jan 13 '23 by Josh

I couldn't find a cursor-like way of querying with Dexie (maybe I didn't look hard enough), but I wanted to keep using Dexie as my layer of abstraction.

I needed to both retrieve all records and retrieve a subset based on ID (same as you), and I came up with the following, which solved our problem:

/**
 * Retrieve all records of a Dexie collection.
 *
 * We can't just call Collection.toArray() because once the result is large
 * enough, we'll get the "Maximum IPC message size exceeded" error. This is a
 * memory-friendly implementation, although maybe a bit slow due to a page
 * size of one.
 */
async function retrieveDexieCollection(collection) {
  const result = []
  // Collection.each() uses a cursor under the hood, so records cross the
  // IPC boundary one at a time instead of in a single huge message.
  await collection.each(r => {
    result.push(r)
  })
  return result
}

// then later, use our function
const projectIds = [1,2,3]
const records = await retrieveDexieCollection(db
  .whateverYourTableIsCalled
  .where('projectId')
  .anyOf(projectIds))

If you need to process the records as you go (I did) you can add a mappingFunction parameter and call that for each element before adding it to the array.
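That variant might look like this (a sketch; mapFn is an illustrative parameter name, defaulting to the identity function):

```javascript
// Variant of the helper with a mapping function applied to each record as
// it is read, so large records can be reduced before they accumulate in
// memory.
async function retrieveDexieCollection(collection, mapFn = r => r) {
  const result = []
  await collection.each(r => {
    result.push(mapFn(r))
  })
  return result
}
```

Because it relies only on Collection.each, it works with any object exposing that part of Dexie's Collection contract.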

The function is pretty flexible because you can pass it any Dexie collection: a whole table or the result of any arbitrary query.

answered Jan 13 '23 by Tom Saleeba