How to load model with tf.loadModel from firebase storage?

I am developing an app in the Ionic 3 framework for recognizing drawn characters, and I am having trouble importing the model. I have imported the model from Keras (converted with tensorflowjs_converter) into my Ionic 3 app in two different ways:

  1. The model.json and weight files (shards) are placed into the folder /assets/models.
  2. The model.json and weight files (shards) are being hosted in firebase storage.

When launching the app in the browser with the first method, the model and weights are loaded correctly and I am able to predict the classes. But when I launch the app the same way on my Android device with ionic cordova run android --device, the model does not seem to retrieve the data from the weight files, and it gives the following error:

Based on the provided shape, [3, 3, 32, 64], the tensor should have 18432 values but has 917.

Now, I have tried hosting the files in Firebase Storage to fix this issue. I retrieve the model.json from storage, but I still get the same error as above, both in the browser and on the device.

From my experience storing the shards and model locally in the app, I have concluded that the shards are not being recognized on the device with either method.

Also, when using the Firebase Storage method on the device, I catch the following error when trying to fetch the model from the URL: Failed to fetch.

Here is the code of retrieving the shards and the model:

const modelURL: string = await this.db.getModel();
const shards: string[] = await this.db.getShards();

modelURL and shards contain the download URLs from Firebase Storage. The model and the shards are kept together at the same level:

/* Firebase Storage hierarchy */

https://firebasestorage.googleapis.com/v0/b/project-foo.com/o/model%2Fmodel.json?alt=media&token=******
https://firebasestorage.googleapis.com/v0/b/project-foo.com/o/model%2Fgroup1-shard1of4?alt=media&token=******
https://firebasestorage.googleapis.com/v0/b/project-foo.com/o/model%2Fgroup1-shard2of4?alt=media&token=******
https://firebasestorage.googleapis.com/v0/b/project-foo.com/o/model%2Fgroup1-shard3of4?alt=media&token=******
https://firebasestorage.googleapis.com/v0/b/project-foo.com/o/model%2Fgroup1-shard4of4?alt=media&token=******

So with that, I pass the download url of the model into tf.loadModel:

import * as tf from '@tensorflow/tfjs';    

await tf.loadModel(modelURL).then(model => {
    const output: any = model.predict(img);
});

So, is there any way to pass the shards, fetched from Firebase Storage, into tf.loadModel(), so that on both my device and in the browser I can retrieve all of the data required to predict from the model?

Thank you for your help.

Asked by Gabriel Garcia on Feb 03 '26

1 Answer

The HTTP loader behind the tf.loadModel() call assumes that model.json and the corresponding weight files (group1-shard1of1, ...) share the same URL path prefix. For example, given a model file located at https://foo.bar/path/model.json, the loader will try to retrieve the weight files at https://foo.bar/path/group1-shard1of1, and so on.

In your case:

const modelURL: string = await this.db.getModel();
const shards: string[] = await this.db.getShards();

If the modelURL and the shard URLs do not share the same path, you might need to create your own browser HTTP IOHandler for loading:

const model = await tf.loadModel(new MyOwnHttpIOLoader(modelUrl, shards));
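A minimal sketch of what such a loader could look like. The class name MyOwnHttpIOLoader and the local ModelArtifacts interface are illustrative, not part of the tf.js API (in a real app you would use the types from tf.io); the key idea is that tf.loadModel() accepts an object with a load() method returning the model artifacts, so you can fetch the model.json and each shard URL yourself and concatenate the shard bytes into the single weight buffer tf.js expects:

```typescript
// Minimal stand-in for tf.io.ModelArtifacts (use the real tf.io type in your app).
interface ModelArtifacts {
  modelTopology: {};
  weightSpecs: Array<{}>;
  weightData: ArrayBuffer;
}

// Pure helper: concatenate shard buffers, in order, into one ArrayBuffer.
function concatShards(buffers: ArrayBuffer[]): ArrayBuffer {
  const total = buffers.reduce((n, b) => n + b.byteLength, 0);
  const out = new Uint8Array(total);
  let offset = 0;
  for (const b of buffers) {
    out.set(new Uint8Array(b), offset);
    offset += b.byteLength;
  }
  return out.buffer;
}

// Illustrative custom loader: fetches model.json and the shards from the
// explicit download URLs instead of deriving shard URLs from a path prefix.
class MyOwnHttpIOLoader {
  constructor(private modelUrl: string, private shardUrls: string[]) {}

  async load(): Promise<ModelArtifacts> {
    const manifest = await (await fetch(this.modelUrl)).json();
    const shards = await Promise.all(
      this.shardUrls.map(async url => (await fetch(url)).arrayBuffer())
    );
    return {
      modelTopology: manifest.modelTopology,
      weightSpecs: manifest.weightsManifest[0].weights,
      weightData: concatShards(shards),
    };
  }
}

// Usage (modelURL and shards come from your db wrapper):
// const model = await tf.loadModel(new MyOwnHttpIOLoader(modelURL, shards));
```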

If they do, you might be able to align them by editing the model.json file manually. In the model.json file, there is an array of weight file paths.

With Firebase Storage, the problem is that the URL of the model file is https://firebasestorage.googleapis.com/v0/b/project-foo.com/o/model%2Fmodel.json, which has the path prefix firebasestorage.googleapis.com/v0/b/project-foo.com/o. The loader will use that prefix and try to load the weight file at firebasestorage.googleapis.com/v0/b/project-foo.com/o/group1-shard1of4. But that does not match your weight URL, firebasestorage.googleapis.com/v0/b/project-foo.com/o/model%2Fgroup1-shard1of4; it is missing the model%2F prefix.
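The mismatch can be sketched as a simple string operation, assuming the loader derives each shard URL by keeping everything up to the last "/" of the model URL and appending the manifest path. Because Firebase Storage URL-encodes the folder as model%2F inside the object name (after the last "/"), that prefix is lost:

```typescript
// Illustrative model of how a path-prefix-based loader derives shard URLs.
function defaultShardUrl(modelUrl: string, shardPath: string): string {
  // Keep everything up to and including the last "/" of the model URL.
  const prefix = modelUrl.substring(0, modelUrl.lastIndexOf("/") + 1);
  return prefix + shardPath;
}

const modelUrl =
  "https://firebasestorage.googleapis.com/v0/b/project-foo.com/o/model%2Fmodel.json";

// Derived URL ends in .../o/group1-shard1of4 — the model%2F prefix is gone:
defaultShardUrl(modelUrl, "group1-shard1of4");

// With the manifest path rewritten to include the encoded prefix, the
// derived URL matches the real Firebase Storage object name:
defaultShardUrl(modelUrl, "model%2Fgroup1-shard1of4");
```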

In order to make the loader work, you can manually update the model.json to add the prefix. Search the file for "weightsManifest" and edit the "paths" array to be something like the following: ["model%2Fgroup1-shard1of4", ...]
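The same manual edit can be expressed as a small script. The weightsManifest and paths field names match the tf.js model.json format; the model%2F prefix value is specific to this Firebase Storage layout:

```typescript
// Prepend a URL-encoded folder prefix to every weight file path in a
// parsed model.json object (mutates and returns the object).
function prefixManifestPaths(modelJson: any, prefix: string): any {
  for (const group of modelJson.weightsManifest) {
    group.paths = group.paths.map((p: string) => prefix + p);
  }
  return modelJson;
}

// Abbreviated example of a parsed model.json:
const modelJson = {
  modelTopology: {},
  weightsManifest: [
    { paths: ["group1-shard1of4", "group1-shard2of4"], weights: [] },
  ],
};

prefixManifestPaths(modelJson, "model%2F");
// modelJson.weightsManifest[0].paths is now
// ["model%2Fgroup1-shard1of4", "model%2Fgroup1-shard2of4"]
```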

Answered by Ping Yu on Feb 05 '26


