I've trained my own graph model, and I want to use it in the browser. Here is my code:
async function predict() {
  const model = await tf.loadGraphModel('./model/model.json');
  const img = document.getElementById('test');
  let example = tf.browser.fromPixels(img);
  example = example.expandDims(0);
  const output = await model.predict(example).data();
  console.log(output);
}
When I run this, it gives this error on console:
Uncaught (in promise) Error: This execution contains the node 'SecondStagePostprocessor/BatchMultiClassNonMaxSuppression/map/while/Exit_4', which has the dynamic op 'Exit'. Please use model.executeAsync() instead. Alternatively, to avoid the dynamic ops, specify the inputs [SecondStagePostprocessor/BatchMultiClassNonMaxSuppression/map/TensorArrayStack_2/TensorArrayGatherV3]
at t.compile (tfjs:2)
at t.execute (tfjs:2)
at t.execute (tfjs:2)
at predict ((index):85)
at /websites/optik2/async http://localhost/websites/optik2/:96
I need the predict() function; executeAsync() is not suitable for me as it is.
EDIT
OK, I'm using executeAsync now, as @Jason Mayes suggested. But it's returning values like this:
t {kept: false, isDisposedInternal: false, shape: Array(3), dtype: "float32", size: 1200, …}
rank: 3
isDisposed: false
kept: false
isDisposedInternal: false
shape: (3) [1, 300, 4]
dtype: "float32"
size: 1200
strides: (2) [1200, 4]
dataId: {}
id: 2198
rankType: "3"
scopeId: 3545
__proto__: Object
How can I get the bounding box of this?
What is the length of the output from const output = await model.executeAsync(data)?
You should look for these shapes in the output:
output[X] = detection_boxes   // shape: [1, x, 4], x = number of bounding boxes
output[Y] = detection_scores  // shape: [1, x],    x = number of scores
output[Z] = detection_classes // shape: [1, x],    x = number of classes
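If you are unsure which index holds which tensor, logging each output tensor's shape is a quick way to map them (the order can vary between exported models). A small sketch; the `output` array here is mocked with plain objects carrying a `shape` field, standing in for the tf.Tensors that model.executeAsync() would return in the browser:

```javascript
// Mocked stand-in for the array of tensors returned by model.executeAsync();
// in the browser each element would be a real tf.Tensor with a .shape property.
const output = [
  { shape: [1, 300, 4] },  // a [1, x, 4] shape suggests detection_boxes
  { shape: [1, 300] },     // a [1, x] shape could be scores or classes
  { shape: [1, 300] },
];

// Print each index with its shape to map indices to boxes/scores/classes.
output.forEach((tensor, i) => {
  console.log(`output[${i}] shape: [${tensor.shape}]`);
});
```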
Then you can fetch the predictions through the following (dataSync() is used for all three so that each comes back as a flat typed array):
const boxes = output[0].dataSync()   // flat, length x * 4
const scores = output[1].dataSync()  // flat, length x
const classes = output[2].dataSync() // flat, length x
You can then construct a prediction object with all the predicted bounding boxes by doing this:
function buildDetectedObjects(scores, threshold, imageWidth, imageHeight, boxes, classes, classesDir) {
  const detectionObjects = []
  scores.forEach((score, i) => {
    if (score > threshold) {
      // Boxes come back normalized as [ymin, xmin, ymax, xmax]; scale to pixels.
      const minY = boxes[i * 4] * imageHeight;
      const minX = boxes[i * 4 + 1] * imageWidth;
      const maxY = boxes[i * 4 + 2] * imageHeight;
      const maxX = boxes[i * 4 + 3] * imageWidth;
      // Convert to [x, y, width, height].
      const bbox = [minX, minY, maxX - minX, maxY - minY];
      detectionObjects.push({
        class: classes[i],
        label: classesDir[classes[i]].name,
        score: score.toFixed(4),
        bbox: bbox
      })
    }
  })
  return detectionObjects
}
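To make the coordinate math concrete, here is a small standalone example with made-up numbers (the image size and box values are hypothetical): the model emits a normalized [ymin, xmin, ymax, xmax] box, and it gets converted to a pixel-space [x, y, width, height]:

```javascript
// Hypothetical image size and normalized box, for illustration only.
const imageWidth = 640, imageHeight = 480;
const box = [0.25, 0.10, 0.75, 0.50];  // [ymin, xmin, ymax, xmax]

// Scale to pixel coordinates, as in the function above.
const minY = box[0] * imageHeight;  // 120
const minX = box[1] * imageWidth;   // 64
const maxY = box[2] * imageHeight;  // 360
const maxX = box[3] * imageWidth;   // 320

// [x, y, width, height]
const bbox = [minX, minY, maxX - minX, maxY - minY];
console.log(bbox);  // [64, 120, 256, 240]
```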
classesDir being a dictionary with the classes from the training:
let classesDir = {
  1: {
    name: 'Class name 1',
    id: 1,
  },
  2: {
    name: 'Class name 2',
    id: 2,
  }
}
The prediction object will then be an array of objects:
[{
  bbox: [x, y, width, height],
  class: X,
  label: 'class name',
  score: 0.XYZ
},
{
  bbox: [x, y, width, height],
  class: X,
  label: 'class name',
  score: 0.XYZ
}]
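As a follow-up, here is a hedged sketch of how such detection objects could be drawn onto a &lt;canvas&gt; overlay. The drawDetections helper and its styling are illustrative additions, not part of the answer above:

```javascript
// Illustrative helper: draw each detection's box and label on a 2D canvas.
// `detections` is an array in the shape produced by buildDetectedObjects.
function drawDetections(detections, canvas) {
  const ctx = canvas.getContext('2d');
  ctx.lineWidth = 2;
  ctx.strokeStyle = 'lime';
  ctx.fillStyle = 'lime';
  ctx.font = '14px sans-serif';
  detections.forEach(d => {
    const [x, y, width, height] = d.bbox;
    ctx.strokeRect(x, y, width, height);
    // Keep the label inside the canvas when the box touches the top edge.
    ctx.fillText(`${d.label} ${d.score}`, x, y > 14 ? y - 4 : 14);
  });
}
```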
You are using console.log to display a tensor:
console.log(tensor)
Instead, you need to use the tensor's print() method to see its values:
tensor.print()
After the tensor's values have been downloaded from the backend with data(), console.log can be used to display the output as a typed array:
data = await tensor.data()
console.log(data) // typed array (e.g. Float32Array)
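Note that data() resolves to a TypedArray (e.g. Float32Array), not a true Array; Array.from can convert it to a plain array if needed. A small sketch, simulated here with a Float32Array constructed directly rather than a real tensor:

```javascript
// Stand-in for the result of `await tensor.data()`: a typed array of the values.
const data = new Float32Array([0.25, 0.5, 0.75]);

console.log(Array.isArray(data));  // false: it's a typed array, not an Array
console.log(Array.from(data));     // converts it into a plain JS array
```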
Try model.executeAsync() instead of predict(). As it is asynchronous, you should use:
const output = await model.executeAsync(data);
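Putting it together, the original predict() could be rewritten along these lines. This is a sketch assuming the same './model/model.json' path and &lt;img id="test"&gt; element from the question, with `tf` being the global from the tfjs script tag:

```javascript
// Sketch of the question's predict() using executeAsync() instead of predict().
async function predict() {
  const model = await tf.loadGraphModel('./model/model.json');
  const img = document.getElementById('test');

  // Preprocess exactly as before.
  const example = tf.browser.fromPixels(img).expandDims(0);

  // executeAsync() is required because the graph contains dynamic ops
  // (the NonMaxSuppression loop flagged in the error message).
  const output = await model.executeAsync(example);

  // output is an array of tensors; download each one's values.
  const data = await Promise.all(output.map(t => t.data()));
  console.log(data);
}
```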