face-api.js - Why is the browser's faceapi.detectAllFaces() faster than the server's?

I want to use face detection on my server side, so I found face-api.js for this task. I discovered that each call of faceapi.detectAllFaces() takes ~10 seconds. But when I run the browser example, only the first call takes 10 seconds and all subsequent calls take less than one second.

My server-side code (you can see similar code in ageAndGenderRecognition.ts):
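// note: in the original example these awaits run inside an async function; shown top-level here for brevity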

import * as faceapi from 'face-api.js';
import { canvas, faceDetectionNet, faceDetectionOptions, saveFile } from './commons';
await faceDetectionNet.loadFromDisk('../../weights')
await faceapi.nets.faceLandmark68Net.loadFromDisk('../../weights')
await faceapi.nets.ageGenderNet.loadFromDisk('../../weights')

const img = await canvas.loadImage('../images/bbt1.jpg')

console.time();
const results = await faceapi.detectAllFaces(img, faceDetectionOptions);
// ~10 seconds.
console.timeEnd();

console.time();
const results2 = await faceapi.detectAllFaces(img, faceDetectionOptions);
// ~10 seconds again.
console.timeEnd();

Why is faceapi.detectAllFaces() (apart from the first call) faster in the browser example than in ageAndGenderRecognition.ts? And what can I do so that my faceapi.detectAllFaces() calls reach the same speed?

Asked May 21 '19 by Palindromer


2 Answers

There are a few possible reasons why your nodejs sample code takes ~10s per call:

  1. You are not importing @tensorflow/tfjs-node at all. In this case tfjs does not use the native TensorFlow CPU backend and operations take much longer on the CPU.

  2. You are importing @tensorflow/tfjs-node, but there is a version mismatch between the tfjs-core version that face-api.js depends on and the version of @tensorflow/tfjs-node you have installed via npm. In this case tfjs will display a warning message.

  3. Everything is set up correctly, but your CPU is just tremendously slow. In this case you can either use @tensorflow/tfjs-node-gpu (if you have a CUDA-compatible nvidia GPU), or you can change the faceDetectionOptions to new faceapi.TinyFaceDetectorOptions(), which runs the much faster TinyFaceDetector instead of the default SSD Mobilenet v1 model (see the sketch after this list).
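A minimal sketch covering points 1 and 3, reusing the question's './commons' helper and weight paths; the inputSize/scoreThreshold values are only illustrative:

import '@tensorflow/tfjs-node'; // register the native TensorFlow backend before face-api.js
import * as faceapi from 'face-api.js';
import { canvas } from './commons'; // same helper the question's example uses

// TinyFaceDetector is much cheaper on the CPU than the default SSD Mobilenet v1
const faceDetectionOptions = new faceapi.TinyFaceDetectorOptions({ inputSize: 416, scoreThreshold: 0.5 });

await faceapi.nets.tinyFaceDetector.loadFromDisk('../../weights');
const img = await canvas.loadImage('../images/bbt1.jpg');
const results = await faceapi.detectAllFaces(img, faceDetectionOptions);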

The reason the first call in the browser takes that long is not the actual prediction time. With the WebGL backend of tfjs, all the shader programs are compiled on the first run (the warm-up run), which is what takes so long. Afterwards they are cached. The prediction in the browser takes only a few milliseconds because the WebGL backend is GPU accelerated. The 10s warm-up time in the browser and the prediction time you are seeing in nodejs are not related at all.
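A minimal sketch of how to see this in the browser, assuming the models are already loaded and input is an image or canvas element on the page:

// first call: the WebGL shader programs are compiled and cached (warm-up)
await faceapi.detectAllFaces(input, options);

// later calls reuse the cached shaders and reflect the real prediction time
console.time('detect');
await faceapi.detectAllFaces(input, options);
console.timeEnd('detect');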

Answered Oct 06 '22 by justadudewhohacks


Tensorflow.js will generally perform better when using a GPU (instead of a CPU).

So one thing that can explain the performance difference is that on the browser side, TensorFlow.js runs on the GPU (via WebGL), whereas on node it runs on the CPU (unless you are using @tensorflow/tfjs-node-gpu).

It seems that by default the face-api.js library uses @tensorflow/tfjs-node for nodejs (https://github.com/justadudewhohacks/face-api.js#face-apijs-for-nodejs). So maybe you can try to replace that import with @tensorflow/tfjs-node-gpu, as in the sketch below.
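A minimal sketch of that swap, assuming @tensorflow/tfjs-node-gpu is installed and a working CUDA/cuDNN setup is available:

import '@tensorflow/tfjs-node-gpu'; // GPU-accelerated native backend; import before face-api.js
import * as faceapi from 'face-api.js';
// the rest of the code (loading weights, detectAllFaces) stays the same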

In order to use the GPU on node, check the tfjs-node github : https://github.com/tensorflow/tfjs-node

Answered Oct 06 '22 by Louis T.