I followed this blog (https://medium.com/@teyou21/training-your-object-detection-model-on-tensorflow-part-2-e9e12714bdf) and built an SSD MobileNet model starting from the COCO pre-trained checkpoint "ssd_mobilenet_v2_quantized_coco".
The resulting model detects my new classes perfectly, but I want it to detect the original pre-trained classes as well.
I tried changing the number of classes to 96 (90 pre-trained + 6 new) and edited "labelmap.pbtxt" to list the name and corresponding id of every label from the COCO dataset, with the new classes appended at the end under ids 91-96 (example entries below).
It still detects only the new classes.
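For reference, the appended entries follow the usual TF Object Detection API label map format (the class names here are just placeholders for my real ones):

```
item {
  id: 91
  name: 'my_new_class_1'
}
item {
  id: 92
  name: 'my_new_class_2'
}
```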
What should I do to detect both the pre-trained and new classes?
It depends on how you used the pre-trained weights:
The first option trains only the detection head and not the backbone of the network. This means the backbone weights are shared between your model and the original model.
In the second option you train the whole network, backbone + detection head. This means you end up with two different models.
If in your case you used the second option, then the only way to do what you want is to load both networks and run inference on the image twice: once with the original network and once with your new network. Then you combine the results.
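A minimal sketch of that two-model approach, assuming TF1-style frozen graphs exported by the Object Detection API (the paths, the dummy image and the id offset of 90 are assumptions you would adapt to your own export):

```python
import numpy as np
import tensorflow as tf

def load_graph(pb_path):
    # Load a frozen_inference_graph.pb exported by the Object Detection API
    graph = tf.Graph()
    with graph.as_default():
        graph_def = tf.GraphDef()
        with tf.gfile.GFile(pb_path, 'rb') as f:
            graph_def.ParseFromString(f.read())
        tf.import_graph_def(graph_def, name='')
    return graph

def detect(graph, image):
    # Standard output tensors of an Object Detection API export
    with tf.Session(graph=graph) as sess:
        outputs = {name: graph.get_tensor_by_name(name + ':0')
                   for name in ('detection_boxes', 'detection_scores',
                                'detection_classes', 'num_detections')}
        return sess.run(outputs, feed_dict={
            graph.get_tensor_by_name('image_tensor:0'): np.expand_dims(image, 0)})

# Placeholder input; use your real HxWx3 uint8 image here
image = np.zeros((300, 300, 3), dtype=np.uint8)

coco = detect(load_graph('ssd_mobilenet_v2_quantized_coco/frozen_inference_graph.pb'), image)
mine = detect(load_graph('my_model/frozen_inference_graph.pb'), image)

# Assuming the new model was trained with class ids 1-6, shift them to 91-96
# so the two label maps do not clash, then merge the detections.
mine['detection_classes'] += 90
merged_boxes = np.concatenate([coco['detection_boxes'], mine['detection_boxes']], axis=1)
merged_scores = np.concatenate([coco['detection_scores'], mine['detection_scores']], axis=1)
merged_classes = np.concatenate([coco['detection_classes'], mine['detection_classes']], axis=1)
```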
If you used the first option then you could do the following:
The idea is that because the backbone is the same for both models, you can run the backbone once to extract the features for the image and then feed those features to each detection head.
This is a tutorial on how to extract weights from one graph and combine them in a new one (it is for TF1): TensorFlow: saving/restoring and mixing multiple models
Here you can read how to save and restore only part of a model: save-and-restore-a-subset-of-variables
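For the restore-a-subset part, a minimal TF1 sketch would look like this (the 'FeatureExtractor/MobilenetV2' scope and the checkpoint path are assumptions; inspect your checkpoint with tf.train.list_variables to find the real prefixes):

```python
import tensorflow as tf

# Build (or import) your combined graph first, then restore only the shared
# backbone variables from the original checkpoint using a filtered Saver.
backbone_vars = [v for v in tf.global_variables()
                 if v.name.startswith('FeatureExtractor/MobilenetV2')]
backbone_saver = tf.train.Saver(var_list=backbone_vars)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())   # initialise everything
    backbone_saver.restore(sess, 'model.ckpt')    # then overwrite the backbone weights only
```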