I'm trying to convert a Keras CNN model for emotion recognition on the FER2013 dataset to a PyTorch model, and I get the following error:
Traceback (most recent call last):
File "VGG.py", line 112, in <module>
transfer.keras_to_pytorch(keras_network, pytorch_network)
File "/home/eorg/NeuralNetworks/user/Project/model/nntransfer.py", line 121, in keras_to_pytorch
pytorch_model.load_state_dict(state_dict)
File "/home/eorg/.local/lib/python2.7/site-packages/torch/nn/modules/module.py", line 334, in load_state_dict
own_state[name].copy_(param)
RuntimeError: inconsistent tensor size at /b/wheel/pytorch-src/torch/lib/TH/generic/THTensorCopy.c:51
I understand that the error is related to the shape of the images. In Keras the input size is defined to be 48 by 48.
My question is: how do I tell a PyTorch model that my pictures have a shape of 48x48? I couldn't find such a function in the documentation or examples.
Any help would be useful!
To convert an image to a tensor in PyTorch, use the PILToTensor() or ToTensor() transforms provided in the torchvision.transforms package. With these transforms you can convert a PIL image or a numpy.ndarray into a tensor.
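As a quick illustration, here is a minimal sketch (the file name 'img.jpg' is a placeholder):
from PIL import Image
from torchvision import transforms

img = Image.open('img.jpg')     # PIL image
t = transforms.ToTensor()(img)  # float tensor of shape (C, H, W), values scaled to [0, 1]
# transforms.PILToTensor()(img) would instead keep the original uint8 values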
In order to automatically resize your input images you need to define a preprocessing pipeline that all your images go through. This can be done with torchvision.transforms.Compose()
(Compose docs). To resize images you can use torchvision.transforms.Scale()
(Scale docs) from the torchvision package.
See the documentation: note that it says .Scale()
is deprecated and .Resize()
should be used instead (Resize docs).
This would be a minimal working example:
from torchvision import transforms
from PIL import Image

# Scale is deprecated; Resize is the drop-in replacement
p = transforms.Compose([transforms.Resize((48, 48))])

img = Image.open('img.jpg')
img.size
# (224, 224) <-- This will be the original dimensions of your image
p(img).size
# (48, 48) <-- This will be the rescaled/resized dimensions of your image
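For the 48x48 grayscale FER2013 images from the question, the resize step can be combined with the tensor conversion in a single pipeline. A sketch, assuming single-channel input (a normalization step is omitted since the original post does not specify one):
from torchvision import transforms

preprocess = transforms.Compose([
    transforms.Grayscale(num_output_channels=1),  # FER2013 images are grayscale
    transforms.Resize((48, 48)),                  # match the 48x48 input the Keras model expects
    transforms.ToTensor(),                        # (1, 48, 48) float tensor in [0, 1]
])
Each image passed through preprocess then reaches the model with a consistent (1, 48, 48) shape.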