I built an image classification model in R with keras for R and got about 98% accuracy, but the same model gets terrible accuracy in Python.
The Keras version is 2.1.3 for R and 2.1.5 for Python.
The following is the R model code:
model = keras_model_sequential()
model = model %>%
  layer_conv_2d(filters = 32, kernel_size = c(3, 3), padding = 'same',
                input_shape = c(187, 256, 3), activation = 'elu') %>%
  layer_max_pooling_2d(pool_size = c(2, 2)) %>%
  layer_dropout(.25) %>%
  layer_batch_normalization() %>%
  layer_conv_2d(filters = 64, kernel_size = c(3, 3), padding = 'same', activation = 'relu') %>%
  layer_max_pooling_2d(pool_size = c(2, 2)) %>%
  layer_dropout(.25) %>%
  layer_batch_normalization() %>%
  layer_flatten() %>%
  layer_dense(128, activation = 'relu') %>%
  layer_dropout(.25) %>%
  layer_batch_normalization() %>%
  layer_dense(6, activation = 'softmax')
model %>% compile(
  loss = 'categorical_crossentropy',
  optimizer = 'adam',
  metrics = 'accuracy'
)
I tried to rebuild the same model in Python with the same input data, but got totally different performance: the accuracy was below 30%.
Since keras for R calls Python to run Keras, the same model architecture should give similar performance.
I suspect the issue may be caused by preprocessing, but here is my Python code anyway:
model = Sequential()
model.add(Conv2D(32, kernel_size=(3, 3), activation='relu', input_shape=(187, 256, 3), padding='same'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.25))
model.add(BatchNormalization())
model.add(Conv2D(64, (3, 3), activation='relu', padding='same'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.25))
model.add(BatchNormalization())
model.add(Flatten())
model.add(Dense(128, activation='relu'))
model.add(Dropout(0.25))
model.add(BatchNormalization())
model.add(Dense(len(label[1]), activation='softmax'))
model.compile(loss='categorical_crossentropy',
              optimizer='adam',
              metrics=['accuracy'])
This is a simple classification task, and I followed the same steps as most tutorials.
I cannot find anyone else who has faced the same problem, so I would like to ask why this happens and how to solve it. Thanks.
That is a dramatic difference, so perhaps there is a bug in the code or something unexpected in the data, but reproducing Keras results from R in Python is harder than it may seem, since setting the seed on the R side is insufficient. Instead of set.seed you should use use_session_with_seed, which comes with the R libraries for tensorflow and keras. Note that for full reproducibility you need use_session_with_seed(..., disable_gpu = TRUE, disable_parallel_cpu = TRUE). See also the related Stack Overflow discussion and the TensorFlow docs. Also, here is an example using the GitHub version of kerasformula and a public dataset. Finally, watch out for functions like layer_dropout that accept seed as a parameter.
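As a minimal sketch, the seeding step on the R side would look like this (the seed value 42 is arbitrary; the model must be defined, compiled, and fit only after this call):

library(tensorflow)
library(keras)
# seed R, NumPy, and TensorFlow, and disable the GPU and CPU parallelism
# so that repeated runs are fully reproducible
use_session_with_seed(42, disable_gpu = TRUE, disable_parallel_cpu = TRUE)
# ... build, compile, and fit the model after this point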