
'Tensor' object has no attribute 'lower'

I am fine-tuning a MobileNet with 14 new classes. When I add new layers by:

x=mobile.layers[-6].output
x=Flatten(x)
predictions = Dense(14, activation='softmax')(x)
model = Model(inputs=mobile.input, outputs=predictions)

I get the error:

'Tensor' object has no attribute 'lower'

Also using:

model.compile(Adam(lr=.0001), loss='categorical_crossentropy', metrics=['accuracy'])
model.fit_generator(train_batches, steps_per_epoch=18,
                validation_data=valid_batches, validation_steps=3, epochs=60, verbose=2)

I get the error:

Error when checking target: expected dense_1 to have 4 dimensions, but got array with shape (10, 14)

What does lower refer to here? In other fine-tuning scripts I have seen, nothing was passed to the layer other than the model output, which is x in this case.

Shiro Mier asked Nov 05 '18



1 Answer

The tensor must be passed to the layer when you call it, not as a constructor argument. With Flatten(x), the tensor x is interpreted as the constructor's data_format argument, and Keras calls .lower() on it while normalizing that (string) option, which produces the 'Tensor' object has no attribute 'lower' error. This also explains the second error: because the Flatten layer was never applied, dense_1 received a 4-D feature map, so Keras expected a 4-D target instead of your (10, 14) labels. The correct form is:

x = Flatten()(x)  # first the layer is constructed and then it is called on x

To make this clearer, it is equivalent to:

flatten_layer = Flatten()  # instantiate the layer
x = flatten_layer(x)       # call it on the given tensor
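To see the construct-then-call pattern in isolation, here is a minimal sketch that needs no Keras at all; the Flatten class below is a toy stand-in for a Keras layer (it just flattens a nested list), not the real implementation:

```python
# Toy stand-in for a Keras layer: an object you construct first,
# then call on a "tensor" (here, a nested list).
class Flatten:
    def __call__(self, x):
        # Flatten one level of nesting, mimicking what the real layer
        # does to a feature map.
        return [v for row in x for v in row]

x = [[1, 2], [3, 4]]
x = Flatten()(x)   # construct the layer, then call it on x
print(x)           # [1, 2, 3, 4]
```

Writing Flatten(x) instead would pass x into the constructor, which is exactly the mistake in the question: the real Keras constructor then treats the tensor as a configuration option.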
today answered Oct 14 '22