I want to write a *.txt file with the neural network hyperparameters and the model architecture. Is it possible to write the object model.summary() to my output file?
(...)
summary = str(model.summary())
(...)
out = open(filename + 'report.txt', 'w')
out.write(summary)
out.close()
However, I'm getting "None" in the output, as you can see below.
Hyperparameters
=========================
learning_rate: 0.01
momentum: 0.8
decay: 0.0
batch size: 128
no. epochs: 3
dropout: 0.5
-------------------------
None
val_acc: 0.232323229313
val_loss: 3.88496732712
train_acc: 0.0965207634216
train_loss: 4.07161939425
train/val loss ratio: 1.04804469418
Any idea how to deal with that?
Keras provides a way to summarize a model. Calling model.summary() prints a useful textual summary that includes: the name and type of each layer and their order in the model, the output shape of each layer, and the number of parameters (weights) in each layer. Note that model.summary() prints this table and returns None, which is exactly why str(model.summary()) writes "None" into the report.
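As a minimal sketch of that behaviour (the layer sizes here are arbitrary examples, not taken from the question):

from keras.models import Sequential
from keras.layers import Dense

# Build a small example model (arbitrary layer sizes)
model = Sequential()
model.add(Dense(32, activation='relu', input_shape=(784,)))
model.add(Dense(10, activation='softmax'))

# summary() prints the table to stdout and returns None,
# so str(model.summary()) gives the string "None"
result = model.summary()
print(result)  # prints: None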
With my version of Keras (2.0.6) and Python (3.5.0), this works for me:
# Create an empty model
from keras.models import Sequential
model = Sequential()

# Open the file
with open(filename + 'report.txt', 'w') as fh:
    # Pass the file handle in as a lambda function to make it callable
    model.summary(print_fn=lambda x: fh.write(x + '\n'))
This outputs the following lines to the file:
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
Total params: 0
Trainable params: 0
Non-trainable params: 0
_________________________________________________________________
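If you would rather not use print_fn at all, another sketch that should work is to redirect stdout while calling model.summary(), since the summary is printed rather than returned. This assumes Python 3.4+ (for contextlib.redirect_stdout) and that model and filename are defined as in the question:

from contextlib import redirect_stdout

# Everything printed inside the block goes to the file instead of the console
with open(filename + 'report.txt', 'w') as fh:
    with redirect_stdout(fh):
        model.summary()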
For me, this worked to get the model summary just as a string:
stringlist = []
model.summary(print_fn=lambda x: stringlist.append(x))
short_model_summary = "\n".join(stringlist)
print(short_model_summary)
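To then write that string into the report together with the hyperparameters (the original goal), a sketch along these lines should do; filename, learning_rate, and batch_size are hypothetical variables standing in for whatever you already have defined:

# Write the hyperparameters and the captured summary to the report file
with open(filename + 'report.txt', 'w') as fh:
    fh.write('Hyperparameters\n')
    fh.write('=========================\n')
    fh.write('learning_rate: {}\n'.format(learning_rate))  # hypothetical variable
    fh.write('batch size: {}\n'.format(batch_size))        # hypothetical variable
    fh.write('-------------------------\n')
    fh.write(short_model_summary + '\n')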