Do you know of a way to constrain TensorFlow or Keras to a set of discrete weights and to use discrete/rigid activation functions (e.g. sign or hard-tanh)?
The APIs seem to have only smooth activation functions.
I also thought about discretizing the weights via a custom regularization function, but I don't know how to make the frameworks take this into account.
I will probably have to extend, for example, the Dense layer class of the respective framework and define a custom forward-propagation function (and its derivative). Do you have any examples of this?
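For reference, here is a minimal sketch of the regularization idea, assuming the TF 2.x Keras API; `BinaryRegularizer` and the `(w^2 - 1)^2` penalty are illustrative choices of mine, not an established API:

```python
import tensorflow as tf

class BinaryRegularizer(tf.keras.regularizers.Regularizer):
    """Penalty that is zero exactly when a weight equals -1 or +1.

    Note: this only *encourages* near-discrete weights; it does not enforce them.
    """
    def __init__(self, strength=1e-3):
        self.strength = strength

    def __call__(self, weights):
        # (w^2 - 1)^2 has minima at w = -1 and w = +1
        return self.strength * tf.reduce_sum(tf.square(tf.square(weights) - 1.0))

    def get_config(self):
        return {"strength": self.strength}

# Attach it to an ordinary Dense layer
layer = tf.keras.layers.Dense(64, kernel_regularizer=BinaryRegularizer(1e-3))
```

Even with such a penalty, the weights would still need a final rounding/projection step to become truly discrete.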
In my opinion, changing weights and activations from smooth to discrete ones might be a huge problem in Keras. I see at least two major difficulties in this approach:

1. Lack of a gradient: discrete functions such as sign have a gradient equal to 0 almost everywhere, so gradient-based optimization methods are essentially useless for training networks with such activations.
2. Hardness of discrete optimization: optimizing over a discrete weight set is a combinatorial problem, and the set of feasible solutions can be huge (e.g. 2^dimension for binary weights).

These are the reasons why a solution to your problem might be really difficult.
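That said, a common workaround for the zero-gradient issue is the straight-through estimator used in BinaryConnect/BinaryNet: apply the discrete function (sign) in the forward pass, but back-propagate a surrogate gradient (identity, clipped to |x| <= 1, which matches hard-tanh). A minimal sketch, assuming the TF 2.x Keras API; `sign_ste` and `BinaryDense` are hypothetical names, not part of Keras:

```python
import tensorflow as tf

@tf.custom_gradient
def sign_ste(x):
    """Sign in the forward pass, clipped-identity gradient in the backward pass."""
    def grad(dy):
        # Pass the gradient through where |x| <= 1, block it elsewhere (hard-tanh style)
        return dy * tf.cast(tf.abs(x) <= 1.0, dy.dtype)
    return tf.sign(x), grad

class BinaryDense(tf.keras.layers.Layer):
    """Dense layer whose kernel is binarized to {-1, +1} in the forward pass,
    while the optimizer updates the underlying float kernel."""
    def __init__(self, units, **kwargs):
        super().__init__(**kwargs)
        self.units = units

    def build(self, input_shape):
        self.kernel = self.add_weight(
            name="kernel",
            shape=(int(input_shape[-1]), self.units),
            initializer="glorot_uniform",
            trainable=True,
        )

    def call(self, inputs):
        binary_kernel = sign_ste(self.kernel)               # weights in {-1, +1}
        return sign_ste(tf.matmul(inputs, binary_kernel))   # sign activation
```

The forward pass always sees values in {-1, +1}, while gradient descent still has a non-zero surrogate gradient to work with.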