Hello people,

I am wondering whether there is a way to restrict the weights of a (feedforward) neural network to be either 0 or 1 during training (using backpropagation).
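
To illustrate what I have in mind, here is a rough sketch (assuming PyTorch, just as an example framework): real-valued latent weights are kept underneath, the forward pass thresholds them to 0/1, and gradients are passed through the threshold with a straight-through-style trick so backpropagation can still update the latent weights. The class name and threshold of 0.5 are my own choices for illustration, not a reference to any particular library.

```python
import torch
import torch.nn as nn

class BinaryLinear(nn.Module):
    """Linear layer whose effective weights are 0 or 1 in the forward pass.
    Real-valued latent weights are kept so backpropagation still works
    (gradients pass through the threshold via a straight-through estimator)."""

    def __init__(self, in_features, out_features):
        super().__init__()
        # Latent real-valued weights, updated by the optimizer as usual
        self.weight = nn.Parameter(torch.rand(out_features, in_features))

    def forward(self, x):
        # Hard-threshold the latent weights to {0, 1}
        binary_w = (self.weight > 0.5).float()
        # Straight-through trick: forward uses binary_w, backward treats
        # the threshold as the identity w.r.t. the latent weights
        w = self.weight + (binary_w - self.weight).detach()
        return x @ w.t()

# Tiny usage example
layer = BinaryLinear(4, 2)
x = torch.randn(3, 4)
out = layer(x)
out.sum().backward()          # gradients flow to the latent weights
print(layer.weight.grad)
```

Is something along these lines a sensible way to do it, or is there a more standard approach?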

Thanks in advance for your help!