I am wondering whether there is a way to restrict the weights in a (feedforward) neural network to be either 0 or 1 during training (using backpropagation).
Networks with binary weights were studied much earlier (see, e.g., https://hal.archives-ouvertes.fr/jpa-00211064/document), and beyond the limits of what exact enumeration can reach.
It is not an issue of memory constraints; the space of networks is genuinely different, which is why the phase diagram differs from the case in which the couplings take continuous values.
The practical benefit for deep learning would be speed, since bitwise operations are much faster than floating-point arithmetic.
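To see where the speedup comes from, here is a small illustration of the identity behind it: the dot product of two {-1, +1} vectors reduces to an XOR followed by a bit count. The vector length and values below are arbitrary assumptions, and in practice the bits would be packed into machine words so that a single XOR plus popcount handles 32 or 64 weights at once.

```matlab
% Illustration of why binary weights allow fast bitwise arithmetic:
% for vectors a, b with entries in {-1, +1}, stored as bits x, y in {0, 1},
% the dot product a'*b equals n - 2*popcount(xor(x, y)).
n = 16;
x = rand(n, 1) > 0.5;                  % bit representation of a
y = rand(n, 1) > 0.5;                  % bit representation of b
a = 2*x - 1;                           % back to {-1, +1}
b = 2*y - 1;

dot_float   = a' * b;                  % ordinary floating-point dot product
dot_bitwise = n - 2*sum(xor(x, y));    % XOR + bit count, no multiplications

assert(dot_float == dot_bitwise);
```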
Of course, the algorithm for updating such couplings is, in general, not backpropagation but a walk on the hypercube of weight configurations.
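As an illustration, here is a minimal sketch of such a walk: a single-layer network whose {0, 1} weights are updated by flipping one bit at a time and keeping the flip whenever the training error does not increase. The toy data, network size, and acceptance rule are illustrative assumptions, not a prescription from the literature.

```matlab
% Minimal sketch of a walk on the hypercube of {0,1} weights (illustrative only).
rng(0);
X = double(rand(8, 200) > 0.5);            % 8 binary features, 200 samples
w_true = double(rand(8, 1) > 0.5);         % hidden teacher weights
T = double(w_true' * X > sum(w_true)/2);   % labels from a majority-style rule

w = double(rand(8, 1) > 0.5);              % random starting corner of the hypercube
err = @(w) mean((double(w' * X > sum(w)/2) - T).^2);

for step = 1:1000
    i = randi(numel(w));                   % pick one weight
    w_new = w;  w_new(i) = 1 - w_new(i);   % flip it: move to a neighboring corner
    if err(w_new) <= err(w)                % keep the move if the error does not increase
        w = w_new;
    end
end

fprintf('final training error: %.3f\n', err(w));
```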
The following recent articles use binary weights in deep neural networks (a minimal training sketch follows the references):
Courbariaux, M., Bengio, Y., & David, J. P. (2015). Binaryconnect: Training deep neural networks with binary weights during propagations. In Advances in Neural Information Processing Systems (pp. 3123-3131).
Courbariaux, M., & Bengio, Y. (2016). Binarynet: Training deep neural networks with weights and activations constrained to +1 or -1. arXiv preprint arXiv:1602.02830.
Merolla, P., Appuswamy, R., Arthur, J., Esser, S. K., & Modha, D. (2016). Deep neural networks are robust to weight binarization and other non-linear distortions. arXiv preprint arXiv:1606.01981.
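The core idea in BinaryConnect is to keep real-valued weights, binarize them only for the forward and backward passes, and apply the resulting gradients to the real-valued weights (a straight-through-style update). Below is a minimal sketch of that idea on a toy XOR problem; the network size, learning rate, and data are my own illustrative assumptions, and note that these papers binarize to {-1, +1} rather than {0, 1}.

```matlab
% Minimal sketch of BinaryConnect-style training (Courbariaux et al., 2015):
% real-valued weights are kept, binarized to {-1,+1} for the forward/backward
% pass, and the gradients are applied to the real-valued weights.
rng(1);
X = [0 0 1 1; 0 1 0 1];          % 2 x 4 inputs (toy XOR problem)
T = [0 1 1 0];                   % 1 x 4 targets

nH = 8;                          % hidden units (illustrative choice)
W1 = 0.5*randn(nH, 2);  b1 = zeros(nH, 1);
W2 = 0.5*randn(1, nH);  b2 = 0;
lr = 0.1;
sigm = @(z) 1 ./ (1 + exp(-z));

for epoch = 1:10000
    % binarize the weights for this pass (sign-like thresholding)
    W1b = 2*(W1 >= 0) - 1;
    W2b = 2*(W2 >= 0) - 1;

    % forward pass with the binarized weights
    H = sigm(W1b*X + b1);        % implicit expansion of b1 (R2016b+)
    Y = sigm(W2b*H + b2);

    % backward pass; gradients computed with the binarized weights are
    % applied to the real-valued weights (straight-through estimator)
    dY  = (Y - T) .* Y .* (1 - Y);
    dW2 = dY * H';   db2 = sum(dY, 2);
    dH  = (W2b' * dY) .* H .* (1 - H);
    dW1 = dH * X';   db1 = sum(dH, 2);

    W2 = W2 - lr*dW2;  b2 = b2 - lr*db2;
    W1 = W1 - lr*dW1;  b1 = b1 - lr*db1;
end

disp(Y)   % predictions of the binarized network after training
```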
Since the values of the weights and biases are between 0 and 1, you can add a round command for the computed weights and biases in your MATLAB code, so that all the values become 0 or 1.
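A minimal illustration of that rounding step, assuming the parameters are stored in matrices such as W1, W2, b1, b2 (placeholder names, not the asker's actual variables):

```matlab
% After a training update, force every weight and bias to 0 or 1 by rounding.
% W1, W2, b1, b2 are placeholder names for the network's parameter matrices.
W1 = round(W1);   b1 = round(b1);
W2 = round(W2);   b2 = round(b2);
% round() yields only 0 or 1 if the parameters already lie in [0, 1];
% values outside that range would have to be clipped first,
% e.g. W1 = min(max(W1, 0), 1).
```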