I'm using a feedforward neural network with 100 inputs, 10 units in the hidden layer, and one output. I train the network several times using the same training data and the same architecture/settings, but with random weight initialization. I understand that the learned weights will differ from run to run and that no two trained networks will be identical, but what can I try in order to produce networks that are more consistent across training runs, given the identical data?
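
To make the setup concrete, here is a minimal sketch of what I mean, assuming PyTorch (the framework, activation function, optimizer, and hyperparameters are only illustrative assumptions, not my exact configuration). Fixing the random seed, as shown at the top, is one thing I have considered so that every run starts from the same initial weights:

import torch
import torch.nn as nn

# Fixing the seed makes every run start from the same random initial weights,
# which is one way to get identical (or near-identical) networks across runs.
torch.manual_seed(0)

# Feedforward network: 100 inputs -> 10 hidden units -> 1 output
model = nn.Sequential(
    nn.Linear(100, 10),
    nn.Tanh(),          # assumed activation; not specified in my question
    nn.Linear(10, 1),
)

# Dummy data standing in for my real (fixed) training set
X = torch.randn(500, 100)
y = torch.randn(500, 1)

loss_fn = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# Plain full-batch training loop, repeated for a fixed number of epochs
for epoch in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()

In my actual experiments the seed is not fixed, which is why the weights differ each time; I am asking what else (besides, or instead of, seeding) can make the trained networks more consistent.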
