There is no specific number, as this depends on both the data and the algorithm, but we can estimate it empirically from the available data. For instance, one can measure the error the algorithm produces after training on progressively larger percentages of the full dataset, evaluating each percentage with stratified 10-fold cross-validation, and look for the point where the error curve flattens out.
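A minimal sketch of that idea, assuming scikit-learn and a synthetic dataset (the data, model size, and training fractions below are illustrative choices, not part of the question):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import StratifiedKFold, learning_curve
from sklearn.neural_network import MLPClassifier

# Illustrative stand-in data: 500 examples, 6 features, binary target.
X, y = make_classification(n_samples=500, n_features=6, random_state=0)

# Validation error as a function of training-set size, with stratified
# 10-fold cross-validation at each size.
sizes, train_scores, val_scores = learning_curve(
    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
    X, y,
    train_sizes=np.linspace(0.1, 1.0, 5),  # fractions of the training split
    cv=StratifiedKFold(n_splits=10),
    scoring="accuracy",
)

# When the mean CV error stops improving as the training size grows,
# adding more data of the same kind is unlikely to help much.
for n, err in zip(sizes, 1.0 - val_scores.mean(axis=1)):
    print(f"{n:4d} training examples -> mean CV error {err:.3f}")
```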
The answer is rarely simple; rather, it depends on the type of neural network being used and on the amount of random variation in the data. The usual answer is "as much as possible", because the more data we have, the better we can identify the structure and patterns used for forecasting.
Thanks for your answers. Please consider the following concrete example:
We have 6 features (inputs) and only one output Y. I want to train the program to predict Y correctly from the 6 inputs, and I have only 10 training examples. Will 10 examples be sufficient?
I think it is not just a matter of "how many" examples, but also of "which" examples. You should evaluate the range of variability of your input parameters: the wider the range, the more examples required. As a rule of thumb, I'd suggest using at least 10 times as many examples as input parameters.
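Applied to the case above, this heuristic gives a quick sanity check (the rule is a rough guideline, not a guarantee):

```python
# 10x rule of thumb applied to the example in this thread: 6 inputs.
n_inputs = 6
min_examples = 10 * n_inputs
print(f"Rule of thumb: at least {min_examples} examples for {n_inputs} inputs")
# -> 60 examples, so the 10 available examples fall well short.
```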
I would bet it depends on the number of parameters in your network. Say you have i features in total and one output y encoded as a one-hot vector (2 outputs, each a probability), and assume a feed-forward fully connected network with a single hidden layer of j neurons. You then have (i x j + j x 2) weights, each a 32-bit float, so the space of candidate networks holds on the order of (2^32)^(i x j + j x 2) possibilities. In the worst case you would need a training sample of comparable scale, but you can use techniques such as data augmentation or dropout to prevent overfitting.
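A sketch of that parameter count, assuming PyTorch (any framework works) and illustrative sizes; note that bias terms add j + 2 parameters on top of the i x j + j x 2 weights counted above:

```python
import torch.nn as nn

i, j = 6, 16          # illustrative sizes: 6 inputs, 16 hidden neurons

model = nn.Sequential(
    nn.Linear(i, j),  # i*j weights + j biases
    nn.ReLU(),
    nn.Dropout(0.5),  # one of the overfitting safeguards mentioned above
    nn.Linear(j, 2),  # j*2 weights + 2 biases
)

n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params} trainable parameters")  # 96 + 16 + 32 + 2 = 146
```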
That estimate should be taken as an upper bound; as Sara said, roughly 10 times the number of input parameters is a much more practical figure.