Typically, you have a fixed training set for your neural network. Training your network on each item of the set once is an epoch. So, if you want to teach your network to recognize the 26 letters of the alphabet, 100 epochs would mean 26 × 100 = 2600 individual training trials.
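A minimal Python sketch of that arithmetic, where `train_on_example` is a hypothetical placeholder for a single training trial (not a real API):

```python
import string

dataset = list(string.ascii_uppercase)  # 26 training items, one per letter
num_epochs = 100

def train_on_example(item):
    pass  # hypothetical stand-in for one training trial on a single item

trials = 0
for epoch in range(num_epochs):   # one epoch = one pass over the whole set
    for item in dataset:
        train_on_example(item)
        trials += 1

print(trials)  # 26 items * 100 epochs = 2600 individual training trials
```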
An epoch is a single step in training a neural network; in other words, when the network has been trained on every training sample in one pass, we say one epoch is finished. So the training process may consist of more than one epoch.
An epoch is one full iteration over every input sample in the dataset. A larger number of epochs means the network trains for longer, which sometimes gives better accuracy.
An epoch is when the entire dataset is passed forward and backward through the neural network exactly once. Since we usually cannot pass an entire dataset through a neural network at once, we divide it into batches. The number of training examples in a single batch is the batch size. The number of batches required to complete one epoch is the number of iterations. For example: when a dataset of 200 examples is divided into batches of size 50, it takes 4 iterations to complete one epoch. So here, 50 is the batch size and 4 is the number of iterations. Epoch and batch size are important hyperparameters that affect the performance of neural networks and deep learning models.
For reference: https://towardsdatascience.com/epoch-vs-iterations-vs-batch-size-4dfb9c7ce9c9
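To make the epoch / batch size / iteration relationship concrete, here is a small Python sketch using the numbers from the example above; the batching is purely illustrative and assumes no particular training framework:

```python
dataset_size = 200
batch_size = 50

# Number of iterations needed to complete one epoch.
iterations_per_epoch = dataset_size // batch_size
print(iterations_per_epoch)  # 200 / 50 = 4 iterations per epoch

# One epoch = consuming every batch exactly once.
batches = [range(i, i + batch_size) for i in range(0, dataset_size, batch_size)]
assert len(batches) == iterations_per_epoch  # 4 batches of 50 examples each
```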