Epoch


An epoch is one complete pass through the entire training dataset while training a learning algorithm. The term is most commonly used in the context of neural networks and other iterative learning algorithms, where the weights of the network's connections are tuned incrementally over many passes, or epochs.

The number of epochs is a hyperparameter of the learning algorithm that defines how many times the learning process will work through the entire training dataset. One epoch means that every sample in the training set has had an opportunity to update the model's internal parameters. Training for more epochs allows the learning algorithm to fit the training data more closely, which typically improves accuracy on the training set.
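For concreteness, the following is a minimal, self-contained sketch in Python of what an epoch looks like in code: a toy linear model is fit with stochastic gradient descent, and each epoch is one full pass in which every sample gets a chance to update the parameters. The dataset, learning rate, and number of epochs are illustrative assumptions, not values taken from this entry.

import numpy as np

rng = np.random.default_rng(0)

# Toy dataset (illustrative): y = 3x + 2 plus a little noise.
X = rng.uniform(-1, 1, size=100)
y = 3 * X + 2 + rng.normal(0, 0.1, size=100)

w, b = 0.0, 0.0          # model parameters, tuned incrementally
learning_rate = 0.1
num_epochs = 20          # the hyperparameter discussed above

for epoch in range(num_epochs):
    # One epoch: every sample in the training set updates the parameters once.
    for i in rng.permutation(len(X)):
        pred = w * X[i] + b
        error = pred - y[i]
        # Stochastic gradient descent step for the squared-error loss.
        w -= learning_rate * error * X[i]
        b -= learning_rate * error
    mse = np.mean((w * X + b - y) ** 2)
    print(f"epoch {epoch + 1}: mse={mse:.4f}")

Running the sketch prints the training error after each epoch; it shrinks as the passes accumulate, which is the incremental tuning described above.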

It is important to choose an appropriate number of epochs. Training for too few epochs can leave the model underfit, resulting in poor performance. Too many epochs, on the other hand, can lead to overfitting, where the model learns the training data too well and fails to generalize to new, unseen data. Balancing the number of epochs is therefore a crucial part of building a successful machine learning model.
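One common way to strike this balance in practice is early stopping: train for at most some maximum number of epochs, but stop once performance on held-out validation data stops improving. The sketch below is an illustrative outline rather than any specific library's API; train_one_epoch and validation_loss are hypothetical placeholders for whatever framework is in use, and the patience value is an arbitrary example.

def fit(model, train_data, val_data, max_epochs=100, patience=5):
    # Stop training when the validation loss has not improved for
    # `patience` consecutive epochs, to avoid overfitting.
    best_loss = float("inf")
    epochs_without_improvement = 0
    for epoch in range(max_epochs):
        train_one_epoch(model, train_data)        # one full pass over the training set
        val_loss = validation_loss(model, val_data)
        if val_loss < best_loss:
            best_loss = val_loss
            epochs_without_improvement = 0
        else:
            epochs_without_improvement += 1
        if epochs_without_improvement >= patience:
            # Further epochs are likely to overfit; stop here.
            break
    return model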
