Epoch
Epochs are a basic unit of training for machine learning models: a single epoch is one complete pass through the entire training dataset. One pass is rarely enough for a model to learn, so training typically runs for many epochs. The number of epochs is a hyperparameter that you can tune: too few epochs leave the model undertrained, while too many cause it to overfit the training data. A minimal training-loop sketch appears at the end of this entry.

In the cryptocurrency world, an epoch is instead a fixed period of time on a blockchain. On Ethereum's proof-of-stake chain, for example, an epoch is 32 slots of 12 seconds each, or 6.4 minutes. (This is distinct from the UNIX epoch, which is a reference timestamp, 00:00:00 UTC on 1 January 1970, not the time the first Ethereum block was mined.)
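As a minimal sketch of the machine learning sense of the term (using PyTorch with made-up toy data; the model, learning rate, and epoch count here are arbitrary assumptions, not recommendations), the number of epochs is simply the iteration count of the outer training loop:

```python
import torch
from torch import nn

# Hypothetical toy dataset: 64 samples, 10 features, one regression target.
X = torch.randn(64, 10)
y = torch.randn(64, 1)

model = nn.Linear(10, 1)                                  # arbitrary tiny model
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)  # arbitrary learning rate
loss_fn = nn.MSELoss()

num_epochs = 20  # the hyperparameter discussed above: full passes over the data

for epoch in range(num_epochs):
    # One epoch = one complete pass through the entire dataset.
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch + 1}/{num_epochs}  loss={loss.item():.4f}")
```

Raising num_epochs gives the model more passes over the data; in practice the value is usually chosen by watching validation loss and stopping once it no longer improves.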