Underfitting/Overfitting

If your model is underfitting the training data, adding more training examples will not help. You need to use a more complex model or come up with better features.
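
A minimal sketch of the "more complex model" remedy, assuming scikit-learn and a synthetic quadratic dataset (all names here are illustrative, not from the text):

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
X = rng.uniform(-3, 3, size=(200, 1))
y = 0.5 * X[:, 0] ** 2 + X[:, 0] + 2 + rng.normal(scale=1.0, size=200)

# A straight line underfits this quadratic data; no amount of extra
# samples will fix that.
linear_model = LinearRegression().fit(X, y)

# Adding polynomial features gives the model enough capacity to fit it.
poly_model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
poly_model.fit(X, y)

print(linear_model.score(X, y))  # low R^2: underfitting
print(poly_model.score(X, y))    # much higher R^2
```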

When you plot the learning curves of an overfitting model, there is a gap between the two curves: the model performs significantly better on the training data than on the validation data, which is the hallmark of overfitting. However, if you used a much larger training set, the two curves would continue to get closer.
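
You can plot such learning curves yourself. Below is a minimal sketch, assuming scikit-learn and matplotlib, of an intentionally over-complex polynomial model whose training and validation error curves stay apart (the synthetic data matches the previous sketch):

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.model_selection import learning_curve
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
X = rng.uniform(-3, 3, size=(200, 1))
y = 0.5 * X[:, 0] ** 2 + X[:, 0] + 2 + rng.normal(scale=1.0, size=200)

# An intentionally over-complex model: degree-10 polynomial regression.
model = make_pipeline(PolynomialFeatures(degree=10), LinearRegression())
train_sizes, train_scores, valid_scores = learning_curve(
    model, X, y, train_sizes=np.linspace(0.2, 1.0, 10),
    cv=5, scoring="neg_root_mean_squared_error")

# Scores are negated RMSE, so flip the sign and average over the CV folds.
plt.plot(train_sizes, -train_scores.mean(axis=1), "r-+", label="train")
plt.plot(train_sizes, -valid_scores.mean(axis=1), "b-", label="validation")
plt.xlabel("Training set size")
plt.ylabel("RMSE")
plt.legend()
plt.show()  # the persistent gap between the two curves signals overfitting
```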

Possible solutions for overfitting are to simplify the model, constrain it (i.e., regularize it), or get a lot more training data.
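
As a toy sketch of constraining the model, the degree-10 pipeline above can swap LinearRegression for an L2-regularized Ridge regressor (the alpha values are illustrative, not tuned):

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
X = rng.uniform(-3, 3, size=(200, 1))
y = 0.5 * X[:, 0] ** 2 + X[:, 0] + 2 + rng.normal(scale=1.0, size=200)

for alpha in (1e-6, 1.0, 100.0):  # near-zero alpha ~ unregularized baseline
    model = make_pipeline(
        PolynomialFeatures(degree=10, include_bias=False),
        StandardScaler(),
        Ridge(alpha=alpha))
    rmse = -cross_val_score(model, X, y, cv=5,
                            scoring="neg_root_mean_squared_error").mean()
    print(f"alpha={alpha:g}: validation RMSE = {rmse:.3f}")
```

Larger alpha shrinks the polynomial weights more aggressively, trading a little extra bias for less variance.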

Cost Function

It is quite common for the cost function used during training to be different from the performance measure used for testing. Apart from regularization, another reason why they might be different is that a good training cost function should have optimization-friendly derivatives, while the performance measure used for testing should be as close as possible to the final objective. A good example of this is a classifier trained using a cost function such as the log loss (discussed in a moment) but evaluated using precision/recall.
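
For example, here is a minimal sketch, assuming scikit-learn and a synthetic dataset, of a classifier that is fit by minimizing the log loss but judged on precision and recall:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import log_loss, precision_score, recall_score

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Training objective: log loss, which has optimization-friendly derivatives.
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("log loss: ", log_loss(y_test, clf.predict_proba(X_test)))

# Evaluation: precision/recall, closer to the final objective.
y_pred = clf.predict(X_test)
print("precision:", precision_score(y_test, y_pred))
print("recall:   ", recall_score(y_test, y_pred))
```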
