23 October 2020

When fitting a linear regression on a training set and evaluating it on a separate test set, the test error increases as the training sample size is reduced.

Is it correct that this happens simply because more training data lets the regression coefficients be estimated more precisely, or is there a different underlying reason?
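This can be checked with a small simulation. The sketch below (entirely synthetic data, not tied to any particular dataset) fits ordinary least squares on training sets of different sizes and measures the average test mean squared error. With fewer training points, the coefficient estimates have higher variance, so the test error rises; all parameters (`p`, `noise`, the true coefficients) are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def avg_test_mse(n_train, trials=30, n_test=500, p=5, noise=1.0):
    """Average test MSE of OLS fitted on n_train synthetic points.

    Data are generated from y = X @ beta + noise, with beta fixed to
    ones (a hypothetical choice for this illustration).
    """
    beta = np.ones(p)
    mses = []
    for _ in range(trials):
        # Draw a fresh training set and fit OLS by least squares.
        X_tr = rng.normal(size=(n_train, p))
        y_tr = X_tr @ beta + noise * rng.normal(size=n_train)
        beta_hat, *_ = np.linalg.lstsq(X_tr, y_tr, rcond=None)
        # Evaluate on an independent test set.
        X_te = rng.normal(size=(n_test, p))
        y_te = X_te @ beta + noise * rng.normal(size=n_test)
        mses.append(np.mean((y_te - X_te @ beta_hat) ** 2))
    return float(np.mean(mses))

for n in (10, 50, 500):
    print(f"n_train={n:4d}  avg test MSE={avg_test_mse(n):.3f}")
```

Running this shows the average test MSE shrinking toward the noise floor (here 1.0) as the training size grows, consistent with the estimation-variance explanation.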
