
Where do variance and bias come from?

What are variance and bias? Variance and bias are two kinds of error we use to evaluate a model across different datasets. They always occur together, and we have to find a methodology to trade them off against each other. A good model is one with both small variance and small bias.

Wikipedia gives formal definitions of variance and bias: The bias is an error from erroneous assumptions in the learning algorithm. The variance is an error from sensitivity to small fluctuations in the training set. (source: https://en.wikipedia.org/wiki/Bias–variance_tradeoff)

However, I want to make them more intuitive and practical, so we can understand them through the following definitions: Variance describes how much the model's accuracy fluctuates between the training data, the test data, and real (new) data. If a model has high accuracy in the training and test phases but low accuracy on real data, that model has high variance. High variance is also called overfitting: Too...
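As a minimal sketch of the high-variance (overfitting) case described above (the data and model choices here are illustrative assumptions, not from the post), the snippet below fits polynomials of increasing degree to a small noisy training set and compares the training error with the error on new data drawn from the same process:

```python
# Minimal sketch: high variance / overfitting.
# A flexible (high-degree) model fits the training points almost perfectly
# but generalizes poorly to new data from the same underlying process.
import numpy as np

rng = np.random.default_rng(0)

def make_data(n):
    x = rng.uniform(0, 1, n)
    y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, n)  # true signal + noise
    return x, y

x_train, y_train = make_data(15)
x_test, y_test = make_data(200)   # stands in for "real" (new) data

for degree in (1, 3, 9):
    coeffs = np.polyfit(x_train, y_train, degree)        # fit polynomial model
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree}: train MSE {train_err:.3f}, test MSE {test_err:.3f}")

# Typically: degree 1 underfits (high bias, both errors large), while degree 9
# overfits (high variance): its training error is tiny, yet its error on the
# new data is much larger.
```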