# Keyword Analysis & Research: estimator consistency and unbiased

## Frequently Asked Questions

What does consistency mean in an estimator?

Consistency means that, as the sample size increases, the sampling distribution of the estimator becomes increasingly concentrated at the true parameter value; formally, the estimator converges in probability to that value. An estimator is unbiased if, on average (in expectation), it equals the true parameter value.
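A minimal simulation sketch of this idea (not from the source; the distribution, tolerance, and replication counts are illustrative choices): for the sample mean of normal data, the probability of missing the true mean by more than a fixed tolerance shrinks as the sample size grows, which is exactly convergence in probability.

```python
import random

random.seed(0)
MU, SIGMA, EPS, REPS = 5.0, 2.0, 0.25, 1000  # illustrative values

def miss_rate(n):
    """Fraction of replications where the sample mean misses MU by more than EPS."""
    misses = 0
    for _ in range(REPS):
        xbar = sum(random.gauss(MU, SIGMA) for _ in range(n)) / n
        if abs(xbar - MU) > EPS:
            misses += 1
    return misses / REPS

for n in (10, 100, 1000):
    print(n, miss_rate(n))  # the miss rate falls toward 0 as n grows
```

The estimated mean is also unbiased at every one of these sample sizes; what consistency adds is that the spread around the true value disappears in the limit.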

Is your estimator consistent or unbiased?

The sample mean is both unbiased and consistent. By contrast, consider the estimator that uses only the first observation, x~ = x1. It is still unbiased, since E[x1] = μ, but it is inconsistent: x~ is fixed at x1 and does not change with the sample size, so it cannot converge in probability to μ. This shows that an unbiased estimator need not be consistent.
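The first-observation estimator can be sketched in simulation (the parameter values and replication count are illustrative assumptions): its average stays at μ at every sample size, but its spread never shrinks, so it is unbiased yet inconsistent.

```python
import random
import statistics

random.seed(1)
MU, SIGMA, REPS = 5.0, 2.0, 2000  # illustrative values

def simulate(n):
    """Estimate MU by x~ = x1 (first observation only) over REPS replications."""
    estimates = []
    for _ in range(REPS):
        sample = [random.gauss(MU, SIGMA) for _ in range(n)]
        estimates.append(sample[0])  # ignores every observation after the first
    return statistics.mean(estimates), statistics.stdev(estimates)

for n in (10, 1000):
    # mean of the estimates is near MU either way (unbiased),
    # but the stdev stays near SIGMA regardless of n (inconsistent)
    print(n, simulate(n))
```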

What is the difference between consistency and unbiasedness?

Consistency of an estimator means that as the sample size gets large, the estimate gets closer and closer (in probability) to the true value of the parameter. Unbiasedness is a finite-sample property that is not affected by increasing the sample size: an estimator is unbiased if its expected value equals the true parameter value at every sample size. The two properties are distinct; an estimator can be unbiased but inconsistent, or biased but consistent.
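The biased-but-consistent direction can be illustrated with the divide-by-n variance estimator (a standard textbook example; the distribution and replication count below are illustrative assumptions). Its expectation is sigma^2 * (n - 1) / n, so it is biased at every finite n, yet the bias vanishes as n grows.

```python
import random

random.seed(2)
MU, SIGMA, REPS = 0.0, 1.0, 20000  # true variance is 1.0

def mean_of_mle_var(n):
    """Average of the divide-by-n variance estimator over REPS replications."""
    total = 0.0
    for _ in range(REPS):
        xs = [random.gauss(MU, SIGMA) for _ in range(n)]
        m = sum(xs) / n
        total += sum((x - m) ** 2 for x in xs) / n  # divide by n, not n - 1
    return total / REPS

for n in (2, 5, 50):
    # expectation is (n - 1) / n: about 0.5, 0.8, 0.98 -> bias shrinks with n
    print(n, mean_of_mle_var(n))
```

Dividing by n - 1 instead (the usual sample variance) removes the bias entirely, at every sample size; that is precisely the finite-sample property the answer describes.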

What is the consistency of OLS estimator?

Consistency of the OLS estimator means that, as the sample size increases, its sampling distribution becomes more concentrated at the population parameter value and its variance shrinks toward zero. Under the Gauss-Markov assumptions, the OLS estimator is also BLUE: the Best Linear Unbiased Estimator, i.e. it has the least variance among all linear unbiased estimators.
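This concentration can be seen directly for the OLS slope in a simple regression, computed from the closed-form formula (a sketch, not from the source; the data-generating values are illustrative assumptions). The spread of the slope estimates across replications shrinks roughly like 1/sqrt(n).

```python
import random
import statistics

random.seed(3)
ALPHA, BETA, REPS = 1.0, 2.0, 1000  # illustrative true intercept and slope

def ols_slope(n):
    """One OLS slope estimate from a sample of size n of y = ALPHA + BETA*x + e."""
    xs = [random.uniform(0.0, 1.0) for _ in range(n)]
    ys = [ALPHA + BETA * x + random.gauss(0.0, 1.0) for x in xs]
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    sxx = sum((x - xbar) ** 2 for x in xs)
    return sxy / sxx  # closed-form simple-regression slope

def spread(n):
    """Standard deviation of the slope estimate across REPS replications."""
    return statistics.stdev(ols_slope(n) for _ in range(REPS))

for n in (20, 200, 2000):
    print(n, spread(n))  # the sampling spread around BETA shrinks with n
```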