# Keyword Analysis & Research: prove an estimator is consistent

## Keyword Research: People who searched prove an estimator is consistent also searched

How do you know if an estimator is consistent?

An estimator of θ (call it $T_n$) is consistent if it converges in probability to θ: $\operatorname{plim}_{n \to \infty} T_n = \theta$, that is, $\lim_{n \to \infty} P(|T_n - \theta| \ge \epsilon) = 0$ for all $\epsilon > 0$. The easiest way to show convergence in probability, and hence consistency, is to invoke Chebyshev's inequality, which states that $P(|T_n - \theta| \ge \epsilon) \le E[(T_n - \theta)^2]/\epsilon^2$.
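As a worked illustration of the Chebyshev route, take $T_n$ to be the sample mean $\bar{X}_n$ of i.i.d. observations with mean $\mu$ and finite variance $\sigma^2$ (assumptions made here for the example):

```latex
P\bigl(|\bar{X}_n - \mu| \ge \epsilon\bigr)
  \;\le\; \frac{\operatorname{Var}(\bar{X}_n)}{\epsilon^2}
  \;=\; \frac{\sigma^2}{n\epsilon^2}
  \;\xrightarrow{\;n \to \infty\;}\; 0
  \qquad \text{for every } \epsilon > 0,
```

so $\operatorname{plim}_{n \to \infty} \bar{X}_n = \mu$, i.e. the sample mean is a consistent estimator of $\mu$.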

Is $T_n$ a consistent estimator of μ?

However, $T_n$ is not a consistent estimator of μ. Unbiasedness is not necessary for consistency. Consider $S_n = \sqrt{\frac{1}{n-1} \sum_{i=1}^{n} (X_i - \bar{X}_n)^2}$. $S_n$ is a biased estimator of the standard deviation, yet the same argument shows that it is consistent.
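The biased-but-consistent point can be checked numerically. The sketch below (my own illustration, not part of the original answer; the normal distribution and its σ = 2 are assumptions) shows that the sample standard deviation $S_n$ underestimates σ on average at small $n$, yet a single $S_n$ lands close to σ once $n$ is large:

```python
import numpy as np

# Monte Carlo sketch: the sample standard deviation
# S_n = sqrt(sum((X_i - Xbar)^2) / (n - 1)) is biased for sigma at any
# fixed n, but converges to sigma as n grows (consistency).
rng = np.random.default_rng(0)
sigma = 2.0  # assumed true standard deviation of N(0, sigma^2)

# Bias at small n: the average of S_5 over many replications
# falls below the true sigma.
reps, small_n = 200_000, 5
samples = rng.normal(0.0, sigma, size=(reps, small_n))
mean_sd = samples.std(axis=1, ddof=1).mean()
print(f"E[S_5] ~ {mean_sd:.4f}  (biased: below sigma = {sigma})")

# Consistency: a single S_n computed from a large sample is close to sigma.
for n in (10, 1_000, 100_000):
    s_n = rng.normal(0.0, sigma, size=n).std(ddof=1)
    print(f"n = {n:>7}: S_n = {s_n:.4f}")
```

With the seed fixed the run is reproducible; the small-sample average sits visibly below σ while the large-sample estimate is close to it.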

How does consistency extend from the sequence of estimators to the rule?

Thus, the concept of consistency extends from the sequence of estimators to the rule used to generate it. For instance, suppose that the rule is to "compute the sample mean", so that $\{\bar{X}_n\}$ is a sequence of sample means over samples of increasing size.
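The rule-versus-sequence idea can be made concrete in a few lines. This sketch (the distribution and μ = 3 are assumptions of mine) applies the rule "compute the sample mean" to nested samples of growing size and watches the resulting sequence settle near μ:

```python
import numpy as np

# The *rule* "compute the sample mean", applied to the first n
# observations for increasing n, generates a sequence of estimates.
rng = np.random.default_rng(42)
mu = 3.0  # assumed true mean
x = rng.normal(mu, 1.0, size=100_000)  # one long i.i.d. draw

for n in (10, 100, 1_000, 100_000):
    xbar_n = x[:n].mean()  # the rule applied to a sample of size n
    print(f"n = {n:>6}: sample mean = {xbar_n:.4f}")
```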

Is $S^2$ a consistent estimator?

Consistency is a relatively weak property and is considered necessary for any reasonable estimator. This is in contrast to optimality properties such as efficiency, which state that the estimator is "best". ... The above theorem can be used to prove that $S^2$ is a consistent estimator of $\operatorname{Var}(X_i)$, where $S^2 = \frac{1}{n-1} \sum_{i=1}^{n} (X_i - \bar{X})^2$.
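A quick numerical check of that claim (a sketch, not a proof; the Exponential(1) distribution, whose variance is 1, is my assumption for the example) computes $S^2$ from ever-larger samples and compares it to $\operatorname{Var}(X_i)$:

```python
import numpy as np

# S^2 (with the 1/(n-1) normalization) computed from growing samples
# of Exponential(1) data, whose true variance is 1.
rng = np.random.default_rng(7)
true_var = 1.0  # Var(X_i) for Exponential(1)

for n in (50, 5_000, 500_000):
    x = rng.exponential(scale=1.0, size=n)
    s2 = x.var(ddof=1)  # S^2 = (1/(n-1)) * sum((X_i - Xbar)^2)
    print(f"n = {n:>7}: S^2 = {s2:.4f}  (true Var = {true_var})")
```

The estimate wanders at small $n$ and tightens around the true variance as $n$ grows, which is exactly what consistency promises.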