
Definition. Formally speaking, an estimator $T_n$ of a parameter $\theta$ is said to be consistent if it converges in probability to the true value of the parameter:

$$\lim_{n\to\infty} \Pr\big(|T_n - \theta| > \varepsilon\big) = 0 \quad \text{for every } \varepsilon > 0.$$

A more rigorous definition takes into account the fact that $\theta$ is actually unknown, and thus the convergence in probability must hold for every possible value of this parameter.
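As a minimal sketch of this definition (my own illustration, not from the original post): the sample mean of i.i.d. $N(\mu, 1)$ draws is consistent for $\mu$, so the tail probability $\Pr(|\bar{X}_n - \mu| > \varepsilon)$ should shrink toward zero as $n$ grows. We can estimate that probability by Monte Carlo:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, eps, reps = 3.0, 0.1, 2000  # true mean, tolerance, Monte Carlo repetitions

def tail_prob(n):
    """Monte Carlo estimate of P(|X_bar_n - mu| > eps) for sample size n."""
    samples = rng.normal(mu, 1.0, size=(reps, n))
    xbar = samples.mean(axis=1)
    return np.mean(np.abs(xbar - mu) > eps)

probs = [tail_prob(n) for n in (10, 100, 1000)]
print(probs)  # tail probabilities shrink toward 0 as n grows
```

The values of `mu`, `eps`, and the normal model are arbitrary choices for the demonstration; any distribution with a finite mean would behave the same way by the law of large numbers.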

For example, the method of moments estimator is consistent, but it does not have the invariance property. Moreover, convergence of moments is not the same as consistency in general: you need to use the correct definition, convergence in probability.
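To make the lack of invariance concrete (my own example, not from the original post): for $X \sim \text{Exponential}(\lambda)$ we have $E[X] = 1/\lambda$ and $E[X^2] = 2/\lambda^2$. Matching the first moment gives $\hat\lambda = 1/\bar{X}$, and plugging it into $2/\lambda^2$ yields $2\bar{X}^2$; matching the second moment directly yields the second sample moment. The two answers differ in finite samples (no invariance), yet both converge to the same true value (both are consistent):

```python
import numpy as np

rng = np.random.default_rng(1)
lam, n = 2.0, 200_000
x = rng.exponential(1.0 / lam, size=n)  # NumPy parameterizes by scale = 1/rate

plug_in = 2 * x.mean() ** 2     # estimate E[X^2] via lam_hat = 1 / mean(x)
direct = (x ** 2).mean()        # estimate E[X^2] as the second sample moment
true_value = 2 / lam ** 2       # = 0.5

print(plug_in, direct, true_value)  # both near 0.5, but not equal to each other
```

With a large `n` both estimates land close to $2/\lambda^2 = 0.5$, which is the continuous mapping theorem at work; for small `n` the gap between them can be substantial.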

However, $T_n$ is not a consistent estimator of $\mu$. EDIT 3: See cardinal's points in the comments below. @G.JayKerns: unbiasedness is unnecessary for this. Consider $S_n = \sqrt{\frac{1}{n-1}\sum_{i=1}^n (X_i - \bar{X}_n)^2}$. $S_n$ is a biased estimator of the standard deviation, yet you can use the above argument to show that it is consistent.
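A quick simulation of the biased-but-consistent point (a sketch under my own assumed setup, $X_i \sim N(0, \sigma)$ with $\sigma = 2$): $S_n$ underestimates $\sigma$ on average in small samples, but still converges to $\sigma$ as $n$ grows.

```python
import numpy as np

rng = np.random.default_rng(2)
sigma = 2.0  # true standard deviation (assumed for this demo)

def s_n(n):
    """S_n = sqrt((1/(n-1)) * sum((x_i - x_bar)^2)) for one sample of size n."""
    x = rng.normal(0.0, sigma, size=n)
    return x.std(ddof=1)  # ddof=1 gives the (n-1)-denominator estimator

estimates = [s_n(n) for n in (10, 1000, 100_000)]
print(estimates)  # approaches sigma = 2.0 as n grows
```

The bias (which is $O(1/n)$ and comes from Jensen's inequality applied to the square root) vanishes in the limit, so consistency holds even though unbiasedness fails at every finite $n$.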

As the sample size increases, the estimate keeps improving, up to the limiting case where the sample is as large as the population and the estimate equals the true value of the parameter. Consistency is one of the properties of an estimator, along with other properties such as unbiasedness, mean squared error, and efficiency.