
If the sample mean X̄_n converges in probability to the mean μ of the distribution that generated the samples, then we say that the sequence X̄_1, X̄_2, … is consistent. By a slight abuse of language, we also say that the sample mean itself is a consistent estimator. Examples: the following table contains examples of consistent estimators (with links to lectures where consistency is proved).
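A quick simulation makes the claim concrete: as the sample size grows, the sample mean lands closer and closer to the true mean. This is a minimal sketch (the distribution, its mean μ = 5.0, and the sample sizes are illustrative choices, not from the source):

```python
import random

random.seed(0)

def sample_mean(n, mu=5.0, sigma=2.0):
    """Draw n Gaussian samples with mean mu and return their sample mean."""
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    return sum(xs) / n

# The absolute error |X̄_n - mu| shrinks as n grows.
for n in (10, 1_000, 100_000):
    print(n, abs(sample_mean(n) - 5.0))
```

For the Gaussian, the standard deviation of X̄_n is σ/√n, so the error typically shrinks at rate 1/√n.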

If an estimator converges to the true value only in probability, it is said to be weakly consistent. If the convergence is almost sure, the estimator is said to be strongly consistent: with probability 1, the estimator tends to the true value as the sample size goes to infinity.

Definition of a consistent estimator in the context of A/B testing (online controlled experiments): a consistent estimator is one that homes in on the true value of the parameter being estimated more and more accurately as the sample size increases.
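In the A/B-testing setting, the parameter of interest is often the lift, i.e. the difference in true conversion rates between variants. The difference in observed conversion rates is a consistent estimator of that lift. A small sketch, with hypothetical true rates p_a = 0.10 and p_b = 0.12 (true lift 0.02), chosen only for illustration:

```python
import random

random.seed(1)

def estimated_lift(n, p_a=0.10, p_b=0.12):
    """Simulate n users per variant and return the observed lift
    (difference in sample conversion rates)."""
    conv_a = sum(random.random() < p_a for _ in range(n))
    conv_b = sum(random.random() < p_b for _ in range(n))
    return conv_b / n - conv_a / n

# As the number of users per variant grows, the observed lift
# concentrates around the true lift of 0.02.
for n in (100, 10_000, 1_000_000):
    print(n, estimated_lift(n))
```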

If this "imaginary" sequence of estimators converges in probability to the true parameter value, then it is said to be consistent. Definition: a sequence of estimators θ̂_1, θ̂_2, … of a parameter θ_0 is said to be consistent if and only if θ̂_n →ᵖ θ_0, where →ᵖ denotes convergence in probability. Note that we have defined "consistent sequences of estimators".