Consistent Estimator

In statistics, a consistent estimator or asymptotically consistent estimator is an estimator—a rule for computing estimates of a parameter θ0—having the property that as the number of data points used increases indefinitely, the resulting sequence of estimates converges in probability to θ0.

This means that the distributions of the estimates become more and more concentrated near the true value of the parameter being estimated, so that the probability of the estimator being arbitrarily close to θ0 converges to one.

Figure: {T1, T2, T3, ...} is a sequence of estimators for parameter θ0, the true value of which is 4. This sequence is consistent: the estimators become increasingly concentrated near the true value θ0; at the same time, these estimators are biased. The limiting distribution of the sequence is a degenerate random variable which equals θ0 with probability 1.

In practice one constructs an estimator as a function of an available sample of size n, and then imagines being able to keep collecting data and expanding the sample ad infinitum. In this way one would obtain a sequence of estimates indexed by n, and consistency is a property of what occurs as the sample size “grows to infinity”. If the sequence of estimates can be mathematically shown to converge in probability to the true value θ0, it is called a consistent estimator; otherwise the estimator is said to be inconsistent.

Consistency as defined here is sometimes referred to as weak consistency. When we replace convergence in probability with almost sure convergence, then the estimator is said to be strongly consistent. Consistency is related to bias; see bias versus consistency.
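The concentration described above can be illustrated with a small simulation; the following is a minimal sketch, assuming the sample mean of a normal sample as the estimator, with the true value θ0 = 4, tolerance ε = 0.2, and trial counts all chosen arbitrarily for illustration:

```python
import random

random.seed(0)

def prob_outside(n, theta=4.0, sigma=1.0, eps=0.2, trials=2000):
    """Monte Carlo estimate of Pr(|T_n - theta| > eps) for the sample mean T_n."""
    hits = 0
    for _ in range(trials):
        t_n = sum(random.gauss(theta, sigma) for _ in range(n)) / n
        if abs(t_n - theta) > eps:
            hits += 1
    return hits / trials

# As n grows, the estimated probability of landing outside the
# eps-neighbourhood of theta shrinks toward 0 -- the defining property.
p_small = prob_outside(10)
p_large = prob_outside(400)
print(p_small, p_large)
```

The shrinking tail probability is exactly the convergence in probability that the definition below makes formal.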

Definition

Formally speaking, an estimator Tn of parameter θ is said to be weakly consistent, if it converges in probability to the true value of the parameter:

    \underset{n\to\infty}{\operatorname{plim}}\; T_n = \theta

i.e. if, for all ε > 0

    \lim_{n\to\infty} \Pr\big(\,|T_n - \theta| > \varepsilon\,\big) = 0

An estimator Tn of parameter θ is said to be strongly consistent, if it converges almost surely to the true value of the parameter:

    \Pr\big( \lim_{n\to\infty} T_n = \theta \big) = 1

A more rigorous definition takes into account the fact that θ is actually unknown, and thus, the convergence in probability must take place for every possible value of this parameter. Suppose {pθ: θ ∈ Θ} is a family of distributions (the parametric model), and Xθ = {X1, X2, … : Xi ~ pθ} is an infinite sample from the distribution pθ. Let { Tn(Xθ) } be a sequence of estimators for some parameter g(θ). Usually, Tn will be based on the first n observations of a sample. Then this sequence {Tn} is said to be (weakly) consistent if

    \underset{n\to\infty}{\operatorname{plim}}\; T_n(X^{\theta}) = g(\theta) \quad \text{for all } \theta \in \Theta

This definition uses g(θ) instead of simply θ, because often one is interested in estimating a certain function or a sub-vector of the underlying parameter. In the next example, we estimate the location parameter of the model, but not the scale:

Examples

Sample mean of a normal random variable

Suppose one has a sequence of statistically independent observations {X1, X2, ...} from a normal N(μ, σ2) distribution. To estimate μ based on the first n observations, one can use the sample mean: Tn = (X1 + ... + Xn)/n. This defines a sequence of estimators, indexed by the sample size n.

From the properties of the normal distribution, we know the sampling distribution of this statistic: Tn is itself normally distributed, with mean μ and variance σ2/n. Equivalently, √n(Tn − μ)/σ has a standard normal distribution:

    \Pr\!\left( |T_n - \mu| \ge \varepsilon \right) = \Pr\!\left( \frac{\sqrt{n}\,|T_n - \mu|}{\sigma} \ge \frac{\sqrt{n}\,\varepsilon}{\sigma} \right) = 2\left( 1 - \Phi\!\left( \frac{\sqrt{n}\,\varepsilon}{\sigma} \right) \right) \to 0

as n tends to infinity, for any fixed ε > 0. Therefore, the sequence Tn of sample means is consistent for the population mean μ (recalling that Φ is the cumulative distribution function of the standard normal distribution).
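The tail probability for the normal sample mean can be evaluated in closed form; a short sketch (the choices ε = 0.1 and σ = 1 are arbitrary), using the identity Φ(x) = (1 + erf(x/√2))/2:

```python
import math

def phi(x):
    """Standard normal CDF, expressed via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def tail_prob(n, eps=0.1, sigma=1.0):
    """Exact Pr(|T_n - mu| >= eps) for the sample mean of n iid N(mu, sigma^2) draws."""
    return 2.0 * (1.0 - phi(math.sqrt(n) * eps / sigma))

# The probability of missing mu by more than eps vanishes as n grows.
for n in (10, 100, 1000):
    print(n, tail_prob(n))
```

For n = 100 the expression reduces to 2(1 − Φ(1)) ≈ 0.32, and by n = 1000 it is already below one percent, matching the limit statement above.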

Establishing consistency

The notion of asymptotic consistency is very close to, almost synonymous with, the notion of convergence in probability. As such, any theorem, lemma, or property which establishes convergence in probability may be used to prove consistency. Many such tools exist:

  • In order to demonstrate consistency directly from the definition one can use the inequality

      \Pr\big[\, h(T_n - \theta) \ge \varepsilon \,\big] \le \frac{\operatorname{E}\big[\, h(T_n - \theta) \,\big]}{\varepsilon},

the most common choice for the function h being either the absolute value (in which case it is known as the Markov inequality) or the quadratic function (respectively Chebyshev's inequality).

  • Another useful result is the continuous mapping theorem: if Tn is consistent for θ and g(·) is a real-valued function continuous at the point θ, then g(Tn) will be consistent for g(θ):
      T_n \xrightarrow{p} \theta \quad \Rightarrow \quad g(T_n) \xrightarrow{p} g(\theta)
  • Slutsky's theorem can be used to combine several different estimators, or an estimator with a non-random convergent sequence. If Tn →d α and Sn →p β, then
      T_n + S_n \xrightarrow{d} \alpha + \beta,
      T_n S_n \xrightarrow{d} \alpha\beta,
      T_n / S_n \xrightarrow{d} \alpha / \beta, \text{ provided that } \beta \ne 0
  • If estimator Tn is given by an explicit formula, then most likely the formula will employ sums of random variables, and then the law of large numbers can be used: for a sequence {Xn} of random variables and under suitable conditions,
      \frac{1}{n} \sum_{i=1}^{n} g(X_i) \;\xrightarrow{p}\; \operatorname{E}[\,g(X)\,]
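The last two tools combine naturally; a minimal sketch, assuming X ~ N(0, 2²) so that E[X²] = 4 (the distribution, seed, and sample size are arbitrary): the law of large numbers makes the sample mean of the X_i² consistent for E[X²], and the continuous mapping theorem then makes its square root consistent for √E[X²] = 2.

```python
import math
import random

random.seed(1)

def sample_mean(xs):
    return sum(xs) / len(xs)

n = 50_000
xs = [random.gauss(0.0, 2.0) for _ in range(n)]

# Law of large numbers: the sample mean of g(X_i) = X_i^2 converges
# in probability to E[X^2] = sigma^2 = 4.
second_moment = sample_mean([x * x for x in xs])

# Continuous mapping theorem: sqrt is continuous at 4, so the square
# root of a consistent estimator of 4 is consistent for 2.
root = math.sqrt(second_moment)
print(second_moment, root)
```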

Bias versus consistency

Unbiased but not consistent

An estimator can be unbiased but not consistent. For example, for an iid sample {x1, ..., xn} one can use Tn(X) = xn as the estimator of the mean E[X]. Note that here the sampling distribution of Tn is the same as the underlying distribution (for any n, as it ignores all points but the last), so E[Tn(X)] = E[X] and it is unbiased, but it does not converge to any value.

However, if a sequence of estimators is unbiased and converges to a value, then it is consistent, as it must converge to the correct value.
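The last-observation estimator can be checked by simulation; a small sketch (the parameters μ = 4, σ = 1 and the replication counts are arbitrary): across many replications its average sits at μ (unbiased), but its spread stays at σ no matter how large n is (not consistent).

```python
import random
import statistics

random.seed(2)

def last_observation(n, mu=4.0, sigma=1.0):
    """T_n(X) = x_n: keeps only the last of n iid N(mu, sigma^2) draws."""
    t = None
    for _ in range(n):
        t = random.gauss(mu, sigma)
    return t

reps = [last_observation(100) for _ in range(4000)]
mean_of_tn = statistics.mean(reps)  # close to mu: the estimator is unbiased
sd_of_tn = statistics.stdev(reps)   # close to sigma: no concentration as n grows
print(mean_of_tn, sd_of_tn)
```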

Biased but consistent

Alternatively, an estimator can be biased but consistent. For example, if the mean is estimated by (1/n) Σ xi + 1/n, it is biased (by 1/n), but as n → ∞ it approaches the correct value, and so it is consistent.
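A shifted sample mean of this kind is easy to write down; a minimal sketch (the target mean 4.0 and the sample sizes are arbitrary choices): the 1/n bias term is visible for small n and negligible for large n.

```python
import random

random.seed(5)

def biased_mean(xs):
    """Sample mean plus 1/n: biased by exactly 1/n, yet consistent."""
    n = len(xs)
    return sum(xs) / n + 1.0 / n

# The deterministic bias shrinks as the sample grows.
for n in (10, 10_000):
    xs = [random.gauss(4.0, 1.0) for _ in range(n)]
    print(n, biased_mean(xs))
```

On a degenerate sample of constants the estimator returns the constant plus exactly 1/n, which makes the bias term explicit.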

Important examples include the sample variance and sample standard deviation. Without Bessel's correction (that is, when using the sample size n instead of the degrees of freedom n − 1), these are both negatively biased but consistent estimators. With the correction, the corrected sample variance is unbiased, while the corrected sample standard deviation is still biased, but less so, and both are still consistent: the correction factor converges to 1 as sample size grows.
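The two variance estimators differ only by the factor n/(n − 1); a small sketch, assuming standard normal data (true variance 1) and an arbitrary seed and sample size:

```python
import random

random.seed(3)

def var_uncorrected(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)        # divides by n (biased)

def var_bessel(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # divides by n - 1 (unbiased)

# For large n the correction factor n/(n-1) is essentially 1,
# so both estimators land near the true variance: both are consistent.
xs = [random.gauss(0.0, 1.0) for _ in range(100_000)]
v_n = var_uncorrected(xs)
v_bessel = var_bessel(xs)
print(v_n, v_bessel)
```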

Here is another example. Let Tn be a sequence of estimators for θ, with δ > 0 a fixed constant:

    \Pr(T_n) = \begin{cases} 1 - \dfrac{1}{n}, & \text{if } T_n = \theta \\[4pt] \dfrac{1}{n}, & \text{if } T_n = n\delta + \theta \end{cases}

We can see that Tn →p θ (since Pr(Tn ≠ θ) = 1/n → 0), while E[Tn] = θ + δ, so the bias does not converge to zero.
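This estimator can also be simulated; a minimal sketch (the values θ = 4, δ = 1, n = 500 and the replication count are arbitrary): almost every draw equals θ, yet the rare huge outcome n·δ + θ keeps the average pinned at θ + δ.

```python
import random
import statistics

random.seed(4)

def t_n(n, theta=4.0, delta=1.0):
    """With probability 1 - 1/n return theta; with probability 1/n return n*delta + theta."""
    if random.random() < 1.0 / n:
        return n * delta + theta
    return theta

n = 500
reps = [t_n(n) for _ in range(200_000)]
share_at_theta = sum(r == 4.0 for r in reps) / len(reps)  # -> 1: consistent
mean_tn = statistics.mean(reps)                            # stays near theta + delta = 5: biased
print(share_at_theta, mean_tn)
```

The rare outlier grows like n exactly fast enough to offset its shrinking 1/n probability, which is why consistency and vanishing bias come apart here.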

