# Lehmann-Scheffé theorem

A statistic $S(\boldsymbol{X})$ on a random sample of data $\boldsymbol{X}=(X_{1},\ldots,X_{n})$ is said to be a *complete statistic* if for any Borel measurable function $g$,

$E(g(S))=0\quad\mbox{implies}\quad P(g(S)=0)=1.$

In other words, $g(S)=0$ almost everywhere whenever the expected value of $g(S)$ is $0$. If $S(\boldsymbol{X})$ is associated with a family $f(x\mid\theta)$ of probability density functions (or probability mass functions in the discrete case), then completeness of $S$ means that $g(S)=0$ almost everywhere whenever $E_{\theta}(g(S))=0$ for every $\theta$.
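As a worked illustration (a standard example, not part of the original entry), consider a random sample $X_{1},\ldots,X_{n}$ of independent Bernoulli$(\theta)$ variables with $\theta\in(0,1)$. The following sketch shows why $S=\sum_{i}X_{i}$ is complete.

```latex
% S ~ Binomial(n, theta), so for any function g,
\[
E_{\theta}\bigl(g(S)\bigr)
  = \sum_{k=0}^{n} g(k)\binom{n}{k}\theta^{k}(1-\theta)^{n-k}
  = (1-\theta)^{n}\sum_{k=0}^{n} g(k)\binom{n}{k}
      \left(\frac{\theta}{1-\theta}\right)^{k}.
\]
% If E_theta(g(S)) = 0 for every theta in (0,1), then the polynomial in
% r = theta/(1-theta) above vanishes for all r in (0, infinity), so each
% coefficient g(k)*binom(n,k) must be zero.  Hence g(k) = 0 for
% k = 0, ..., n, i.e. P(g(S) = 0) = 1, and S is complete.
```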

###### Theorem 1 (Lehmann-Scheffé).

If $S(\boldsymbol{X})$ is a complete sufficient statistic and $h(\boldsymbol{X})$ is an unbiased estimator for $\theta$, then, given

$h_{0}(s)=E(h(\boldsymbol{X})\mid S(\boldsymbol{X})=s),$

$h_{0}(S)=h_{0}(S(\boldsymbol{X}))$ is a uniformly minimum variance unbiased estimator of $\theta$. Furthermore, $h_{0}(S)$ is unique almost everywhere for every $\theta$.
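A standard application (added here for illustration) is estimating the success probability of a Bernoulli sample: $X_{1},\ldots,X_{n}$ i.i.d. Bernoulli$(\theta)$, with the complete sufficient statistic $S=\sum_{i}X_{i}$ and the unbiased estimator $h(\boldsymbol{X})=X_{1}$.

```latex
% Conditioning the unbiased estimator X_1 on S: given S = s, by symmetry
% each of the n observations is equally likely to account for the s
% successes, so
\[
h_{0}(s) = E\bigl(X_{1}\mid S=s\bigr) = \frac{s}{n}.
\]
% By the Lehmann-Scheffe theorem, h_0(S) = S/n = \bar{X} is therefore the
% uniformly minimum variance unbiased estimator of theta.
```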

## Mathematics Subject Classification

62F10
