Problem of estimation

We observe N values \(x = (x_1, \ldots, x_N)\) obtained from independent draws of a random variable X. This random variable has a probability density \(f(x; \theta)\), where \(\theta\) denotes a set of unknown parameters, considered deterministic and either scalar or vector-valued. We write \(X = (X_1, \ldots, X_N)\) for the random vector whose realization is x.

To extract \(\theta\) from the observations, we would need a transformation:

\(\theta = g(x)\)

Since x is a realization of the random vector X, the quantity \(g(X)\) is itself a random variable. However, an exact relationship between \(\theta\) and x of the form given by this equation cannot, in general, be established.

The random variable \(g(X)\) is denoted \(\hat{\theta}\) or \(\hat{\theta}\left(X\right)\) and is called an estimator of the parameter \(\theta\). The estimate of the parameter is a realization of this random variable, denoted \(\hat{\theta}\left(x\right)\). The main objective of estimation theory is to approximate numerically the value of the parameter \(\theta\).

There are two forms of estimation:

  1. Point estimation: assign a single value \(\hat{\theta}\) to \(\theta\).

  2. Confidence interval estimation: assign a set of plausible values to \(\theta\). In both cases, the estimate is made using the information contained in a sample.
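To make the two forms concrete, here is a minimal Python sketch. The data are simulated (a normal sample with illustrative mean and standard deviation, both hypothetical), the point estimate is the sample mean, and the interval is the usual normal-approximation 95% confidence interval:

```python
import math
import random

random.seed(0)

# Hypothetical sample: N independent draws from a normal law
# with unknown mean theta (here theta = 5 for the simulation).
N = 100
x = [random.gauss(5.0, 2.0) for _ in range(N)]

# Point estimation: a single value assigned to theta (sample mean).
theta_hat = sum(x) / N

# Confidence interval estimation: a set of plausible values for theta,
# here a 95% interval based on the normal approximation.
s = math.sqrt(sum((xi - theta_hat) ** 2 for xi in x) / (N - 1))
half_width = 1.96 * s / math.sqrt(N)
interval = (theta_hat - half_width, theta_hat + half_width)

print(theta_hat, interval)
```

The point estimate answers "what is our best single guess for \(\theta\)?", while the interval quantifies the uncertainty attached to that guess.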

An estimator of \(\theta\) is a measurable function \(\hat{\theta}\left(x\right)\) of the observations. The realization x determines the value of the estimate. Examples of estimators are:

  1. \({\hat{\theta}}_1\left(x\right)=x_1\) ,

  2. \({\hat{\theta}}_2\left(x\right)=\frac{x_1+x_N}{2}\) ,

  3. \({\hat{\theta}}_3\left(x\right)=\frac{1}{2}\left(\min\left\{x_1,\ldots,x_N\right\}+\max\left\{x_1,\ldots,x_N\right\}\right)\) ,

  4. \({\hat{\theta}}_4\left(x\right)=\frac{1}{N}\sum_{i=1}^{N}x_i\) .
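The four example estimators above can be sketched as plain Python functions on a list of observations. The second is written here as the average of the first and last observations, and the third as the midrange (with a factor 1/2), which are the usual readings of these formulas:

```python
def theta_1(x):
    return x[0]                   # first observation only

def theta_2(x):
    return (x[0] + x[-1]) / 2     # mean of first and last observations

def theta_3(x):
    return (min(x) + max(x)) / 2  # midrange

def theta_4(x):
    return sum(x) / len(x)        # sample mean

x = [2.0, 4.0, 6.0, 8.0]
print(theta_1(x), theta_2(x), theta_3(x), theta_4(x))
# → 2.0 5.0 5.0 5.0
```

All four are legitimate functions of the sample, yet they generally give different values, which is precisely why criteria are needed to compare them.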

There are several possible estimators for a given problem. Criteria are therefore needed to compare estimators and to choose among them.

In the following, \(\hat{\theta}\) will denote an estimator \(\hat{\theta}\left(x\right)\) of the parameter \(\theta\).