â¢ In statistics, an estimator is a rule for calculating an estimate of a given quantity based on observed data â¢ Example- i. X follows a normal distribution, but we do not know the parameters of our distribution, namely mean (Î¼) and variance (Ï2 ) ii. properties at the same time, and sometimes they can even be incompatible. This class of estimators has an important property. is unbiased for . Small-Sample Estimator Properties Nature of Small-Sample Properties The small-sample, or finite-sample, distribution of the estimator Î²Ë j for any finite sample size N < â has 1. a mean, or expectation, denoted as E(Î²Ë j), and 2. a variance denoted as Var(Î²Ë j). Only once weâve analyzed the sample minimum can we say for certain if it is a good estimator or not, but it is certainly a natural ï¬rst choice. ECONOMICS 351* -- NOTE 3 M.G. This ï¬exibility in Indeed, any statistic is an estimator. Properties of Good Estimators ¥In the Frequentist world view parameters are Þxed, statistics are rv and vary from sample to sample (i.e., have an associated sampling distribution) ¥In theory, there are many potential estimators for a population parameter ¥What are characteristics of good estimators? Example: Suppose X 1;X 2; ;X n is an i.i.d. We would like to have an estimator with smaller bias and smaller variance : if one can nd several unbiased estimators, we want to use an estimator with smaller vari-ance. Ë= T (X) be an estimator where . The small-sample properties of the estimator Î²Ë j are defined in terms of the mean ( ) If we have a parametric family with parameter Î¸, then an estimator of Î¸ is usually denoted by Î¸Ë. PROPERTIES OF ESTIMATORS (BLUE) KSHITIZ GUPTA 2. Then relative e ciency of ^ 1 relative to ^ 2, X. be our data. If ^(x) is a maximum likelihood estimate for , then g( ^(x)) is a maximum likelihood estimate for g( ). 1.1 Unbiasness. 2.4.1 Finite Sample Properties of the OLS and ML Estimates of Î¸. Ë. Abbott 2. 
Large-Sample Properties of Estimators
• Asymptotically unbiased means that a biased estimator has a bias that tends to zero as the sample size approaches infinity.
• When no estimator with desirable small-sample properties can be found, we often must choose between different estimators on the basis of their asymptotic properties.
• When we want to study the properties of the obtained estimators, it is convenient to distinguish between two categories of properties: i) the small- (or finite-) sample properties, which are valid whatever the sample size, and ii) the asymptotic properties, which are associated with large samples, i.e., when the sample size tends to infinity.
• Some of the properties are defined relative to a class of candidate estimators, a set of possible T(·) that we will denote by T. The density of an estimator T(·) will be denoted f(t, θ) or, when it is necessary to index the estimator, f_T(t, θ).

WHAT IS AN ESTIMATOR?
• An estimator is a function of the data: let X be our data, and let θ̂ = T(X), where T is some function. This flexibility means that even the sample mean Ȳ is, formally, an estimator of the population minimum.

Topics in point estimation:
• Obtaining a point estimate of a population parameter
• Desirable properties of a point estimator: unbiasedness, efficiency
• Obtaining a confidence interval for a mean when the population standard deviation is known
• Obtaining a confidence interval for a mean when the population standard deviation is unknown

Point estimators: properties of estimators
• Relative efficiency (Def 9.1): suppose θ̂₁ and θ̂₂ are two unbiased estimators for θ, with variances V(θ̂₁) and V(θ̂₂), respectively. If θ̂₁ and θ̂₂ are both unbiased estimators of a parameter, we say that θ̂₁ is relatively more efficient if var(θ̂₁) < var(θ̂₂).
• Example: Suppose X₁, X₂, …, Xₙ is a random sample from a Poisson distribution with parameter λ. Since the mean and the variance of a Poisson distribution both equal λ, the sample mean and the sample variance are both unbiased estimators of λ, and relative efficiency tells us which one to prefer.
• Invariance example: if θ is a parameter for the variance and θ̂ is its maximum likelihood estimator, then √θ̂ is the maximum likelihood estimator for the standard deviation.
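The relative-efficiency definition above can be checked numerically with Poisson data. The sketch below (written for this note, not from the quoted sources; the sampler, seed, and parameter values are illustrative choices) draws repeated Poisson samples with a small Knuth-style multiplication sampler and compares the two unbiased estimators of λ, the sample mean X̄ and the sample variance S²: the ratio V(S²)/V(X̄) comes out well above 1, so X̄ is the more efficient choice.

```python
import math
import random
import statistics

rng = random.Random(1)

def sample_poisson(lam):
    """Draw one Poisson(lam) variate via Knuth's multiplication method."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

lam, n, reps = 3.0, 30, 5000
xbar_draws, s2_draws = [], []
for _ in range(reps):
    xs = [sample_poisson(lam) for _ in range(n)]
    xbar_draws.append(statistics.fmean(xs))
    s2_draws.append(statistics.variance(xs))  # (n - 1)-divisor sample variance

# Both estimators center on lambda = 3 (both are unbiased), but X-bar
# has a much smaller sampling variance, so eff(X-bar, S^2) = V(S^2)/V(X-bar)
# is well above 1.
print(statistics.fmean(xbar_draws), statistics.fmean(s2_draws))
print(statistics.variance(s2_draws) / statistics.variance(xbar_draws))
```

This is the practical content of Def 9.1: among several unbiased estimators of the same parameter, prefer the one whose sampling distribution is tightest.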