Chapter 7 - Statistical Estimation and Sampling Distributions
Point estimates
Parameters
Parameter: a term used in statistical inference for a quantity that determines the shape of an unknown probability distribution
Goal: estimate the unknown parameters to obtain the distribution
Statistics
Statistic: function of a random sample (e.g. sample mean, variance, quantile...)
Statistics are random variables whose observed values can be calculated from a set of observed data
Estimation
Estimation: procedure of "guessing" properties of the population from which data are collected
Point estimate: a statistic representing a "best guess" of the true value of a parameter
Properties of Point Estimates
Unbiased Estimates
Unbiased point estimate: a point estimate $\hat{\theta}$ for a parameter $\theta$ satisfying: $E(\hat{\theta}) = \theta$
Bias definition: $\text{bias}(\hat{\theta}) = E(\hat{\theta}) - \theta$
Point estimate of a population mean: given a random sample $X_1, \ldots, X_n$ from a distribution with mean $\mu$, the sample mean $\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i$ is an unbiased estimate of $\mu$
Point estimate of a population variance: given a random sample $X_1, \ldots, X_n$ from a distribution with variance $\sigma^2$, the sample variance $S^2 = \frac{1}{n-1}\sum_{i=1}^{n} (X_i - \bar{X})^2$ is an unbiased estimate of $\sigma^2$
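A quick simulation sketch (not from the original text, using normal data as an assumed example) illustrates these two unbiasedness claims, and also shows that dividing by $n$ instead of $n-1$ gives a biased variance estimate:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma2, n, reps = 5.0, 4.0, 10, 100_000

samples = rng.normal(mu, np.sqrt(sigma2), size=(reps, n))
xbar = samples.mean(axis=1)                # sample means
s2 = samples.var(axis=1, ddof=1)           # sample variances (n-1 divisor, unbiased)
s2_biased = samples.var(axis=1, ddof=0)    # n divisor, biased downward

print(xbar.mean())       # ≈ 5.0  ->  E(X̄) = μ
print(s2.mean())         # ≈ 4.0  ->  E(S²) = σ²
print(s2_biased.mean())  # ≈ 3.6  ->  (n-1)/n · σ² for the biased version
```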
Minimum Variance Estimates
Minimum variance unbiased estimate: an unbiased point estimate whose variance is no larger than that of any other unbiased point estimate
Relative efficiency
Relative efficiency of an unbiased point estimate $\hat{\theta}_1$ to another unbiased point estimate $\hat{\theta}_2$: $\dfrac{\text{Var}(\hat{\theta}_2)}{\text{Var}(\hat{\theta}_1)}$
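For instance (a simulation sketch with normally distributed data as an assumed example), the sample mean and the sample median are both unbiased estimates of $\mu$, and the mean turns out to be the more efficient of the two:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, n, reps = 0.0, 1.0, 101, 50_000

samples = rng.normal(mu, sigma, size=(reps, n))
var_mean = samples.mean(axis=1).var()       # variance of the sample mean
var_median = np.median(samples, axis=1).var()  # variance of the sample median

# Relative efficiency of the mean to the median: noticeably above 1,
# approaching π/2 ≈ 1.57 for normal data as n grows.
print(var_median / var_mean)
```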
Mean squared error (MSE)
MSE definition: $\text{MSE}(\hat{\theta}) = E\left[(\hat{\theta} - \theta)^2\right]$
Alternative form: $\text{MSE}(\hat{\theta}) = \text{Var}(\hat{\theta}) + \text{bias}(\hat{\theta})^2$
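A small simulation sketch (assumed example with normal data) shows that a biased estimator can still win on MSE: the variance estimator that divides by $n$ is biased, yet its MSE is smaller than that of the unbiased divisor-$(n-1)$ estimator:

```python
import numpy as np

rng = np.random.default_rng(2)
sigma2, n, reps = 4.0, 10, 200_000

samples = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
for ddof in (1, 0):
    est = samples.var(axis=1, ddof=ddof)        # ddof=1: unbiased, ddof=0: biased
    bias = est.mean() - sigma2
    mse = np.mean((est - sigma2) ** 2)
    # In each case mse ≈ var + bias²; ddof=0 has nonzero bias but smaller MSE.
    print(f"ddof={ddof}: bias={bias:+.3f}  var={est.var():.3f}  mse={mse:.3f}")
```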
Sample Proportion
If $X \sim B(n, p)$ then the sample proportion $\hat{p} = \frac{X}{n}$ has approximately the distribution $N\left(p, \frac{p(1-p)}{n}\right)$
Standard error of $\hat{p}$: $\text{s.e.}(\hat{p}) = \sqrt{\frac{p(1-p)}{n}}$. When $n$ is large, then $\text{s.e.}(\hat{p})$ is approximated by $\sqrt{\frac{\hat{p}(1-\hat{p})}{n}}$.
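A minimal computational sketch (the counts are assumed values, not data from the text):

```python
import numpy as np

x, n = 234, 600                            # assumed: 234 successes in 600 trials
p_hat = x / n                              # sample proportion
se_hat = np.sqrt(p_hat * (1 - p_hat) / n)  # s.e. with p replaced by p̂ (large n)

print(p_hat, se_hat)                       # ≈ 0.39 and ≈ 0.020
```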
Sample Mean
Distribution of the sample mean: given a random sample $X_1, \ldots, X_n$ from a distribution with mean $\mu$ and variance $\sigma^2$, the central limit theorem says that for large $n$: $\bar{X} \sim N\left(\mu, \frac{\sigma^2}{n}\right)$ approximately
Standard error of the sample mean: $\text{s.e.}(\bar{X}) = \frac{\sigma}{\sqrt{n}}$
When $\sigma$ is unknown and $n$ is large, then the standard error is approximated by $\frac{s}{\sqrt{n}}$, where $s$ is the sample standard deviation
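A simulation sketch (assuming an exponential population, to emphasize that the CLT does not require normal data):

```python
import numpy as np

rng = np.random.default_rng(3)
lam, n, reps = 2.0, 50, 100_000
mu, sigma = 1 / lam, 1 / lam               # exponential: mean = sd = 1/λ

xbar = rng.exponential(scale=1 / lam, size=(reps, n)).mean(axis=1)
print(xbar.mean(), mu)                     # both ≈ 0.5
print(xbar.std(), sigma / np.sqrt(n))      # both ≈ 0.071, i.e. σ/√n

# With σ unknown, a single sample's estimated standard error is s/√n:
one_sample = rng.exponential(scale=1 / lam, size=n)
print(one_sample.std(ddof=1) / np.sqrt(n))
```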
Sample Variance
Distribution of the sample variance: given a random sample $X_1, \ldots, X_n$ from $N(\mu, \sigma^2)$, then: $\frac{(n-1)S^2}{\sigma^2} \sim \chi^2_{n-1}$
t-statistic: given a random sample $X_1, \ldots, X_n$ from $N(\mu, \sigma^2)$, then $\frac{\sqrt{n}(\bar{X} - \mu)}{S} \sim t_{n-1}$
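Both sampling distributions can be checked by simulation; the sketch below (with assumed parameter values) compares simulated moments with the theoretical ones:

```python
import numpy as np

rng = np.random.default_rng(4)
mu, sigma, n, reps = 10.0, 3.0, 8, 200_000

samples = rng.normal(mu, sigma, size=(reps, n))
xbar = samples.mean(axis=1)
s2 = samples.var(axis=1, ddof=1)

chi2_stat = (n - 1) * s2 / sigma**2              # should follow χ²_{n-1}
t_stat = np.sqrt(n) * (xbar - mu) / np.sqrt(s2)  # should follow t_{n-1}

print(chi2_stat.mean(), chi2_stat.var())  # ≈ n-1 = 7 and 2(n-1) = 14
print(t_stat.mean(), t_stat.var())        # ≈ 0 and (n-1)/(n-3) = 1.4
```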
Constructing Parameter Estimates
The Method of Moments
Method of moments point estimate (MME) for one parameter: given a data set of observations $x_1, \ldots, x_n$ from a probability distribution depending on one parameter $\theta$, the MME $\hat{\theta}$ is found by solving the equation $E(X) = \bar{x}$ (the distribution mean is set equal to the sample mean)
Method of moments point estimates (MME) for two parameters: the unknown parameters $\theta_1$ and $\theta_2$ can be found by solving the pair of equations: $E(X) = \bar{x}$ and $\text{Var}(X) = s^2$
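As an illustration (an assumed Gamma example, not one from the text): matching $E(X) = k\theta = \bar{x}$ and $\text{Var}(X) = k\theta^2 = s^2$ gives the MMEs $\hat{\theta} = s^2/\bar{x}$ and $\hat{k} = \bar{x}^2/s^2$:

```python
import numpy as np

rng = np.random.default_rng(5)
k_true, theta_true = 3.0, 2.0
x = rng.gamma(shape=k_true, scale=theta_true, size=5_000)  # simulated data

xbar, s2 = x.mean(), x.var(ddof=1)
theta_mme = s2 / xbar        # solves Var(X) = kθ² = s² together with E(X) = kθ = x̄
k_mme = xbar**2 / s2
print(k_mme, theta_mme)      # ≈ 3 and ≈ 2
```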
Maximum Likelihood Estimates
Maximum likelihood estimate for one parameter: given a data set of observations $x_1, \ldots, x_n$ from a probability distribution $f(x; \theta)$, the MLE $\hat{\theta}$ is the value of $\theta$ that maximizes $L(\theta) = \prod_{i=1}^{n} f(x_i; \theta)$, where $L(\theta)$ is the likelihood function
Maximum likelihood estimates for two parameters: $\hat{\theta}_1$ and $\hat{\theta}_2$ are the values of the parameters at which the likelihood function $L(\theta_1, \theta_2) = \prod_{i=1}^{n} f(x_i; \theta_1, \theta_2)$ is maximized
MLE for
For some distributions, the MLE cannot be found by differentiation, and we should look at the curve of the likelihood function itself, as in the numerical sketch below
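A minimal sketch (assuming an exponential sample, not one of the worked examples in this chapter) that computes the MLE both in closed form and by numerically maximizing the log-likelihood, the fallback when no closed form exists:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(6)
lam_true = 1.5
x = rng.exponential(scale=1 / lam_true, size=2_000)   # simulated sample

# Closed form from differentiating the log-likelihood of λe^{-λx}: λ̂ = 1/x̄
lam_closed_form = 1 / x.mean()

def neg_log_likelihood(lam):
    # log L(λ) = n·log(λ) − λ·Σxᵢ ; minimize its negative
    return -(len(x) * np.log(lam) - lam * x.sum())

res = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 100.0), method="bounded")
print(lam_closed_form, res.x)   # both ≈ 1.5
```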
MLE of