The estimator of the expected value or mean is:
\[\hat{\mathbb{E}}(X)=\bar{x}=\frac{1}{n}\sum_{i=1}^{n}x_i\]
Where:
- \(n\) is the number of observations,
- \(x_i\) is the \(i\)-th observation.

See: Wikipedia page for expected value.
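As a quick numerical sketch of this estimator (assuming NumPy; the sample and its true mean of 5.0 are illustrative):

```python
import numpy as np

# Illustrative sample from a distribution with known mean 5.0
rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=2.0, size=10_000)

# Sample mean: (1/n) * sum(x_i), the estimator above
mean_hat = x.sum() / x.size
print(mean_hat)    # close to 5.0
print(np.mean(x))  # NumPy's built-in agrees
```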
The standard deviation, \(\sigma\), is the square root of the variance.
The variance is defined as:
\[Var(X)=\sigma^2=\mathbb{E}\left[\left(X-\mathbb{E}(X)\right)^2\right]=\mathbb{E}(X^2)-\mathbb{E}(X)^2\]
Unbiased variance estimator (with Bessel's correction, i.e. \(n-1\) in place of \(n\) in the denominator):
\[\hat{\sigma}^2=\frac{1}{n-1}\sum_{i=1}^{n}\left(x_i-\bar{x}\right)^2\]
Where:
- \(\bar{x}\) is the sample mean defined above.
The estimator of the standard deviation, \(\hat{\sigma}\), is the square root of the estimated variance.
For \(X\) and \(Y\) two random variables and \(a\) and \(b\) two deterministic values:
\[Var(aX+b)=a^2\,Var(X)\]
\[Var(X+Y)=Var(X)+Var(Y)+2\,Cov(X,Y)\]
See: Wikipedia page for variance.
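A minimal sketch of the variance and standard deviation estimators, assuming NumPy (`ddof=1` applies Bessel's correction; the sample is illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=0.0, scale=3.0, size=10_000)  # true variance 9.0

n = x.size
xbar = x.mean()

# Unbiased variance estimator: 1/(n-1) * sum((x_i - xbar)^2)
var_hat = ((x - xbar) ** 2).sum() / (n - 1)
print(var_hat)            # close to 9.0
print(np.var(x, ddof=1))  # same value: ddof=1 is Bessel's correction

# Standard deviation estimator: square root of the estimated variance
print(np.sqrt(var_hat), np.std(x, ddof=1))
```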
The covariance is defined as:
\[Cov(X, Y)=\mathbb{E}\left[\left(X-\mathbb{E}(X)\right)\left(Y-\mathbb{E}(Y)\right)\right]=\mathbb{E}(XY)-\mathbb{E}(X)\mathbb{E}(Y)\]
Unbiased covariance estimator (with Bessel's correction, i.e. \(n-1\) in place of \(n\) in the denominator):
\[\widehat{Cov}(X, Y)=\frac{1}{n-1}\sum_{i=1}^{n}\left(x_i-\bar{x}\right)\left(y_i-\bar{y}\right)\]
For \(X\) and \(Y\) two random variables and \(a\) and \(b\) two deterministic values:
\[Cov(aX+b, Y)=a\,Cov(X, Y)\]
\[Cov(X, X)=Var(X)\]
Note that \(Cov(X,Y)=0 \not\Rightarrow X\text{, }Y\text{ independent}\): take \(X\) symmetric around 0 (so that \(\mathbb{E}(X)=\mathbb{E}(X^3)=0\)) and \(Y=X^2\). The covariance is zero, yet \(Y\) is entirely determined by \(X\).
See: Wikipedia page for covariance.
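A short sketch of both the estimator and the counterexample above, assuming NumPy (`np.cov` applies Bessel's correction by default):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)  # X symmetric around 0
y = x ** 2                    # Y entirely determined by X

# Manual unbiased covariance estimator
n = x.size
cov_hat = ((x - x.mean()) * (y - y.mean())).sum() / (n - 1)
print(cov_hat)  # close to 0 despite total dependence

# np.cov returns the 2x2 covariance matrix; the off-diagonal term is Cov(X, Y)
print(np.cov(x, y)[0, 1])
```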
The correlation is defined as:
\[\rho(X,Y)=\frac{Cov(X,Y)}{\sigma_X\,\sigma_Y}\]
The estimator of the correlation is the estimated covariance of \(X\) and \(Y\) divided by the product of the estimated standard deviations of \(X\) and \(Y\).
For \(X\) and \(Y\) two random variables and \(a\) and \(b\) two deterministic values:
\[-1\le\rho(X,Y)\le 1\]
\[\rho(aX+b, Y)=\mathrm{sign}(a)\,\rho(X, Y)\]
Similarly, \(\rho(X,Y)=0 \not\Rightarrow X\text{, }Y\text{ independent}\): the same example applies, \(X\) symmetric around 0 and \(Y=X^2\). The correlation is zero, yet \(Y\) is entirely determined by \(X\).
See: Wikipedia page for correlation.
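The same check for correlation, as a sketch with NumPy's `np.corrcoef`:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y = x ** 2  # dependent on X, yet uncorrelated

# Correlation estimator: Cov(X, Y) / (sigma_X * sigma_Y)
rho_hat = np.cov(x, y)[0, 1] / (np.std(x, ddof=1) * np.std(y, ddof=1))
print(rho_hat)                  # close to 0
print(np.corrcoef(x, y)[0, 1])  # same value via NumPy
```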
Skewness is the third standardized moment of a distribution. It measures the asymmetry of a probability distribution.
Kurtosis is the fourth standardized moment of a distribution. It describes the shape of a probability distribution, in particular the probability of extreme values (fat tails).
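A sketch using SciPy (assumed available); note that scipy.stats.kurtosis returns the excess kurtosis by default, i.e. 0 for a normal distribution:

```python
import numpy as np
from scipy.stats import skew, kurtosis

rng = np.random.default_rng(0)
normal = rng.normal(size=100_000)
heavy = rng.standard_t(df=5, size=100_000)  # fat-tailed Student's t

print(skew(normal), kurtosis(normal))  # both close to 0
print(skew(heavy), kurtosis(heavy))    # excess kurtosis well above 0
```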
The characteristic function of a random variable X is:
\[\phi_X(t)=\mathbb{E}\left[e^{itX}\right]=\mathbb{E}\left[\cos(tX)\right]+i\mathbb{E}\left[\sin(tX)\right]\]
If the distribution of this random variable has a probability density, then its characteristic function is the inverse Fourier transform of this probability density function:
\[\phi_X(t)=\int_{\mathbb{R}}f_X(x)e^{itx}dx\]
The characteristic function is very useful for computing the moments of a distribution: its \(k\)-th derivative at 0 gives the moment of order \(k\), up to a factor \(i^k\), i.e. \(\phi_X^{(k)}(0)=i^k\,\mathbb{E}[X^k]\).
\[\frac{d\phi_X}{dt}(t)=\mathbb{E}\left[iXe^{itX}\right]\qquad\frac{d\phi_X}{dt}(0)=i\,\mathbb{E}[X]\]
So the first derivative at 0, divided by \(i\), gives the mean.
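A Monte Carlo sketch of this fact, assuming NumPy: estimate \(\phi_X\) from samples, take a finite-difference derivative at 0, and divide by \(i\) to recover the mean (the exponential sample and step size h are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=200_000)  # true mean 2.0

def phi(t):
    # Monte Carlo estimate of E[exp(i t X)]
    return np.mean(np.exp(1j * t * x))

# Central finite difference for phi'(0); dividing by i gives E[X]
h = 1e-4
dphi0 = (phi(h) - phi(-h)) / (2 * h)
print((dphi0 / 1j).real)  # close to 2.0
print(x.mean())           # sample mean for comparison
```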