# Expectation and Variance

One of the most basic statistics of a distribution is the expectation. When *x* is a random variable, *f*(*x*) is a function of *x*, and *p*(*x*) is a discrete probability distribution over *x*, the expectation of *f*(*x*) is defined as

$$\mathbb{E}[f(x)] = \sum_{x} p(x) f(x)$$

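As a minimal sketch of the discrete definition, the following computes the expectation of a function over a fair six-sided die; the helper `expectation` and the die distribution are hypothetical examples, not from the text:

```python
def expectation(p, f):
    """Discrete expectation E[f(x)]: sum over x of p(x) * f(x).

    p is a dict mapping each outcome x to its probability p(x).
    """
    return sum(prob * f(x) for x, prob in p.items())

# Hypothetical example: a fair six-sided die, p(x) = 1/6 for x = 1..6.
die = {x: 1 / 6 for x in range(1, 7)}

mean = expectation(die, lambda x: x)        # E[x] = 3.5
mean_sq = expectation(die, lambda x: x**2)  # E[x^2] = 91/6
```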
When *p*(*x*) is continuous, the expectation is defined as

$$\mathbb{E}[f(x)] = \int p(x) f(x) \, dx$$

In this case, *p*(*x*) is a probability density function. If only a finite number of samples drawn from the distribution are available, the expectation can be approximated by the sample mean:

$$\mathbb{E}[f(x)] \simeq \frac{1}{N} \sum_{n=1}^{N} f(x_n)$$

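The sample approximation above can be sketched as a small Monte Carlo estimate. The uniform test distribution and sample size here are illustrative assumptions, not from the text:

```python
import random

random.seed(0)

def mc_expectation(samples, f):
    """Approximate E[f(x)] as the average of f over N drawn samples."""
    return sum(f(x) for x in samples) / len(samples)

# Hypothetical example: samples from the uniform distribution on [0, 1),
# for which E[x] = 0.5 and E[x^2] = 1/3.
samples = [random.random() for _ in range(100_000)]

approx_mean = mc_expectation(samples, lambda x: x)       # close to 0.5
approx_second = mc_expectation(samples, lambda x: x**2)  # close to 1/3
```

The approximation error shrinks roughly as $1/\sqrt{N}$, so more samples give a tighter estimate.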
When the probability distribution is conditional, the expectation also becomes conditional and is referred to as a *conditional expectation:*

$$\mathbb{E}_{x}[f(x) \mid y] = \sum_{x} p(x \mid y) f(x)$$

The subscript indicates that the expectation (i.e., the summation) is taken with respect to *x*.
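A conditional expectation can be sketched by storing the conditional distribution as a nested mapping; the distribution values below are hypothetical:

```python
def conditional_expectation(p_x_given_y, y, f):
    """E_x[f(x) | y]: sum over x of p(x | y) * f(x).

    p_x_given_y[y] is a dict mapping x to p(x | y).
    """
    return sum(prob * f(x) for x, prob in p_x_given_y[y].items())

# Hypothetical conditional distribution p(x | y) for y in {0, 1}.
p = {
    0: {1: 0.5, 2: 0.5},  # p(x | y = 0)
    1: {1: 0.1, 2: 0.9},  # p(x | y = 1)
}

e0 = conditional_expectation(p, 0, lambda x: x)  # 0.5*1 + 0.5*2 = 1.5
e1 = conditional_expectation(p, 1, lambda x: x)  # 0.1*1 + 0.9*2 = 1.9
```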

Another basic statistic of a distribution is the *variance,* defined by the following equation:

$$\mathrm{var}[f(x)] = \mathbb{E}\left[ \left( f(x) - \mathbb{E}[f(x)] \right)^2 \right]$$

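The variance definition translates directly into code by reusing the expectation. The fair-die example below is a hypothetical illustration:

```python
def expectation(p, f):
    """Discrete expectation E[f(x)] under the distribution p (dict x -> p(x))."""
    return sum(prob * f(x) for x, prob in p.items())

def variance(p, f):
    """var[f(x)] = E[(f(x) - E[f(x)])^2], computed from the definition."""
    mu = expectation(p, f)
    return expectation(p, lambda x: (f(x) - mu) ** 2)

# Hypothetical example: a fair six-sided die.
die = {x: 1 / 6 for x in range(1, 7)}

var_x = variance(die, lambda x: x)  # 35/12
std_x = var_x ** 0.5                # the standard deviation
```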
The square root of the variance is called the *standard deviation.* For two random variables *x* and *y*, the covariance of *x* and *y* is calculated as

$$\mathrm{cov}[x, y] = \mathbb{E}_{x, y}\left[ \left( x - \mathbb{E}[x] \right)\left( y - \mathbb{E}[y] \right) \right]$$

The value cov[*x*, *y*] measures the extent to which *x* and *y* vary together. For two vectors of random variables **x** and **y**, instead of scalar random variables, a covariance matrix is defined by

$$\mathrm{cov}[\mathbf{x}, \mathbf{y}] = \mathbb{E}_{\mathbf{x}, \mathbf{y}}\left[ \left( \mathbf{x} - \mathbb{E}[\mathbf{x}] \right)\left( \mathbf{y}^{\top} - \mathbb{E}[\mathbf{y}^{\top}] \right) \right]$$

The covariance among the components of **x** is given by cov[**x**] = cov[**x**, **x**]. The diagonal components of this matrix are the variances of the individual components, and the off-diagonal components are the covariances between pairs of components.
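As a sketch of cov[**x**], the following estimates an empirical covariance matrix with NumPy; the correlated two-dimensional data here is a hypothetical construction:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: y is positively correlated with x by construction,
# so cov[x, y] should come out close to 0.8 * var[x] ~= 0.8.
x = rng.normal(size=10_000)
y = 0.8 * x + 0.2 * rng.normal(size=10_000)

# np.cov treats each row as one variable and returns the 2x2 matrix:
# diagonal entries are variances, off-diagonal entries are covariances.
cov = np.cov(np.stack([x, y]))
```

Note that `np.cov` is symmetric by construction, matching cov[*x*, *y*] = cov[*y*, *x*].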