Expectations of Random Variables
If $P_X\left(x_i\right) = P\left(X = x_i\right) = p_i$, then the expectation of a discrete random variable $X$ is defined as:
$$ E[X] = \sum_{i=1}^{\infty} x_{i} \, p_{i}. $$For a continuous random variable $X$ with probability density function $f_X$, the expectation is given by
$$ E[X] = \int_{-\infty}^{\infty} x \, f_X(x) \, \mathrm{d}x. $$Let $Y = g(X)$. By the definition of expectation, the expectation of the random variable $Y$ is
$$ E[Y] = \int_{-\infty}^{\infty} y \, f_Y(y) \, \mathrm{d}y, $$but, by the law of the unconscious statistician, it can also be computed directly from the distribution of $X$, without first finding $f_Y$, via
$$ E[Y] = \int_{-\infty}^{\infty} g(x) \, f_X(x) \, \mathrm{d}x. $$
Conditional Expectations
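As a quick numerical check of the two equivalent expressions for $E[Y]$ above, here is a short Python sketch. The choices $X \sim \mathrm{Uniform}(0,1)$ and $g(x) = x^2$ are illustrative only (not from the text); then $Y = X^2$ has density $f_Y(y) = 1/(2\sqrt{y})$ on $(0,1]$, and both integrals equal $1/3$.

```python
import math

# Illustrative setup (my choice): X ~ Uniform(0, 1), g(x) = x**2.
# Then Y = X**2 has density f_Y(y) = 1 / (2 * sqrt(y)) on (0, 1].
dx = 1e-4
midpoints = [(i + 0.5) * dx for i in range(int(round(1 / dx)))]

# Route 1: E[Y] = integral of g(x) * f_X(x) dx, with f_X(x) = 1 on (0, 1).
e_via_x = sum(x ** 2 * dx for x in midpoints)

# Route 2: E[Y] = integral of y * f_Y(y) dy, using the density of Y itself.
e_via_y = sum(y / (2 * math.sqrt(y)) * dx for y in midpoints)

# Both midpoint Riemann sums approximate E[Y] = 1/3.
```

Route 1 never derives $f_Y$ at all, which is precisely the convenience of the second formula.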
The conditional expectation of $X$ given that an event $B$ has occurred is
$$ E \left[ X \, \vert \, B \right] = \int_{-\infty}^{\infty} x \, f_{X \, \vert \, B}\left( x \, \vert \, B \right) \, \mathrm{d}x. $$Let $X$ and $Y$ be discrete random variables with joint PMF $P_{X,Y}\left(x_i, y_j\right)$. Then the conditional expectation of $Y$ given $X=x_i$, denoted by $E\left[ Y \, \vert \, X=x_i \right]$, is
$$ E \left[ Y \, \vert \, X=x_i \right] = \sum_{j} y_j \, P_{Y \vert X}\left( y_j \, \vert \, x_i \right), $$where the conditional probability that $\left\{ Y = y_j \right\}$ occurs given $\left\{ X=x_i \right\}$ has occurred is given by
$$ P_{Y \vert X}\left( y_j \, \vert \, x_i \right) = \dfrac{P_{X,Y}\left(x_i, y_j\right)}{P_X \left(x_i \right)}. $$Let $X$ and $Y$ be continuous random variables with joint PDF $f_{XY}\left(x, y\right)$. The conditional PDF of $Y$ given $X=x$ is defined by
$$ f_{Y \vert X} \left( y \, \vert \, x \right) = \dfrac{f_{XY}\left(x,y\right)}{ f_X ( x ) }, $$wherever $f_X(x) \ne 0$. Then the conditional expectation of $Y$ given $X=x$ is
$$ E \left[ Y \, \vert \, X=x \right] = \int_{-\infty}^{\infty} y \, f_{Y \vert X}\left( y \, \vert \, x \right) \, \mathrm{d}y. $$
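Both conditional-expectation formulas can be sketched numerically. In the Python sketch below, the joint PMF table and the joint density are my own illustrative choices, not from the text: for the continuous part, $f_{XY}(x,y) = 2$ on the triangle $0 < y < x < 1$ gives $f_X(x) = 2x$ and $f_{Y \vert X}\left( y \,\vert\, x \right) = 1/x$ on $(0, x)$, so $E\left[ Y \,\vert\, X=x \right] = x/2$ in closed form.

```python
# --- Discrete: E[Y | X = x_i] from a joint PMF (illustrative table). ---
joint_pmf = {  # (x_i, y_j) -> P(X = x_i, Y = y_j); entries sum to 1
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

def cond_exp_discrete(joint, x):
    """E[Y | X = x] = sum_j y_j * P_{X,Y}(x, y_j) / P_X(x)."""
    p_x = sum(p for (xi, _), p in joint.items() if xi == x)  # marginal P_X(x)
    return sum(y * p / p_x for (xi, y), p in joint.items() if xi == x)

e_given_0 = cond_exp_discrete(joint_pmf, 0)  # equals 0.20 / 0.30 = 2/3

# --- Continuous: f_XY(x, y) = 2 on the triangle 0 < y < x < 1, so that ---
# f_{Y|X}(y|x) = 1/x on (0, x). Midpoint Riemann sum of y * f_{Y|X}(y|x) dy.
def cond_exp_continuous(x, dy=1e-5):
    n = int(round(x / dy))
    return sum((j + 0.5) * dy * (1.0 / x) * dy for j in range(n))

e_given_06 = cond_exp_continuous(0.6)  # close to 0.3, matching x/2
```

The discrete helper just implements the two displayed formulas in sequence: first the marginal $P_X(x_i)$, then the weighted sum over the conditional PMF.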