Definition 4.3.1.
Given a random variable \(X:\Omega\to \mathbb{R}\) on a finite probability space \((\Omega,P)\text{,}\) the expected value of \(X\) is defined to be
\begin{equation*}
E[X]=\sum_{x\in \Omega} P(x)X(x) \, .
\end{equation*}
Intuitively, the expected value of \(X\) is the long-run average value of \(X\) when we repeatedly select an outcome from \(\Omega\) according to the probability distribution \(P\text{.}\)
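The definition can be sketched directly as a weighted sum. The example below is an illustration not taken from the text: it assumes \(\Omega=\{1,\dots,6\}\) for a fair six-sided die with the uniform distribution, and takes \(X\) to be the face value.

```python
# Sketch of Definition 4.3.1: E[X] = sum over x in Omega of P(x) * X(x).
# Assumed example: a fair six-sided die, X(x) = x (the face value).
omega = [1, 2, 3, 4, 5, 6]           # the finite sample space
P = {x: 1 / 6 for x in omega}        # uniform probability distribution
X = {x: x for x in omega}            # random variable: the face value

expected_value = sum(P[x] * X[x] for x in omega)
print(expected_value)  # 3.5
```

With the uniform distribution this reduces to the ordinary arithmetic mean of the values of \(X\); a non-uniform \(P\) would weight each value accordingly.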
