Probabilistic vs. Stochastic

Definitions

Stochastic processes

Formal definitions from "Stochastic Processes - A Survey of the Mathematical Theory" and Wikipedia:

A stochastic process is defined as a collection of random variables defined on a common probability space $(\Omega ,{\mathcal {F}},P)$, where $\Omega$ is a sample space, ${\mathcal {F}}$ is a $\sigma$-algebra, and $P$ is a probability measure; and the random variables, indexed by some set $T$, all take values in the same mathematical space $S$, which must be measurable with respect to some $\sigma$-algebra $\Sigma$.

In other words, for a given probability space $(\Omega ,{\mathcal {F}},P)$ and a measurable space $(S,\Sigma )$, a stochastic process is a collection of $S$-valued random variables, which can be written as $\{X(t) : t \in T\}$ - a collection of random variables indexed by $T$.

Indexed sets $T$ are usually related to time:

  • if $T$ is a set of integers, we have a discrete-time stochastic process and can write it as $X_n$. DTMCs and MDPs are examples.
  • if $T$ is a set of real numbers, we have a continuous-time stochastic process and can write it as $X_t$. CTMCs are examples.
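As an illustration of the discrete-time case, a symmetric random walk is a stochastic process indexed by the non-negative integers. A minimal Python sketch (the walk itself is an illustrative example, not from the sources above):

```python
import random

def random_walk(n_steps, seed=None):
    """One sample path of a symmetric random walk X_0, X_1, ..., X_n:
    a discrete-time stochastic process indexed by T = {0, 1, 2, ...}."""
    rng = random.Random(seed)
    x = 0
    path = [x]
    for _ in range(n_steps):
        x += rng.choice([-1, 1])  # each step is a random variable valued in S = Z
        path.append(x)
    return path

# One realisation omega |-> (X_n(omega)); a different seed gives another path.
print(random_walk(10, seed=42))
```

Each call samples one point $\omega$ of the underlying sample space and returns the corresponding trajectory $(X_n(\omega))_{n \in T}$.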

Markov Chains

A definition from "Probability - An Introduction" by Geoffrey Grimmett and Dominic Welsh.

Informal:

A stochastic process is said to have the ‘Markov property’ if, conditional on its present value, its future is independent of its past.

The sequence $X$ is a Markov chain if, conditional on the present value $X_n$, the future ($X_r : r > n$) is independent of the past ($X_m : m < n$).

Formal definition:

Definition 12.1 A sequence $X$ of random variables is called a Markov chain if it satisfies the Markov property $$\begin{align*} & \forall n \geq 0 \land i_0, i_1, ..., i_{n+1} \in S \bullet \\ & \qquad \mathbb{P}\left(X_{n+1}=i_{n+1} | X_0 = i_0, X_1 = i_1, ..., X_n=i_n\right) \\ & \qquad = \mathbb{P}\left(X_{n+1}=i_{n+1} | X_n=i_n\right) \end{align*}$$ where $S$ is a countable set called the state space.

A Markov chain needs two quantities $\left(\lambda, P\right)$ for probability calculation:

  • the transition probability matrix $P: S \times S \to [0,1]$
  • the initial distribution $\lambda: S \to [0,1]$ where $\forall i \in S \bullet \lambda_i = \mathbb{P}\left(X_0=i\right)$.

Additionally,

  • $P$ is a stochastic matrix: $\forall i, j \in S \bullet P(i, j) \geq 0$ and $\forall i \in S \bullet \sum_{j\in S} P(i, j)=1$.
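The two quantities $(\lambda, P)$ and the stochastic-matrix condition can be sketched in Python; the two-state chain below is a made-up example, not taken from any of the cited sources:

```python
import random

def is_stochastic(P, tol=1e-9):
    """Check the stochastic-matrix condition: every entry is non-negative
    and every row sums to 1."""
    return all(
        all(p >= 0 for p in row) and abs(sum(row) - 1.0) < tol
        for row in P
    )

def simulate_dtmc(lam, P, n_steps, seed=None):
    """Sample X_0 from the initial distribution lam, then repeatedly
    sample X_{k+1} from row P[X_k] (the Markov property: only the
    present state matters)."""
    rng = random.Random(seed)
    states = list(range(len(lam)))
    x = rng.choices(states, weights=lam)[0]
    path = [x]
    for _ in range(n_steps):
        x = rng.choices(states, weights=P[x])[0]
        path.append(x)
    return path

P = [[0.9, 0.1],
     [0.5, 0.5]]
lam = [1.0, 0.0]      # start in state 0 with probability 1
assert is_stochastic(P)
print(simulate_dtmc(lam, P, 5, seed=0))
```

Note that `is_stochastic` would reject a matrix with a row summing to less than 1, which corresponds exactly to a state lacking enough outgoing transition probability.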

This is the exact meaning of our description in the paper “Probabilistic modelling and verification using RoboChart and PRISM”:

“NFM-1 is needed due to the fact that DTMCs and MDPs are stochastic. Any state in a PRISM model has to have at least one outgoing transition.”

For this (stochastic) reason, each row of $P$ must sum to 1, and so every state must have at least one outgoing transition.

Alternative semantics

A DTMC is a Kripke structure where states are labelled with atomic propositions and each transition has an associated discrete probability. A Kripke structure is a quadruple $M = \left(S, I, R, L\right)$. In particular, $R$ is left-total: $$\forall s \in S \bullet \exists s' \in S \bullet (s, s') \in R$$ For the DTMC, $R$ becomes $P$ - a stochastic transition probability matrix, as discussed above.
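Left-totality of $R$ is easy to check mechanically. A minimal Python sketch, using a made-up two-state structure:

```python
def is_left_total(S, R):
    """R, a subset of S x S, is left-total iff every state has at least
    one successor - the relational analogue of every row of P summing to 1."""
    return all(any((s, t) in R for t in S) for s in S)

S = {"s0", "s1"}
R = {("s0", "s1"), ("s1", "s1")}   # s1 has a self-loop, so R is left-total
print(is_left_total(S, R))         # True
```

Dropping the self-loop `("s1", "s1")` would leave `s1` without a successor, and the check would fail, mirroring a deadlocked state in a PRISM model.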

Some other related discussions about probabilistic and stochastic

Probabilistic choice and stochastic timing

Probabilistic choice and stochastic timing from “Extensions of Statecharts with Probability, Time, and Stochastic Timing” by David Jansen

In this dissertation, we concentrate on two forms of randomness: In probabilistic choice, it is uncertain which possibility out of a discrete set the system will take. A probability space over this set describes how likely the single possibilities are. Throwing a die or tossing a coin are examples of probabilistic choice experiments. Probabilistic choice is typically used in randomised algorithms.

In stochastic timing, it is uncertain how long a specific process will take. The timing is then described by a probability space over the (nonnegative) real numbers. Stochastic timing is typically used when evaluating quality of service or varying workload. For example, how long somebody phones typically varies with time; a stochastic description can serve to predict the probability of his future behaviour. Figure 1.2 shows a possible distribution of phone calls over time.
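The two forms of randomness can be contrasted in a short Python sketch; the fair die and the exponential call-duration rate are illustrative assumptions, not values from the dissertation:

```python
import random

rng = random.Random(1)

# Probabilistic choice: a discrete probability space over a finite set,
# e.g. throwing a fair six-sided die.
die = rng.choices([1, 2, 3, 4, 5, 6], weights=[1 / 6] * 6)[0]

# Stochastic timing: a probability space over the non-negative reals,
# e.g. phone-call durations modelled as exponential with rate 0.5 per minute.
duration = rng.expovariate(0.5)

print(die, round(duration, 2))
```

The first draw lands in a discrete set; the second is a non-negative real, matching the distinction drawn above.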

DTMCs and CTMCs

Probabilistic choice (in DTMCs) and continuous real time with probabilistic choice (in CTMCs), from "Symbolic model checking for probabilistic timed automata".

DTMCs admit probabilistic choice, in the sense that one can specify the probability of making a transition from one state to another.

CTMCs, frequently used in performance analysis, model continuous real time and probabilistic choice: one can specify the rate of making a transition from one state to another. Probabilistic choice, in this model, arises through race conditions when two or more transitions in a state are enabled.

  • Transitions can occur at any (real-valued) time instant,
  • Transition rate $R(s, s')$ is a parameter of the exponential distribution,
  • A transition can only occur if its rate is greater than 0 and, in this case, the probability of this transition being triggered within $t$ time units equals $1 - e^{-R(s,s') \cdot t}$.
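The three points above can be sketched in Python: the triggering probability within $t$ time units, and the race between enabled transitions where the fastest sampled delay wins. The rates used are made-up values:

```python
import math
import random

def prob_within(rate, t):
    """Probability that a transition with rate R(s, s') > 0 fires
    within t time units: 1 - exp(-R(s, s') * t)."""
    return 1.0 - math.exp(-rate * t)

def sample_winner(rates, seed=None):
    """Race condition: sample an exponentially distributed delay for each
    enabled transition (rate > 0); the transition with the smallest
    sampled delay is the one taken."""
    rng = random.Random(seed)
    delays = {s: rng.expovariate(r) for s, r in rates.items() if r > 0}
    winner = min(delays, key=delays.get)
    return winner, delays[winner]

print(prob_within(2.0, 1.0))                       # 1 - e^{-2}, about 0.865
print(sample_winner({"s1": 2.0, "s2": 0.5}, seed=3))
```

The race in `sample_winner` is what turns the rates into a probabilistic choice: over many samples, the higher-rate transition wins more often.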
Kangfeng (Randall) Ye
Research Associate (Computer Science)

My research interests include probabilistic modelling and verification using formal specification and verification (both model checking and theorem proving) and model-based engineering.