In probability, a discrete-time Markov chain (DTMC) is a sequence of random variables, known as a stochastic process, in which the value of the next variable...
25 KB (4,252 words) - 09:10, 10 June 2025
transition is to be made, the process moves according to the jump chain, a discrete-time Markov chain with stochastic matrix:
[  0   1/2  1/2 ]
[ 1/3   0   2/3 ]
[ 5/6  1/6   0  ] ...
23 KB (4,240 words) - 02:41, 27 June 2025
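The snippet above gives the jump chain of a continuous-time Markov chain as a 3×3 stochastic matrix. A minimal sketch of simulating that jump chain (assuming states are labeled 0, 1, 2; the simulation loop and seed are illustrative, not from the article):

```python
import random

# Jump-chain transition matrix from the snippet above (rows sum to 1).
P = [
    [0,   1/2, 1/2],
    [1/3, 0,   2/3],
    [5/6, 1/6, 0],
]

def step(state: int) -> int:
    """Sample the next state by inverting the cumulative row distribution."""
    u = random.random()
    cumulative = 0.0
    for next_state, p in enumerate(P[state]):
        cumulative += p
        if u < cumulative:
            return next_state
    return len(P) - 1  # guard against floating-point rounding

random.seed(0)
path = [0]
for _ in range(10):
    path.append(step(path[-1]))
```

Because the diagonal of a jump-chain matrix is zero, the simulated path never records a self-transition; the holding times of the original continuous-time chain are ignored here, matching the jump-chain construction the snippet describes.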
the chain moves state at discrete time steps, gives a discrete-time Markov chain (DTMC). A continuous-time process is called a continuous-time Markov chain...
96 KB (12,900 words) - 18:23, 29 July 2025
X_n in the Markov renewal process is a discrete-time Markov chain. In other words, if the time variables are ignored in the Markov renewal process...
4 KB (834 words) - 02:10, 13 July 2023
Absorbing Markov chain Continuous-time Markov chain Discrete-time Markov chain Nearly completely decomposable Markov chain Quantum Markov chain Telescoping...
2 KB (229 words) - 07:10, 17 June 2024
In statistics, Markov chain Monte Carlo (MCMC) is a class of algorithms used to draw samples from a probability distribution. Given a probability distribution...
63 KB (8,546 words) - 17:14, 28 July 2025
Markov chains, there can be continuous-time absorbing Markov chains with an infinite state space. However, this article concentrates on the discrete-time...
12 KB (1,762 words) - 11:26, 30 December 2024
additive Markov chain is a Markov chain with an additive conditional probability function. Here the process is a discrete-time Markov chain of order m...
4 KB (785 words) - 13:55, 6 February 2023
Stationary distribution (category Time series)
Discrete-time Markov chain § Stationary distributions and continuous-time Markov chain § Stationary distribution, a special distribution for a Markov...
2 KB (253 words) - 00:29, 19 June 2024
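The stationary distribution mentioned above is a distribution π with πP = π. A minimal sketch of approximating it by power iteration (the two-state matrix is a hypothetical example, not from the article):

```python
def stationary_distribution(P, iterations=200):
    """Approximate the equilibrium distribution of a DTMC by
    repeatedly multiplying a distribution by the transition matrix P."""
    n = len(P)
    pi = [1.0 / n] * n  # start from the uniform distribution
    for _ in range(iterations):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Hypothetical two-state chain; row-stochastic P.
P = [[0.9, 0.1],
     [0.5, 0.5]]
pi = stationary_distribution(P)
# For this P, solving pi P = pi by hand gives pi = (5/6, 1/6).
```

Power iteration converges here because the chain is irreducible and aperiodic; for chains without a unique stationary distribution the limit depends on the starting vector.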
Ising model. A discrete-time stochastic process satisfying the Markov property is known as a Markov chain. A stochastic process has the Markov property if...
8 KB (1,124 words) - 20:27, 8 March 2025
examples of Markov chains and Markov processes in action. All examples are in the countable state space. For an overview of Markov chains in general state...
14 KB (2,405 words) - 06:52, 29 July 2025
be discrete-time stochastic processes and n ≥ 1. The pair (X_n, Y_n) is a hidden Markov model...
52 KB (6,811 words) - 07:33, 3 August 2025
equation. The Leslie model is very similar to a discrete-time Markov chain. The main difference is that in a Markov model, one would have f_x + s_x = 1...
7 KB (1,224 words) - 21:05, 14 April 2025
from its connection to Markov chains, a concept developed by the Russian mathematician Andrey Markov. The "Markov" in "Markov decision process" refers...
35 KB (5,169 words) - 20:19, 22 July 2025
and congestion collapse. To understand stability, Lam created a discrete-time Markov chain model for analyzing the statistical behaviour of slotted ALOHA...
23 KB (3,343 words) - 00:53, 16 July 2025
Kolmogorov's criterion (category Markov processes)
and sufficient condition for a Markov chain or continuous-time Markov chain to be stochastically identical to its time-reversed version. The theorem states...
4 KB (861 words) - 17:10, 21 June 2024
Markov chain on a measurable state space is a discrete-time-homogeneous Markov chain with a measurable space as state space. The definition of Markov...
5 KB (1,000 words) - 02:43, 6 July 2025
theory Petri net theory Discrete event system specification Boolean differential calculus Markov chain Queueing theory Discrete-event simulation Concurrent...
1 KB (128 words) - 14:14, 11 May 2025
from discretizing the time-series to hidden Markov-models combined with wavelets and the Markov-chain mixture distribution model (MCM). Markov chain Monte...
10 KB (1,234 words) - 05:46, 7 July 2025
Uniformization (probability theory) (category Markov processes)
finite state continuous-time Markov chains, by approximating the process by a discrete-time Markov chain. The original chain is scaled by the fastest...
5 KB (608 words) - 14:39, 2 September 2024
Stochastic process (redirect from Discrete-time stochastic process)
definition of a Markov chain varies. For example, it is common to define a Markov chain as a Markov process in either discrete or continuous time with a countable...
168 KB (18,657 words) - 11:11, 30 June 2025
is a discrete phase-type distribution if it is the distribution of the first passage time to the absorbing state of a terminating Markov chain with finitely...
4 KB (596 words) - 19:49, 14 March 2025
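A discrete phase-type distribution, as the snippet above defines it, is the law of the first passage time into the absorbing state of a terminating chain. A minimal simulation sketch (the 3-state matrix, with state 2 absorbing, is a hypothetical example):

```python
import random

# Hypothetical terminating chain: states 0 and 1 are transient, state 2 absorbs.
P = [
    [0.5, 0.3, 0.2],
    [0.2, 0.5, 0.3],
    [0.0, 0.0, 1.0],  # absorbing state
]

def first_passage_time(start: int = 0) -> int:
    """Number of steps until the chain first enters the absorbing state 2."""
    state, steps = start, 0
    while state != 2:
        u, cumulative = random.random(), 0.0
        for nxt, p in enumerate(P[state]):
            cumulative += p
            if u < cumulative:
                state = nxt
                break
        steps += 1
    return steps

random.seed(0)
times = [first_passage_time() for _ in range(1000)]
```

The empirical distribution of `times` is a sample from the discrete phase-type distribution determined by the transient block of P and the starting state.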
Foster's theorem (category Markov processes)
state while starting from it within a finite time interval. Consider an irreducible discrete-time Markov chain on a countable state space S...
2 KB (274 words) - 20:59, 14 April 2025
pair of states i and j. A discrete-time Markov chain (DTMC) with transition matrix P and equilibrium...
8 KB (924 words) - 05:31, 12 January 2025
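For a DTMC with transition matrix P and equilibrium distribution π, reversibility is equivalent to the detailed-balance condition π_i P_ij = π_j P_ji for every pair of states i and j. A minimal check, using a hypothetical reversible birth-death chain (the matrix and its equilibrium are illustrative, not from the article):

```python
# Hypothetical reversible birth-death chain and its equilibrium distribution.
P = [
    [0.5,  0.5,  0.0],
    [0.25, 0.5,  0.25],
    [0.0,  0.5,  0.5],
]
pi = [0.25, 0.5, 0.25]

def is_reversible(P, pi, tol=1e-12):
    """Check detailed balance pi_i P_ij = pi_j P_ji for every pair (i, j)."""
    n = len(P)
    return all(abs(pi[i] * P[i][j] - pi[j] * P[j][i]) <= tol
               for i in range(n) for j in range(n))
```

A chain that cycles deterministically through its states fails this check even though it has a uniform equilibrium distribution, since probability flows around the cycle in one direction only.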
of discrete time Markov processes, which are described by the Chapman–Kolmogorov equation, and sought to derive a theory of continuous time Markov processes...
9 KB (1,438 words) - 22:49, 6 May 2025
including discrete-time Markov chains, continuous-time Markov chains, Markov decision processes and probabilistic extensions of the timed automata formalism...
3 KB (389 words) - 05:38, 18 October 2024
statistics and statistical physics, the Metropolis–Hastings algorithm is a Markov chain Monte Carlo (MCMC) method for obtaining a sequence of random samples...
30 KB (4,556 words) - 09:14, 9 March 2025
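A minimal sketch of the Metropolis–Hastings idea described above, in its random-walk special case with a symmetric Gaussian proposal (the target density, step size, and seed are illustrative assumptions):

```python
import math
import random

def metropolis_hastings(log_target, proposal_std, x0, n_samples):
    """Random-walk Metropolis sampler: propose a Gaussian step and accept
    with probability min(1, pi(candidate) / pi(current))."""
    x = x0
    samples = []
    for _ in range(n_samples):
        candidate = x + random.gauss(0.0, proposal_std)
        # Symmetric proposal, so the Hastings correction cancels.
        if math.log(random.random()) < log_target(candidate) - log_target(x):
            x = candidate
        samples.append(x)
    return samples

# Target: standard normal, via its unnormalized log-density -x^2 / 2.
random.seed(1)
draws = metropolis_hastings(lambda x: -0.5 * x * x, 1.0, 0.0, 5000)
```

Only the ratio of target densities enters the acceptance step, which is why the method works with unnormalized distributions; the samples form a Markov chain whose stationary distribution is the target.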
Chapman–Kolmogorov equation (category Markov processes)
the probability distribution on the state space of a Markov chain is discrete and the Markov chain is homogeneous, the Chapman–Kolmogorov equations can...
6 KB (996 words) - 23:23, 6 May 2025
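For a homogeneous chain with a discrete state space, the Chapman–Kolmogorov equations reduce to matrix multiplication of transition matrices: P^(m+n) = P^(m) P^(n). A minimal numeric check (the 2-state matrix is a hypothetical example):

```python
def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Hypothetical two-state transition matrix.
P = [[0.7, 0.3],
     [0.4, 0.6]]

P2 = matmul(P, P)      # two-step transition probabilities
P3_a = matmul(P2, P)   # P^(2) * P^(1)
P3_b = matmul(P, P2)   # P^(1) * P^(2)
# Chapman-Kolmogorov: both orderings give the same three-step matrix.
```

Entry (i, j) of P2 sums, over all intermediate states k, the probability of going i → k in one step and k → j in the next, which is exactly the Chapman–Kolmogorov sum in matrix form.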
(X, k), we can then construct a reversible discrete-time Markov chain on X (a process known as the normalized graph...
19 KB (2,482 words) - 16:25, 13 June 2025
any finite time interval. When M has a discrete distribution, the Markov state vector M_t takes finitely many...
12 KB (1,572 words) - 19:10, 26 September 2024