• theory of probability, an absorbing Markov chain is a Markov chain in which every state can reach an absorbing state. An absorbing state is a state that,... (a worked summary of the key formulas follows this list)
    12 KB (1,762 words) - 11:26, 30 December 2024
  • whose moves are determined entirely by dice is a Markov chain, indeed, an absorbing Markov chain. This is in contrast to card games such as blackjack, where...
    14 KB (2,405 words) - 17:52, 29 March 2025
  • Discrete-time Markov chain
    In probability, a discrete-time Markov chain (DTMC) is a sequence of random variables, known as a stochastic process, in which the value of the next variable...
    25 KB (4,252 words) - 18:52, 20 February 2025
  • Markov chain
    In probability theory and statistics, a Markov chain or Markov process is a stochastic process describing a sequence of possible events in which the probability...
    96 KB (12,900 words) - 21:01, 27 April 2025
  • process Absorbing Markov chain Continuous-time Markov chain Discrete-time Markov chain Nearly completely decomposable Markov chain Quantum Markov chain Telescoping...
    2 KB (229 words) - 07:10, 17 June 2024
  • Discrete phase-type distribution (category Markov models)
    the time until absorption of an absorbing Markov chain with one absorbing state. Each of the states of the Markov chain represents one of the phases. It... (the resulting distribution is sketched after this list)
    4 KB (596 words) - 19:49, 14 March 2025
  • Fundamental matrix (linear differential equation) Fundamental matrix (absorbing Markov chain) This disambiguation page lists articles associated with the title...
    207 bytes (51 words) - 15:25, 27 February 2022
  • importance for an active trader. Business and economics portal Absorbing Markov chain (used in mathematical finance to calculate risk of ruin) Asset allocation...
    8 KB (1,127 words) - 06:56, 12 April 2025
  • stochastic matrix is a square matrix used to describe the transitions of a Markov chain. Each of its entries is a nonnegative real number representing a probability...
    20 KB (2,959 words) - 14:55, 5 May 2025
  • COMAP/UMAP, 1983. U105, U109. Markov chains and applications of matrix methods : fixed point and absorbing Markov chains by Mary K Keller; Consortium for...
    15 KB (1,312 words) - 19:02, 28 March 2025
  • Abductive reasoning Absolute deviation Absolute risk reduction Absorbing Markov chain ABX test Accelerated failure time model Acceptable quality limit...
    87 KB (8,280 words) - 23:04, 12 March 2025
  • Snakes and ladders (category Markov models)
    version of snakes and ladders can be represented exactly as an absorbing Markov chain, since from any square the odds of moving to any other square are... (a numerical sketch of this reduction follows the list)
    23 KB (2,783 words) - 12:41, 31 March 2025
  • Trajectory inference
    terminal states and inferring cell-fate plasticity using a scalable Absorbing Markov chain model. Monocle first employs a differential expression test to reduce...
    16 KB (1,865 words) - 19:19, 9 October 2024
  • anemone-dwelling clownfish, and cavity-nesting birds. Society portal Markov chains Structural functionalism Pinfield, Lawrence (1995). The Operation of...
    8 KB (1,092 words) - 10:41, 8 May 2024
  • "centrality" and "diversity" in a unified mathematical framework based on absorbing Markov chain random walks (a random walk where certain states end the walk)....
    52 KB (6,825 words) - 16:27, 10 May 2025
  • Stochastic process
    scientists. Markov processes and Markov chains are named after Andrey Markov who studied Markov chains in the early 20th century. Markov was interested...
    168 KB (18,657 words) - 20:31, 17 May 2025
  • Arthur Engel (mathematician)
    could be used to determine the basic descriptive qualities of an absorbing Markov chain. The algorithm depended on recurrence of the initial distribution...
    16 KB (1,712 words) - 13:17, 25 August 2024
  • Seneta, E. (1965). "On Quasi-Stationary Distributions in Absorbing Discrete-Time Finite Markov Chains". Journal of Applied Probability. 2 (1): 88–100. doi:10...
    7 KB (1,226 words) - 02:02, 27 June 2024
  • sampled empirical measures. In contrast with traditional Monte Carlo and Markov chain Monte Carlo methods these mean-field particle techniques rely on sequential...
    60 KB (8,594 words) - 09:41, 15 December 2024
  • describing the time until absorption of a Markov process with one absorbing state. Each of the states of the Markov process represents one of the phases....
    18 KB (2,368 words) - 07:02, 28 October 2023
  • 0)&{\text{ if }}X(t)=0.\end{cases}}} The operator is a continuous time Markov chain and is usually called the environment process, background process or...
    23 KB (2,602 words) - 19:40, 22 November 2023
  • Markov additive process Markov blanket / Bay Markov chain mixing time / (L:D) Markov decision process Markov information source Markov kernel Markov logic...
    35 KB (3,026 words) - 12:15, 30 October 2023
  • Weighted automaton
    and are related to other probabilistic models such as Markov decision processes and Markov chains. Weighted automata have applications in natural language...
    14 KB (1,691 words) - 06:31, 14 April 2025
  • a finite state Markov process. If we have a k+1 state process, where the first k states are transient and the state k+1 is an absorbing state, then the...
    11 KB (1,758 words) - 10:15, 12 November 2024
  • Path dependence (category Markov models)
    but will instead reach one of several equilibria (sometimes known as absorbing states). This dynamic vision of economic evolution is very different from...
    37 KB (4,119 words) - 07:15, 2 May 2025
  • Dependability state model (category Markov models)
    dependability state diagram is a method for modelling a system as a Markov chain. It is used in reliability engineering for availability and reliability...
    3 KB (486 words) - 02:32, 26 December 2024
  • distribution which describes the first hit time of the absorbing state of a finite terminating Markov chain. The extended negative binomial distribution The...
    22 KB (2,620 words) - 07:59, 2 May 2025
  • Semigroup
    syntactic monoid. In probability theory, semigroups are associated with Markov processes. In other areas of applied mathematics, semigroups are fundamental...
    37 KB (4,714 words) - 00:02, 25 February 2025
  • Voter model
    coalescing Markov chains. Frequently, these problems will then be reduced to others involving independent Markov chains. A voter model is a (continuous...
    23 KB (4,457 words) - 16:33, 26 November 2024
  • Carlo method: Direct simulation Monte Carlo Quasi-Monte Carlo method Markov chain Monte Carlo Metropolis–Hastings algorithm Multiple-try Metropolis — modification...
    70 KB (8,335 words) - 20:20, 17 April 2025
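Several of the entries above (the absorbing Markov chain definition, the fundamental matrix disambiguation, and the k+1-state construction with k transient states) revolve around the same piece of linear algebra. As a hedged summary of the standard textbook treatment, not of any single article's notation: ordering the states so that the t transient states come first and the r absorbing states last puts the transition matrix in canonical form, from which the fundamental matrix, expected absorption times, and absorption probabilities all follow.

```latex
% Canonical form: Q = transient-to-transient block, R = transient-to-absorbing block
P =
\begin{pmatrix}
  Q & R \\
  \mathbf{0} & I_r
\end{pmatrix},
\qquad
N = (I_t - Q)^{-1} = \sum_{k=0}^{\infty} Q^{k},
\qquad
\mathbf{t} = N\,\mathbf{1},
\qquad
B = N\,R
```

Here the entry of N in row i and column j is the expected number of visits to transient state j when starting from transient state i, t gives the expected number of steps before absorption, and B collects the probabilities of ending in each absorbing state.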
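The discrete phase-type distribution entries describe the law of the absorption time itself. As a hedged sketch of the usual parametrisation (an initial row vector τ over the m transient states and the m×m sub-stochastic matrix T of transitions among them, assuming the chain starts in a transient state), the absorption time N satisfies

```latex
\Pr(N = n) = \boldsymbol{\tau}\, T^{\,n-1}\, \mathbf{t}^{0},
\quad n = 1, 2, \dots,
\qquad
\mathbf{t}^{0} = (I - T)\,\mathbf{1},
\qquad
\mathbb{E}[N] = \boldsymbol{\tau}\,(I - T)^{-1}\mathbf{1}
```

where t⁰ is the column vector of one-step absorption probabilities. The continuous-time analogue in the phase-type entry replaces T by a sub-generator and the matrix power by a matrix exponential.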
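The snakes-and-ladders and dice-game entries note that a board game driven entirely by dice is an absorbing Markov chain, so its expected length reduces to the fundamental-matrix formulas above. A minimal numerical sketch in Python, using a hypothetical five-state gambler's-ruin walk rather than an actual board (the states, probabilities, and variable names are illustrative assumptions, not taken from any of the articles above):

```python
import numpy as np

# Hypothetical toy chain: a symmetric random walk on states 0..4 in which
# 0 and 4 are absorbing.  Transient states: 1, 2, 3.
Q = np.array([[0.0, 0.5, 0.0],    # transitions among the transient states
              [0.5, 0.0, 0.5],
              [0.0, 0.5, 0.0]])
R = np.array([[0.5, 0.0],         # transient -> absorbing (columns: state 0, state 4)
              [0.0, 0.0],
              [0.0, 0.5]])

# Canonical-form transition matrix; every row sums to 1 (a stochastic matrix).
P = np.block([[Q, R],
              [np.zeros((2, 3)), np.eye(2)]])
assert np.allclose(P.sum(axis=1), 1.0)

N = np.linalg.inv(np.eye(3) - Q)  # fundamental matrix: expected visit counts
t = N @ np.ones(3)                # expected steps before absorption
B = N @ R                         # probability of ending in each absorbing state

print(t)     # [3. 4. 3.]  -- e.g. 4 expected steps when starting from state 2
print(B[1])  # [0.5 0.5]   -- from state 2, each boundary is reached with prob. 1/2
```

The row-sum assertion is exactly the defining property of a stochastic matrix from the entry above, and the printed values agree with the classical gambler's-ruin answers (expected duration k(4 − k) and ruin probability 1 − k/4 when starting from state k).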