Concept information
Preferred term
Markov process
Definition
A Markov chain or Markov process is a stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event. Informally, this may be thought of as, "What happens next depends only on the state of affairs now." A countably infinite sequence, in which the chain moves state at discrete time steps, gives a discrete-time Markov chain (DTMC). A continuous-time process is called a continuous-time Markov chain (CTMC). It is named after the Russian mathematician Andrey Markov.
(Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/wiki/Markov_chain)
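The definition above describes a discrete-time Markov chain (DTMC): the next state is sampled using only the current state. A minimal Python sketch of that idea, using an illustrative two-state "weather" chain (the state names and transition probabilities are invented for the example, not part of this vocabulary entry):

```python
import random

# Hypothetical two-state chain: P[i][j] is the probability of
# moving from state i to state j; each row sums to 1.
STATES = ["sunny", "rainy"]
P = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(state, rng):
    """Sample the next state; it depends only on the current state."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in P[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding at the boundary

def simulate(start, n, seed=0):
    """Run the chain for n steps from a given start state."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n):
        chain.append(step(chain[-1], rng))
    return chain

print(simulate("sunny", 5))
```

The Markov property shows up in `step`: it receives only the current state, never the history of earlier states. A continuous-time Markov chain (CTMC) would instead draw exponentially distributed holding times before each jump.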
Broader concept
Narrower concepts
Alternative labels
- Markov chain
In other languages
- French: chaîne de Markov
URI
http://data.loterre.fr/ark:/67375/PSR-KQDNC91K-9