- n. probability theory A discrete-time stochastic process with the Markov property.
GNU Webster's 1913
- n. (Statistics) A random process (Markov process) in which the probabilities of discrete states in a series depend only on the properties of the immediately preceding state, independent of the path by which the preceding state was reached. It differs from the more general Markov process in that the states of a Markov chain are discrete rather than continuous. Certain physical processes, such as diffusion of a molecule in a fluid, are modelled as a Markov chain. See also random walk.
- n. a Markov process for which the time parameter takes discrete values
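The definitions above can be illustrated with a small simulation. This is a minimal sketch, not part of any dictionary entry: a two-state chain (the state names "sunny" and "rainy" and the transition probabilities are invented for illustration) in which the next state is drawn using only the current state's transition row, matching the property that the future depends only on the immediately preceding state.

```python
import random

# Hypothetical two-state Markov chain: each row gives the transition
# probabilities out of one discrete state. Rows sum to 1.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state):
    """Pick the next state using only the current state's row --
    the path by which we reached `state` is irrelevant (Markov property)."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state]:
        cumulative += p
        if r < cumulative:
            return nxt
    return TRANSITIONS[state][-1][0]  # guard against float rounding

def walk(start, n, seed=0):
    """Simulate n discrete time steps from `start`, returning the
    full sequence of visited states (a realization of the chain)."""
    random.seed(seed)
    states = [start]
    for _ in range(n):
        states.append(step(states[-1]))
    return states
```

For example, `walk("sunny", 10)` returns a list of 11 states; because the chain is discrete in both time and state space, the result is simply a sequence drawn from the fixed state set.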