Definitions

from Wiktionary, Creative Commons Attribution/Share-Alike License

  • n. A discrete-time stochastic process with the Markov property.

from the GNU version of the Collaborative International Dictionary of English

  • n. A random process (Markov process) in which the probabilities of discrete states in a series depend only on the immediately preceding state, independent of the path by which that state was reached. It differs from the more general Markov process in that the states of a Markov chain are discrete rather than continuous. Certain physical processes, such as the diffusion of a molecule in a fluid, are modelled as Markov chains. See also random walk.
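The Markov property described above — the next state depends only on the current state, not on the path taken to reach it — can be sketched with a small simulation. The two-state weather chain below is a hypothetical example chosen for illustration; the state names and probabilities are not from the source.

```python
import random

def simulate_markov_chain(transition, start, steps, rng=None):
    """Simulate a discrete-state Markov chain.

    `transition` maps each state to a dict of next-state probabilities.
    Each step samples the next state using only the current state
    (the Markov property): the earlier path is never consulted.
    """
    rng = rng or random.Random()
    state = start
    path = [state]
    for _ in range(steps):
        states = list(transition[state])
        weights = [transition[state][s] for s in states]
        state = rng.choices(states, weights=weights)[0]
        path.append(state)
    return path

# Hypothetical two-state chain (illustrative only):
chain = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

path = simulate_markov_chain(chain, "sunny", 10, rng=random.Random(0))
```

A random walk, mentioned in the definition, is the special case where the states are positions and each transition moves to a neighbouring position with fixed probability.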

from WordNet 3.0 Copyright 2006 by Princeton University. All rights reserved.

  • n. a Markov process for which the parameter is discrete time values


