Definitions

from The American Heritage® Dictionary of the English Language, 5th Edition.

  • adjective Of or relating to a probabilistic model in which predictions are made based only on the current state of the system, and not on any previous states.

from Wiktionary, Creative Commons Attribution/Share-Alike License.

  • adjective (statistics, of a process) Exhibiting the Markov property, in which the conditional probability distribution of future states of the process, given the present state and all past states, depends only upon the present state and not on any past states.
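As an illustration (not part of the dictionary text), the Markov property described above is commonly written, for a discrete-time process X_0, X_1, X_2, …, as the conditional-independence statement

  P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \dots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n)

that is, conditioning on the entire history gives the same distribution for the next state as conditioning on the present state alone.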

from WordNet 3.0 Copyright 2006 by Princeton University. All rights reserved.

  • adjective relating to or generated by a Markov process
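For readers who prefer code, the short Python sketch below illustrates a process "generated by a Markov process" in the sense of the definitions above. It is a hypothetical example: the two states and their transition probabilities are illustrative assumptions, not taken from any of the entries. The point is that the next state is sampled using only the current state, never the earlier history.

  import random

  # Hypothetical two-state chain; states and probabilities are made up for illustration.
  TRANSITIONS = {
      "sunny": {"sunny": 0.8, "rainy": 0.2},
      "rainy": {"sunny": 0.4, "rainy": 0.6},
  }

  def step(state):
      # The next-state distribution depends only on `state` (the Markov property).
      probs = TRANSITIONS[state]
      return random.choices(list(probs), weights=list(probs.values()))[0]

  def simulate(start, n):
      # Generate a path of n steps; nothing earlier than the current state is consulted.
      path = [start]
      for _ in range(n):
          path.append(step(path[-1]))
      return path

  print(simulate("sunny", 10))  # e.g. ['sunny', 'sunny', 'rainy', ...]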

Etymologies

from The American Heritage® Dictionary of the English Language, 4th Edition

[After Andrei Andreyevich Markov (1856–1922), Russian mathematician.]

from Wiktionary, Creative Commons Attribution/Share-Alike License

Markov +‎ -ian, named for the Russian mathematician Andrey Markov.
