Definitions

from Wiktionary, Creative Commons Attribution/Share-Alike License

  • adj. Exhibiting the Markov property, in which the conditional probability distribution of future states of the process, given the present state and all past states, depends only upon the present state and not on any past states.
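The Markov property described above can be illustrated with a minimal sketch: a two-state weather chain in which the next state is sampled from a distribution that depends only on the current state, never on the history. The states and transition probabilities here are purely illustrative assumptions, not part of the definition.

```python
import random

# Illustrative transition probabilities: P(next state | current state).
# Only the current state is consulted; the path taken to reach it is irrelevant.
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Sample the next state given only the current state (the Markov property)."""
    options = list(transitions[state])
    weights = [transitions[state][s] for s in options]
    return rng.choices(options, weights=weights)[0]

rng = random.Random(0)  # seeded for reproducibility
state = "sunny"
path = [state]
for _ in range(5):
    state = step(state, rng)
    path.append(state)
print(path)
```

Note that `step` receives only `state`, not `path`: conditioning on the full history would give the same distribution, which is exactly what the definition requires.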

from WordNet 3.0 Copyright 2006 by Princeton University. All rights reserved.

  • adj. relating to or generated by a Markov process

Etymologies

Markov +‎ -ian, named for the Russian mathematician Andrey Markov. (Wiktionary)
