Markov jump process

Definitions

from Wiktionary, Creative Commons Attribution/Share-Alike License.

  • noun mathematics A time-dependent random variable that starts in an initial state and stays in that state for a random holding time, after which it makes a transition to another random state, and so on.
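The definition above can be sketched as a small simulation: holding times are exponentially distributed with rate equal to the total outgoing rate of the current state, and the next state is drawn with probability proportional to its rate. This is a minimal illustration, assuming a finite state space given as a nested-list rate matrix; the function name and parameters are illustrative, not from any particular library.

```python
import random

def simulate_markov_jump(rates, state, t_max, rng=None):
    """Simulate a Markov jump process (continuous-time Markov chain).

    rates[i][j] is the transition rate from state i to state j (i != j).
    Returns the list of (time, state) pairs visited up to time t_max.
    """
    rng = rng or random.Random(0)
    t = 0.0
    path = [(t, state)]
    while True:
        # outgoing transitions from the current state
        out = [(j, r) for j, r in enumerate(rates[state]) if j != state and r > 0]
        total = sum(r for _, r in out)
        if total == 0:
            break                     # absorbing state: stays there forever
        t += rng.expovariate(total)   # random exponential holding time
        if t >= t_max:
            break
        # choose the next state with probability proportional to its rate
        u = rng.random() * total
        for j, r in out:
            u -= r
            if u <= 0:
                state = j
                break
        path.append((t, state))
    return path
```

For example, `simulate_markov_jump([[0, 1.0], [2.0, 0]], 0, 10.0)` traces a two-state chain that alternates between states 0 and 1, spending (on average) longer in state 0 because its outgoing rate is smaller.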
