
In probability theory and related fields, a Markov process, named after the Russian mathematician Andrey Markov, is a stochastic process that satisfies the Markov property[1][3][4] (sometimes characterized as "memorylessness"). Roughly speaking, a process satisfies the Markov property if one can make predictions for the future of the process based solely on its present state just as well as one could knowing the process's full history; that is, conditional on the present state of the system, its future and past states are independent.
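For a discrete-time process with states X_0, X_1, X_2, ..., the Markov property described above is commonly written as the following conditional-probability statement (a standard formulation, not taken verbatim from this text):

```latex
\Pr(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0)
  = \Pr(X_{n+1} = x \mid X_n = x_n)
```

In words: once the present state X_n is known, conditioning on the earlier states X_0, ..., X_{n-1} changes nothing about the distribution of the next state.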

A Markov chain is a type of Markov process that has either a discrete state space or a discrete index set (often representing time), but the precise definition of a Markov chain varies.[5] For example, it is common to define a Markov chain as a Markov process in either discrete or continuous time with a countable state space (thus regardless of the nature of time),[6][7][8][9] but it is also common to define a Markov chain as having discrete time in either countable or continuous state space (thus regardless of the state space).[5]
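The most common case mentioned above, a discrete-time Markov chain on a countable state space, can be sketched in a few lines of code. The two-state "weather" model below, with its state names and transition probabilities, is purely an illustrative assumption; the essential point is that the `step` function consults only the current state, never the history.

```python
import random

# Illustrative two-element state space (hypothetical example, not from the text).
STATES = ["sunny", "rainy"]

# TRANSITION[i][j] = P(next state = j | current state = i).
# Each row is a probability distribution, so it must sum to 1.
TRANSITION = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(state, rng=random):
    """Sample the next state given only the current state.

    This is the Markov property in code: no past states are consulted.
    """
    r = rng.random()
    cumulative = 0.0
    for nxt, p in TRANSITION[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point round-off in the row sum

def simulate(start, n, seed=0):
    """Run the chain for n steps from `start`, returning the full trajectory."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n):
        chain.append(step(chain[-1], rng))
    return chain
```

Because each call to `step` depends only on its `state` argument, the simulated trajectory automatically satisfies the conditional-independence property described earlier, regardless of how the chain arrived at its current state.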