Markovian

mathematics: Markovian (of a process)

English-Russian Scientific and Technical Dictionary

Markovian

or Markov; also Markoff. adjective: of, relating to, or resembling a Markov process or Markov chain, especially by having probabilities defined in terms of transition from the possible existing states to other states
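As an illustration of the property named in the definition (not part of the dictionary entry), a minimal sketch of the Markov condition for a discrete-time chain, with a hypothetical two-state transition matrix chosen only as an example:

```latex
% Markov property: the next state's distribution depends only on the current state.
\[
  \Pr(X_{n+1}=j \mid X_n=i,\, X_{n-1},\dots,X_0)
  \;=\; \Pr(X_{n+1}=j \mid X_n=i)
  \;=\; p_{ij}
\]
% Hypothetical two-state example: each row of the transition matrix sums to 1.
% (pmatrix requires the amsmath package.)
\[
  P = \begin{pmatrix} 0.9 & 0.1 \\ 0.4 & 0.6 \end{pmatrix}
\]
```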

Merriam-Webster's Encyclopedic Dictionary