Markov chain

Definition
Sequence of stochastic events (based on probabilities instead of certainties) where the current state of a variable or system is independent of all past states, except the current (present) state. Movements of stock/share prices, and growth or decline in a firm's market share, are examples of Markov chains. Named after the inventor of Markov analysis, the Russian mathematician Andrei Andreevich Markov (1856-1922). Also called Markov model. See also Markov process.
The English-Chinese Dictionary of Economics and Management contains 27,404 English-Chinese bilingual entries, covering the translation and usage of commonly used English words and phrases in economics, management, finance, accounting, securities and futures, business, and related fields — a useful tool for study and work.

 

Copyright © 2000-2023 Newdu.com. All Rights Reserved
Updated: 2025/3/15 2:52:39