Markov Chain Process

In this chapter, building on the historical introduction given in the previous chapters, we introduce and exemplify all the components of a Markov Chain Process: the initial state vector, the Markov property (memorylessness), the matrix of transition probabilities, and the steady-state vector. A Markov Chain Process is formally defined and categorized into two types, the Discrete-Time Markov Chain Process and the Continuous-Time Markov Chain Process, according to whether the time between states in a random walk is discrete or continuous. Each component is exemplified, and all the examples are solved analytically.
