Exam 16: Markov Processes


All Markov chain transition matrices have the same number of rows as columns.

(True/False)
Correct Answer: True
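To make this concrete, here is a minimal NumPy sketch (the 3-state chain and its values are hypothetical, not from the source) verifying that a transition matrix is square, with one row and one column per state, and that each of its rows sums to 1:

```python
import numpy as np

# Hypothetical 3-state transition matrix (illustrative values): one row
# and one column per state, so the matrix is necessarily square.
P = np.array([[0.6, 0.3, 0.1],
              [0.2, 0.5, 0.3],
              [0.1, 0.4, 0.5]])

assert P.shape[0] == P.shape[1]          # same number of rows as columns
assert np.allclose(P.sum(axis=1), 1.0)   # each row sums to 1
```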

A state i is a transient state if there exists a state j that is reachable from i, but the state i is not reachable from state j.

(True/False)
Correct Answer: True
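A short sketch of that reachability test, assuming NumPy and a hypothetical 3-state chain in which state 2 is absorbing: state i is flagged as transient when some state j is reachable from i but i is not reachable back from j.

```python
import numpy as np

# Hypothetical chain: states 0 and 1 can reach the absorbing state 2,
# but state 2 cannot reach them back.
P = np.array([[0.5, 0.3, 0.2],
              [0.4, 0.4, 0.2],
              [0.0, 0.0, 1.0]])

n = P.shape[0]
# Boolean reachability: j is reachable from i (in 0..n-1 steps) exactly
# when (I + A)^(n-1) has a positive (i, j) entry, where A is the
# adjacency pattern of nonzero transition probabilities.
reach = np.linalg.matrix_power(np.eye(n) + (P > 0), n - 1) > 0

for i in range(n):
    transient = any(reach[i, j] and not reach[j, i] for j in range(n))
    print(f"state {i}: {'transient' if transient else 'not transient'}")
# states 0 and 1 are transient; state 2 is not
```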

In Markov analysis, we are concerned with the probability that the

(Multiple Choice)
Correct Answer: B

A unique matrix of transition probabilities should be developed for each customer.

(True/False)

The fundamental matrix is used to calculate the probability of the process moving into each absorbing state.

(True/False)
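As a rough sketch of that calculation (the canonical-form matrices Q and R below are hypothetical), the fundamental matrix N = (I - Q)^-1 is computed and multiplied by R to give the absorption probabilities:

```python
import numpy as np

# Hypothetical absorbing chain in canonical form: transient states 0, 1
# and absorbing states 2, 3 (all values illustrative).
Q = np.array([[0.5, 0.2],      # transient -> transient
              [0.1, 0.6]])
R = np.array([[0.2, 0.1],      # transient -> absorbing
              [0.1, 0.2]])

# Fundamental matrix N = (I - Q)^(-1): entry (i, j) is the expected
# number of visits to transient state j when starting in state i.
N = np.linalg.inv(np.eye(2) - Q)

# Absorption probabilities N @ R: entry (i, k) is the probability of
# ultimately being absorbed in state k when starting in state i.
print(N @ R)               # each row sums to 1: absorption is certain
```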

All entries in a row of a matrix of transition probabilities sum to 1.

(True/False)

A transition probability describes

(Multiple Choice)

A state i is an absorbing state if pᵢᵢ = 0.

(True/False)

All Markov chains have steady-state probabilities.

(True/False)

All entries in a matrix of transition probabilities sum to 1.

(True/False)

The probability of reaching an absorbing state is given by the

(Multiple Choice)

Steady-state probabilities are independent of the initial state.

(True/False)
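A minimal sketch of this property, assuming NumPy and a hypothetical regular two-state chain: two different initial distributions converge to the same steady-state vector, which can also be obtained directly by solving pi @ P = pi with the entries of pi summing to 1.

```python
import numpy as np

# Hypothetical regular two-state chain (illustrative values).
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# Two very different initial distributions...
for pi in (np.array([1.0, 0.0]), np.array([0.2, 0.8])):
    for _ in range(50):    # repeated transitions
        pi = pi @ P
    print(pi)              # ...converge to the same vector [4/7, 3/7]

# Direct solution of pi @ P = pi subject to sum(pi) = 1, via least
# squares on the stacked, overdetermined linear system.
A = np.vstack([P.T - np.eye(2), np.ones((1, 2))])
b = np.array([0.0, 0.0, 1.0])
pi_star, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi_star)             # [0.5714... 0.4285...]
```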

At steady state

(Multiple Choice)

Markov processes use historical probabilities.

(True/False)

For a situation with weekly dining at either an Italian or Mexican restaurant,

(Multiple Choice)

Analysis of a Markov process

(Multiple Choice)

Absorbing state probabilities are the same as

(Multiple Choice)

The probability that a system is in a particular state after a large number of periods is

(Multiple Choice)

If an absorbing state exists, then the probability that a unit will ultimately move into the absorbing state is given by the steady-state probability.

(True/False)

A Markov chain cannot consist of all absorbing states.

(True/False)