Exam 16: Markov Processes


If an absorbing state exists, then the probability that a unit will ultimately move into the absorbing state is given by the steady state probability.

Free
(True/False)
4.9/5
(27)
Correct Answer:
Verified

False

Calculate the steady state probabilities for this transition matrix: $\left[ \begin{array} { c c } .60 & .40 \\ .30 & .70 \end{array} \right]$

Free
(Essay)
4.9/5
(27)
Correct Answer:
Verified

$\pi_1$ = 3/7, $\pi_2$ = 4/7
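The 2×2 answer can be checked numerically; a minimal NumPy sketch, using the fact that the steady state $\pi$ satisfies $\pi P = \pi$ together with the probabilities summing to 1:

```python
import numpy as np

# Transition matrix from the question above (rows sum to 1).
P = np.array([[0.60, 0.40],
              [0.30, 0.70]])

# pi solves pi P = pi with pi1 + pi2 = 1: keep one balance
# equation and replace the other with the normalization row.
A = np.vstack([(P.T - np.eye(2))[:1], np.ones((1, 2))])
b = np.array([0.0, 1.0])
pi = np.linalg.solve(A, b)
print(pi)  # [0.42857143 0.57142857] -> 3/7 and 4/7
```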

The daily price of a farm commodity is up, down, or unchanged from the day before. Analysts predict that if the last price was down, there is a .5 probability the next will be down, and a .4 probability the price will be unchanged. If the last price was unchanged, there is a .35 probability it will be down and a .35 probability it will be up. For prices whose last movement was up, the probabilities of down, unchanged, and up are .1, .3, and .6.
a. Construct the matrix of transition probabilities.
b. Calculate the steady state probabilities.

Free
(Essay)
4.8/5
(37)
Correct Answer:
Verified

a. $\left[ \begin{array} { c c c } .50 & .40 & .10 \\ .35 & .30 & .35 \\ .10 & .30 & .60 \end{array} \right]$
b. $\pi_1$ = 35/115 ≈ .304, $\pi_2$ = 38/115 ≈ .330, $\pi_3$ = 42/115 ≈ .365
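The same linear-system check works for the 3×3 commodity-price matrix; a sketch with rows and columns ordered down, unchanged, up:

```python
import numpy as np

# Transition matrix from part a (rows/columns: down, unchanged, up).
P = np.array([[0.50, 0.40, 0.10],
              [0.35, 0.30, 0.35],
              [0.10, 0.30, 0.60]])

# Solve pi P = pi subject to the probabilities summing to 1:
# keep two balance equations, replace the third with normalization.
A = np.vstack([(P.T - np.eye(3))[:2], np.ones((1, 3))])
b = np.array([0.0, 0.0, 1.0])
pi = np.linalg.solve(A, b)
print(np.round(pi, 3))  # exact values are 35/115, 38/115, 42/115
```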

For Markov processes having the memoryless property, the prior states of the system must be considered in order to predict the future behavior of the system.

(True/False)
4.7/5
(40)

When absorbing states are present, each row of the transition matrix corresponding to an absorbing state will have a single 1 and all other probabilities will be 0.

(True/False)
4.9/5
(37)

All entries in a row of a matrix of transition probabilities sum to 1.

(True/False)
4.8/5
(31)

Analysis of a Markov process

(Multiple Choice)
4.9/5
(47)

A state i is a transient state if there exists a state j that is reachable from i, but the state i is not reachable from state j.

(True/False)
4.8/5
(32)

At steady state

(Multiple Choice)
4.9/5
(28)

The medical prognosis for a patient with a certain disease is to recover, to die, to exhibit symptom 1, or to exhibit symptom 2. With states ordered Recover, Die, Symptom 1, Symptom 2, the matrix of transition probabilities is $\left[ \begin{array} { c c c c } 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 1/4 & 1/4 & 1/3 & 1/6 \\ 1/4 & 1/8 & 1/8 & 1/2 \end{array} \right]$
a. What are the absorbing states?
b. What is the probability that a patient with symptom 2 will recover?

(Essay)
4.8/5
(34)
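No worked answer accompanies this item, but the standard absorbing-chain computation can be sketched as follows, assuming the states are ordered Recover, Die, Symptom 1, Symptom 2: with Q the transient-to-transient block and R the transient-to-absorbing block, B = (I − Q)⁻¹R gives the absorption probabilities.

```python
import numpy as np

# Q: transitions among the transient states (Symptom 1, Symptom 2).
Q = np.array([[1/3, 1/6],
              [1/8, 1/2]])
# R: transitions from transient states into the absorbing states
# (columns: Recover, Die).
R = np.array([[1/4, 1/4],
              [1/4, 1/8]])

# Fundamental matrix N = (I - Q)^(-1); B[i, j] is the probability
# of eventual absorption in state j starting from transient state i.
N = np.linalg.inv(np.eye(2) - Q)
B = N @ R
print(B[1, 0])  # P(symptom 2 eventually recovers) = 19/30
```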

Give two examples of how Markov analysis can aid decision making.

(Short Answer)
4.8/5
(31)

Markov processes use historical probabilities.

(True/False)
4.8/5
(20)

Explain the concept of memorylessness.

(Short Answer)
4.8/5
(38)

All entries in a matrix of transition probabilities sum to 1.

(True/False)
4.8/5
(35)

All Markov chain transition matrices have the same number of rows as columns.

(True/False)
4.9/5
(29)

A unique matrix of transition probabilities should be developed for each customer.

(True/False)
4.8/5
(31)

The probability of going from state 1 in period 2 to state 4 in period 3 is

(Multiple Choice)
4.7/5
(39)

Accounts receivable have been grouped into the following states:
State 1: Paid
State 2: Bad debt
State 3: 0-30 days old
State 4: 31-60 days old
Sixty percent of all new bills are paid before they are 30 days old. The remainder of these go to state 4. Seventy percent of all 30-day-old bills are paid before they become 60 days old. If not paid, they are permanently classified as bad debts.
a. Set up the one-month Markov transition matrix.
b. What is the probability that an account in state 3 will be paid?

(Essay)
4.8/5
(31)
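The absorption probability asked for in part b can be sketched with the same fundamental-matrix approach, assuming states 3 and 4 are the transient states and Paid and Bad debt are absorbing:

```python
import numpy as np

# Transient-to-transient block: state 3 (0-30 days) moves to
# state 4 with probability .40; state 4 has no transient successor.
Q = np.array([[0.0, 0.4],
              [0.0, 0.0]])
# Transient-to-absorbing block (columns: Paid, Bad debt).
R = np.array([[0.6, 0.0],
              [0.7, 0.3]])

N = np.linalg.inv(np.eye(2) - Q)   # fundamental matrix
B = N @ R                          # absorption probabilities
print(B[0, 0])  # P(state 3 account is eventually paid) = .60 + .40(.70) = .88
```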

What assumptions are necessary for a Markov process to have stationary transition probabilities?

(Short Answer)
4.9/5
(32)

A transition probability describes

(Multiple Choice)
5.0/5
(34)