Exam 16: Markov Processes


For a situation with weekly dining at either an Italian or Mexican restaurant,

(Multiple Choice)
Correct Answer:

A

All Markov chains have steady-state probabilities.

(True/False)
Correct Answer:

False
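
One way to see why this is False: a chain can cycle without ever settling into limiting probabilities. A minimal sketch with a hypothetical two-state periodic matrix (not taken from the exam):

    import numpy as np

    # Hypothetical periodic chain: the process swaps states every period.
    P = np.array([[0.0, 1.0],
                  [1.0, 0.0]])

    # Powers of P alternate between the identity and P itself,
    # so the state probabilities never converge to a single steady state.
    print(np.linalg.matrix_power(P, 10))  # identity matrix
    print(np.linalg.matrix_power(P, 11))  # the swap matrix again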

Bark Bits Company is planning an advertising campaign to raise the brand loyalty of its customers to 0.80.
a. The former transition matrix is as follows: [transition matrix not shown]. What is the new one?
b. What are the new steady-state probabilities?
c. If each point of market share increases profit by $15,000, what is the most you would pay for the advertising?

(Essay)
Correct Answer:

a. [new transition matrix not shown]
b. π1 = 0.5, π2 = 0.5
c. The increase in market share is 0.5 − 0.444 = 0.056 (5.6 points); (5.6 points)($15,000/point) = $84,000 value for the campaign.
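
A quick numerical check of parts b and c. The matrices below are assumptions inferred from the stated answer (old Bark Bits loyalty 0.75 giving a 0.444 steady-state share, competitor loyalty unchanged at 0.80); the original exhibit is not shown above.

    import numpy as np

    def steady_state(P):
        # Solve pi = pi * P together with pi summing to 1 (two-state chain).
        A = np.vstack([(P.T - np.eye(2))[:1], np.ones(2)])
        return np.linalg.solve(A, np.array([0.0, 1.0]))

    P_old = np.array([[0.75, 0.25],   # assumed former matrix (loyalty 0.75)
                      [0.20, 0.80]])
    P_new = np.array([[0.80, 0.20],   # loyalty raised to 0.80
                      [0.20, 0.80]])

    old_share = round(steady_state(P_old)[0], 3)  # 0.444
    new_share = steady_state(P_new)[0]            # 0.5
    gain_points = (new_share - old_share) * 100   # 5.6 points of market share
    print(round(gain_points * 15000))             # 84000 -- value of the campaign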

All entries in a matrix of transition probabilities sum to 1.

(True/False)

Steady-state probabilities are independent of initial state.

(True/False)
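
For a regular chain this statement holds, and it is easy to check numerically: start the chain in two different states, run it for many periods, and the state probabilities end up the same. The matrix below is a hypothetical example, not one from the exam.

    import numpy as np

    P = np.array([[0.9, 0.1],   # hypothetical regular transition matrix
                  [0.3, 0.7]])

    P_many = np.linalg.matrix_power(P, 50)  # fifty transitions
    print(np.array([1.0, 0.0]) @ P_many)    # start in state 1 -> [0.75 0.25]
    print(np.array([0.0, 1.0]) @ P_many)    # start in state 2 -> [0.75 0.25]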

Transition probabilities are conditional probabilities.

(True/False)

A state i is a transient state if there exists a state j that is reachable from i, but the state i is not reachable from state j.

(True/False)

The fundamental matrix is derived from the matrix of transition probabilities and is relatively easy to compute for Markov processes with a small number of states.

(True/False)
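
A sketch of that computation for an assumed four-state chain with two absorbing states (the numbers are illustrative, not from the exam): partition the transition matrix into Q (transient-to-transient) and R (transient-to-absorbing); then N = (I − Q)^-1 is the fundamental matrix and NR gives the absorbing-state probabilities.

    import numpy as np

    # Assumed chain: states 1 and 2 absorbing, states 3 and 4 transient.
    Q = np.array([[0.3, 0.2],   # one-step probabilities among transient states
                  [0.1, 0.4]])
    R = np.array([[0.4, 0.1],   # one-step probabilities from transient to absorbing states
                  [0.3, 0.2]])

    N = np.linalg.inv(np.eye(2) - Q)  # fundamental matrix N = (I - Q)^-1
    print(N)                          # expected periods spent in each transient state
    print(N @ R)                      # probabilities of ending in each absorbing state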

In Markov analysis, we are concerned with the probability that the

(Multiple Choice)

All entries in a row of a matrix of transition probabilities sum to 1.

(True/False)

A state, i, is an absorbing state if, when i = j, pij = 1.

(True/False)

The probability that a system is in state 2 in the fifth period is π5(2).

(True/False)

If a Markov chain has at least one absorbing state, steady-state probabilities cannot be calculated.

(True/False)

The probability of reaching an absorbing state is given by the

(Multiple Choice)

If an absorbing state exists, then the probability that a unit will ultimately move into the absorbing state is given by the steady-state probability.

(True/False)

State j is an absorbing state if pij = 1.

(True/False)

A unique matrix of transition probabilities should be developed for each customer.

(True/False)

Absorbing state probabilities are the same as

(Multiple Choice)

Markov process models

(Multiple Choice)

The probability that a system is in a particular state after a large number of periods is

(Multiple Choice)