Exam 16: Markov Processes (41 Questions)
If an absorbing state exists, then the probability that a unit will ultimately move into the absorbing state is given by the steady state probability.
(True/False)
Correct Answer:
False
Calculate the steady state probabilities for this transition matrix.
(Essay)
Correct Answer:
π₁ = 3/7, π₂ = 4/7
The daily price of a farm commodity is up, down, or unchanged from the day before. Analysts predict that if the last price was down, there is a .5 probability the next will be down, and a .4 probability the price will be unchanged. If the last price was unchanged, there is a .35 probability it will be down and a .35 probability it will be up. For prices whose last movement was up, the probabilities of down, unchanged, and up are .1, .3, and .6.
a. Construct the matrix of transition probabilities.
b. Calculate the steady state probabilities.
(Essay)
Correct Answer:
a.
               Down   Unchanged   Up
   Down         .5       .4       .1
   Unchanged    .35      .3       .35
   Up           .1       .3       .6
b. π₁ = .304, π₂ = .330, π₃ = .365 (exactly 7/23, 7.6/23, and 8.4/23)
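The steady-state probabilities in part b can be checked numerically, for example by repeatedly multiplying a starting distribution by the transition matrix until it settles. A minimal Python sketch (states ordered down, unchanged, up, matching the question; variable names are illustrative):

```python
# Transition matrix built from the question text; the "down" row's .1 for up
# and the "unchanged" row's .3 for unchanged follow because each row sums to 1.
P = [[0.50, 0.40, 0.10],   # last price down
     [0.35, 0.30, 0.35],   # last price unchanged
     [0.10, 0.30, 0.60]]   # last price up

# Power iteration: start from a uniform distribution and apply P until convergence.
pi = [1 / 3, 1 / 3, 1 / 3]
for _ in range(500):
    pi = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]

print([round(p, 3) for p in pi])  # [0.304, 0.33, 0.365]
```

Solving πP = π with π summing to 1 gives the exact values π₁ = 7/23 ≈ .304, π₂ = 7.6/23 ≈ .330, π₃ = 8.4/23 ≈ .365, so the stated answer agrees up to rounding.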
For Markov processes having the memoryless property, the prior states of the system must be considered in order to predict the future behavior of the system.
(True/False)
When absorbing states are present, each row of the transition matrix corresponding to an absorbing state will have a single 1 and all other probabilities will be 0.
(True/False)
All entries in a row of a matrix of transition probabilities sum to 1.
(True/False)
A state i is a transient state if there exists a state j that is reachable from i, but the state i is not reachable from state j.
(True/False)
The medical prognosis for a patient with a certain disease is to recover, to die, to exhibit symptom 1, or to exhibit symptom 2. The matrix of transition probabilities is:

               Recover   Die   Symptom 1   Symptom 2
   Recover        1       0        0           0
   Die            0       1        0           0
   Symptom 1     1/4     1/4      1/3         1/6
   Symptom 2     1/4     1/8      1/8         1/2

a. What are the absorbing states?
b. What is the probability that a patient with symptom 2 will recover?
(Essay)
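One standard way to work part b (not necessarily the method the exam intends) is the fundamental matrix of an absorbing chain: partition the transition matrix into Q (transient-to-transient) and R (transient-to-absorbing), compute N = (I − Q)⁻¹, and read absorption probabilities from B = NR. A minimal Python sketch for the matrix above, using exact fractions and the 2×2 inverse formula:

```python
from fractions import Fraction as F

# Transient block Q: rows/cols = (symptom 1, symptom 2)
Q = [[F(1, 3), F(1, 6)],
     [F(1, 8), F(1, 2)]]
# Block R: transient rows, absorbing columns = (recover, die)
R = [[F(1, 4), F(1, 4)],
     [F(1, 4), F(1, 8)]]

# N = (I - Q)^-1 via the 2x2 inverse formula
a, b = 1 - Q[0][0], -Q[0][1]
c, d = -Q[1][0], 1 - Q[1][1]
det = a * d - b * c
N = [[d / det, -b / det],
     [-c / det, a / det]]

# B = N R: probability each transient state is eventually absorbed in each absorbing state
B = [[sum(N[i][k] * R[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

print(B[1][0])  # P(patient with symptom 2 eventually recovers) -> 19/30
```

The absorbing states (part a) are Recover and Die, the rows with a single 1 and all other entries 0.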
Give two examples of how Markov analysis can aid decision making.
(Short Answer)
All Markov chain transition matrices have the same number of rows as columns.
(True/False)
A unique matrix of transition probabilities should be developed for each customer.
(True/False)
The probability of going from state 1 in period 2 to state 4 in period 3 is
(Multiple Choice)
Accounts receivable have been grouped into the following states:
State 1: Paid
State 2: Bad debt
State 3: 0-30 days old
State 4: 31-60 days old
Sixty percent of all new bills are paid before they are 30 days old; the remainder go to state 4. Seventy percent of all 30-day-old bills are paid before they become 60 days old; if not paid, they are permanently classified as bad debts.
a. Set up the one-month Markov transition matrix.
b. What is the probability that an account in state 3 will be paid?
(Essay)
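The matrix in part a follows directly from the percentages in the question, and part b only needs the two paths out of state 3: paid this month, or aged to state 4 and then paid. A minimal Python sketch (state order paid, bad debt, 0-30, 31-60 is an assumed convention):

```python
# a. One-month transition matrix, state order (paid, bad debt, 0-30, 31-60):
P = [[1.0, 0.0, 0.0, 0.0],   # paid (absorbing)
     [0.0, 1.0, 0.0, 0.0],   # bad debt (absorbing)
     [0.6, 0.0, 0.0, 0.4],   # 0-30: 60% paid, the rest age to 31-60
     [0.7, 0.3, 0.0, 0.0]]   # 31-60: 70% paid, the rest become bad debts

# b. Probability an account in state 3 is eventually paid:
# paid this month (.6), or aged to state 4 (.4) and paid from there (.7).
p = P[2][0] + P[2][3] * P[3][0]
print(round(p, 2))  # 0.88
```

Because no account ever returns to state 3, this two-term sum is the whole absorption calculation; the general fundamental-matrix machinery is unnecessary here.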
What assumptions are necessary for a Markov process to have stationary transition probabilities?
(Short Answer)