
Markov Processes

     A Markov process is a stochastic process in which future outcomes can be predicted conditional only on the present state. Knowledge of past events has no bearing whatsoever on the future; this is called the Markov property. The score of someone taking a multiple-choice test can be thought of as a Markov process. Assume this person guesses randomly on every question and therefore has a 1/5 probability of getting each question correct. At any point in the test, as long as you know the student's current score, you can make a prediction of the future score after 1 more question, 3 more questions, or at the end of the test. Knowing what guesses were made beforehand will not improve your prediction. That is to say, knowing that the student reached the current score by guessing all A's or by guessing all C's makes no difference.
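The test example above can be sketched in a few lines of Python (a minimal illustration, not part of the original text; the function names `next_score` and `simulate` are invented here). The point is that the next score is a function of the current score alone, so two students with the same current score have identical future score distributions:

```python
import random

def next_score(score, p_correct=0.2):
    """One step of the chain: the next score depends only on the
    current score, never on how that score was reached."""
    return score + (1 if random.random() < p_correct else 0)

def simulate(score, steps, p_correct=0.2):
    """Predict a future score after `steps` more guesses,
    conditioned only on the present score."""
    for _ in range(steps):
        score = next_score(score, p_correct)
    return score

random.seed(1)
# Two students both sitting at a score of 3 -- one from guessing all
# A's, one from all C's -- face the exact same future distribution.
print(simulate(3, 10))
```

Since `simulate` never looks at the sequence of past guesses, the Markov property holds by construction.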


     An example of a non-Markov process can be outlined as follows. Imagine you have 10 cards labelled 1-10 and you draw cards at random without replacement. Your first four cards happen to be 1, 2, 3, and 8, so your current sum is 14. Before drawing your next card, you know the sum must reach at least 18, because the lowest card left in the deck is 4. But if you had no knowledge of the past, only the present state of a sum of 14, you could not reach that conclusion. Knowing only the sum, the four cards drawn could have been 2, 3, 4, and 5, in which case you might draw the 1 next and get a sum of 15. Here, knowledge of the history of the process changes your prediction of the future. Therefore it is not a Markov process.
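The card argument above can be checked exhaustively in Python (a small sketch added here for illustration). With the full history, the smallest possible next sum is 18; knowing only the sum of 14, a smaller next sum becomes possible:

```python
from itertools import combinations

deck = range(1, 11)
drawn = [1, 2, 3, 8]              # the actual history of draws
current_sum = sum(drawn)          # 14

# With full history: the lowest remaining card is 4, so next sum >= 18.
remaining = [c for c in deck if c not in drawn]
print(min(current_sum + c for c in remaining))   # 18

# Knowing only the present sum: any 4-card hand summing to 14 could
# be the history, so smaller next sums cannot be ruled out.
possible_next = set()
for hand in combinations(deck, 4):
    if sum(hand) == current_sum:
        rest = [c for c in deck if c not in hand]
        possible_next.update(current_sum + c for c in rest)
print(min(possible_next))         # 15, e.g. history {2,3,4,5}, then draw 1
```

Because the two predictions disagree, the sum alone is not a sufficient state, which is exactly why the process fails the Markov property.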

Decisions and Queues

Looking for explanations and an example to help illustrate them. I get the basic idea of each question, but I would like something more illustrative to nail down the topics 1. Describe the types of decisions that do not involve uncertainty and risk and common approaches used for analyzing them. 2. Describe how to model de

Statistics Using SPSS

1. Using the above dataset, calculate the following: a) Means, standard deviations, sums of squares and cross products, and the three r's (correlation between y and x1, y and x2, and x1 and x2). b) Regression equation of Y on X1 and X2. What method does linear regression use for parameter estimation? Please further elaborate the

Markov Analysis: Book Store Example (Please place in POM-QM Software format)

The University of South Wisconsin has had steady enrollments over the past five years. The school has its own bookstore, called University Book Store, but there are also three private bookstores in town: Bill's Book Store, College Book Store, and Battle's Book Store. The university is concerned about the large number of stude

Using Markov Analysis

A study of accounts receivable at the A&W Department Store indicates that bills are either current, one month overdue, two months overdue, written off as bad debts, or paid in full. Of those that are current, 80% are paid that month, and the rest become one month overdue. Of the one month overdue bills, 90% are paid, and the res

Qualitative Methods - Markov Processes

Markov Processes 5. For a situation with weekly dining at either an Italian or Mexican restaurant, a. the weekly visit is the trial and the restaurant is the state. b. the weekly visit is the state and the restaurant is the trial. c. the weekly visit is

A simple problem on branching processes (Markov chains)

This is a problem on a branching process. I have come up with a solution to this problem, but have no way of telling whether it's correct or not. Need a detailed solution, thank you. See attached file for full problem description.

Markov Chains

Markov chains. I am finding the continuous case a bit unintuitive. This problem concerns martingales and branching processes, but it is supposedly not hard. I just need a detailed solution to this problem, so that I can see what concepts are used to solve it.

2 problems

Question #1. Chris Dunphy, executive vice president for marketing and sales of Sumu Electronics, is considering the possibility of introducing a new line of inexpensive wrist watches, which would be oriented primarily toward young adults. The watch would have a plastic faceplate and wristband and a variety of features, includin

Markov Analysis, market share and brand switching calculation

Markov Analysis 1. Three fast food hamburger restaurants are competing for the college lunch crowd. Burger Bills has 40% of the market while Hungary Heifer and Salty Sams each have 30% of the market. Burger Bills loses 10% of its customers to the Hungary Heifer and 10% to Salty Sams each month. The Hungary Heifer loses 5%

Markov Chains

All welders in a factory begin as apprentices. Every year the performance of each apprentice is reviewed. Past records indicate that after each review, 10% of the apprentices are promoted to professional welder, 20% are terminated for unsatisfactory performance, and the remainder continue as apprentices. A. Draw a transitio
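One way to picture the welder problem is as an absorbing Markov chain, sketched below in Python (this sketch is not part of the original question; it assumes "professional" and "terminated" are absorbing states, i.e. welders in those states are no longer reviewed). Iterating the distribution shows the long-run split of apprentices:

```python
# States: 0 = apprentice, 1 = professional (absorbing), 2 = terminated (absorbing)
# Apprentice row from the problem: 10% promoted, 20% terminated, 70% remain.
P = [
    [0.70, 0.10, 0.20],
    [0.00, 1.00, 0.00],
    [0.00, 0.00, 1.00],
]

def step(dist, P):
    """One annual review: multiply the distribution by the transition matrix."""
    return [sum(dist[i] * P[i][j] for i in range(3)) for j in range(3)]

dist = [1.0, 0.0, 0.0]        # every welder starts as an apprentice
for _ in range(200):          # iterate until the chain is effectively absorbed
    dist = step(dist, P)

# Long run: 0.1 / (0.1 + 0.2) = 1/3 become professionals, 2/3 are terminated.
print([round(p, 3) for p in dist])   # [0.0, 0.333, 0.667]
```

The closed-form absorption probability 0.1/0.3 matches the iterated result, which is a quick sanity check on the transition matrix.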

Markov Chains - Scheduling problem

An outdoor restaurant in a summer resort closes only on days that it rains. From past records it is found that from May through September, when it rains one day, the probability of rain for the next day is .6; when it does not rain one day, the probability of rain the next day is .1. A. Draw a transition diagram. B. Write
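The rain probabilities in this problem fully determine a two-state chain, and a short Python sketch (added here for illustration, not part of the original question) shows how iterating the transition matrix reaches the steady state:

```python
# States: 0 = rain (restaurant closed), 1 = no rain (open)
# From the problem: P(rain tomorrow | rain today) = .6,
#                   P(rain tomorrow | dry today)  = .1
P = [[0.6, 0.4],
     [0.1, 0.9]]

dist = [0.0, 1.0]             # start on a dry day (long run is the same either way)
for _ in range(100):
    dist = [dist[0] * P[0][0] + dist[1] * P[1][0],
            dist[0] * P[0][1] + dist[1] * P[1][1]]

print([round(p, 2) for p in dist])   # steady state: [0.2, 0.8]
```

Solving pi_rain = 0.6 * pi_rain + 0.1 * (1 - pi_rain) by hand gives pi_rain = 0.2 as well, so in the long run the restaurant is closed about 20% of the days.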

Markov Chain

Please see the attached file for the fully formatted problems. At the beginning of each year, my car is in good, fair, or broken-down condition. A good car will be good at the beginning of the next year with probability 0.85; fair with probability 0.10; or broken-down with probability 0.05. A fair car will be