
    Markov Processes

         A Markov process is a stochastic process in which predictions about future outcomes can be made conditional on the present state alone. Knowledge of past events has no bearing whatsoever on the future; this is called the Markov property. Someone taking a multiple-choice test could be thought of as a Markov process. Assume that this person makes random guesses on every question and therefore has a 1/5 probability of getting each question correct. At any point in the test, as long as you know the student's current score, you can predict the future score after 1 question, after 3 questions, or at the end of the test. Knowing which guesses were made beforehand will not improve your prediction: it makes no difference whether the student reached the current score by guessing all A's or by guessing all C's.
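The guessing student can be sketched as a tiny Markov chain in Python (a minimal illustration of the idea above, not part of the original article; the function names are mine):

```python
import random

def next_score(score, p_correct=0.2):
    """One transition of the chain: the score after one more random
    guess depends only on the current score, not on how it was reached."""
    return score + (1 if random.random() < p_correct else 0)

def simulate(score, n_questions, p_correct=0.2):
    """Advance the current score through n further guessed questions."""
    for _ in range(n_questions):
        score = next_score(score, p_correct)
    return score

random.seed(0)
# Starting from a current score of 4, the prediction for the score three
# questions later uses only the present state -- the past guesses never
# appear anywhere in the simulation.
final = simulate(4, 3)
print(final)  # somewhere between 4 and 7
```

Note that `simulate` never receives the history of answers, only the current score: that is exactly the Markov property.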


         An example of a non-Markov process can be outlined as follows. Imagine you have 10 cards labelled 1-10 and you draw cards at random without replacement. Your first four cards happen to be 1, 2, 3, and 8, so your current sum is 14. Whatever you draw next, the sum must be at least 18, because the lowest card left in the deck is 4. But if you had no knowledge of the past and knew only the present state of having a sum of 14, you could not reach the same conclusion: the four cards drawn might instead have been 2, 3, 4, and 5, in which case drawing the 1 next would give a sum of 15. Here, knowledge of the history of the process changes your prediction of the future, so it is not a Markov process.
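The card example can be checked directly (again, my own sketch): two histories that share the same present state (sum = 14) lead to different sets of possible next sums.

```python
DECK = set(range(1, 11))  # cards labelled 1-10

def possible_next_sums(drawn):
    """Possible sums after one more draw, given the exact cards drawn so far."""
    remaining = DECK - set(drawn)
    return {sum(drawn) + card for card in remaining}

# Two different histories with the same present state (sum = 14):
a = possible_next_sums({1, 2, 3, 8})  # lowest remaining card is 4
b = possible_next_sums({2, 3, 4, 5})  # the 1 is still in the deck

print(min(a))  # 18: the next sum must be at least 18
print(min(b))  # 15: a sum of 15 is still possible
```

Because the next-step prediction depends on the drawn cards and not just on the sum, the sum alone is not a Markov state.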

    © BrainMass Inc. brainmass.com April 16, 2024, 9:43 am

    BrainMass Solutions Available for Instant Download

    Decisions and Queues

    Looking for explanations and an example to help illustrate them. I get the basic idea of each question, but I would like something more illustrative to nail down the topics: 1. Describe the types of decisions that do not involve uncertainty and risk, and common approaches used for analyzing them. 2. Describe how to model de

    Statistics Using SPSS

    1. Use the above dataset, calculate the following: a) Means, standard deviations, sums of squares and cross products, and the three r's (correlation between y and x1, y and x2 and x1 and x2). b) Regression equation of Y on X1 and X2. What method does linear regression use for parameter estimation? Please further elaborate the

    Markov Chain: Invariant Distribution

    Please see attached file. Consider the Markov chain with transition matrix (please see the attached file) where 0 <= p <= 1 and 0 <= q <= 1. Find an invariant distribution of this chain. For which values of p and q is the distribution unique? Given an initial distribution (please see the attached file), for which values of p

    Fair coin tosses: Markov chains

    Please see attached file. Let a fair coin be tossed indefinitely. Denote respectively by H_n and T_n the total number of heads and tails after n tosses. Let X_n = H_n and Y_n = H_n - T_n. Are these Markov chains? If so, what is their transition matrix?

    Markov Chain: Drawing the State Diagram of the Chain

    See attached file for better format. A particle moves on the states 1, 2, 3, and 4 according to a time-homogeneous Markov chain {X_n ; n >= 0} with initial distribution p(0) = (.2, .8, 0, 0) and transition matrix:

    P = [ .6  .4   0   0
          .6   0  .4   0
           0  .6   0  .4
           0   0  .6  .4 ]

    a) Draw the state diagram of the chain.

    Stochastic Processes - Markov Chain

    This problem involves the use of Matlab. Step 1: Choose any Markov Chain with a 3x3 transition matrix that is irreducible and aperiodic. The reason it needs to be irreducible and aperiodic is because we are looking for a Markov Chain that converges. Calculate the stationary distribution of the Markov Chain by hand. Ste

    Markov Analysis: Book Store Example (Please place in POM-QM Software format)

    The University of South Wisconsin has had steady enrollments over the past five years. The school has its own bookstore, called University Book Store, but there are also three private bookstores in town: Bill's Book Store, College Book Store, and Battle's Book Store. The university is concerned about the large number of stude

    Using Markov Analysis

    A study of accounts receivable at the A&W Department Store indicates that bills are either current, one month overdue, two months overdue, written off as bad debts, or paid in full. Of those that are current, 80% are paid that month, and the rest become one month overdue. Of the one month overdue bills, 90% are paid, and the res

    Markov's Inequality

    Let S be a random variable (not necessarily positive). Prove using Markov's inequality that for every p > 0 and every constant q, P(S >= q) <= exp(-pq) E[exp(pS)]. Here >= denotes greater than or equal to, and <= denotes less than or equal to.
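This is the exponential (Chernoff) form of Markov's inequality; a sketch of the standard argument, assuming only that p > 0 so that x ↦ e^{px} is strictly increasing:

```latex
P(S \ge q) = P\big(e^{pS} \ge e^{pq}\big)
           \le \frac{E\big[e^{pS}\big]}{e^{pq}}
           = e^{-pq}\, E\big[e^{pS}\big].
```

The first equality uses the monotonicity of $x \mapsto e^{px}$ for $p > 0$; the inequality is Markov's inequality applied to the nonnegative random variable $e^{pS}$, which is why no positivity assumption on $S$ itself is needed.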

    Qualitative Methods - Markov Processes

    Markov Processes 5. For a situation with weekly dining at either an Italian or Mexican restaurant, a. the weekly visit is the trial and the restaurant is the state. b. the weekly visit is the state and the restaurant is the trial. c. the weekly visit is

    Branching Process and Markov Chain

    This is a problem on a branching process. I have come up with a solution to this problem, but have no way of telling whether it's correct or not. I need a detailed solution, thank you. See attached file for the full problem description.

    Markov Chains

    Markov chains. I am finding the continuous case a bit unintuitive. This problem concerns martingales and branching processes, but it is supposedly not hard. I just need a detailed solution to this problem, so that I can see what concepts are used to solve it.

    2 problems

    Question #1. Chris Dunphy, executive vice president for marketing and sales of Sumu Electronics, is considering the possibility of introducing a new line of inexpensive wrist watches, which would be oriented primarily toward young adults. The watch would have a plastic faceplate and wristband and a variety of features, includin

    Markov Analysis, market share and brand switching calculation

    Markov Analysis 1. Three fast food hamburger restaurants are competing for the college lunch crowd. Burger Bills has 40% of the market while Hungary Heifer and Salty Sams each have 30% of the market. Burger Bills loses 10% of its customers to the Hungary Heifer and 10% to Salty Sams each month. The Hungary Heifer loses 5%

    Bayesian Statistics Approach

    Explain with the help of a text or personal example, the use and application of Bayes' approach. Briefly, what is the history of Bayesian statistics, and how is it being used in business today? What is the difference between Bayes' and Empirical Bayes' approaches?

    Reflected Brownian motion of a Wiener process is a Markov process.

    Let Wt be a standard Wiener process and define the reflected Brownian motion as Z(t) = |W(t)|. Show that the reflected Brownian motion is a Markov process and express its transition probability density function p(t; x, y) in terms of the transition density of the standard Wiener process Hint: Prove that the following iden

    Probability Concepts Used in Business

    Using our data set from Unit 1, compose an email to the head of the American Intellectual Union which discusses the following: 1. How would you use the concept of probabilities to apply to profiles for hiring more satisfied individuals? Job satisfaction is an attitude about one's job. It may be measured globally or via facet

    Markov Chains - Scheduling Problem

    An outdoor restaurant in a summer resort closes only on days that it rains. From past records it is found that from May through September, when it rains one day, the probability of rain for the next day is .6; when it does not rain one day, the probability of rain the next day is .1. A. Draw a transition diagram. B. Write
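The stated probabilities fully determine a two-state chain, so the long-run fraction of rainy days can be sketched numerically (my own illustration with assumed state labels, not the downloadable solution):

```python
# States: index 0 = rain, index 1 = no rain (labels assumed for illustration).
# From the problem: P(rain tomorrow | rain today) = .6,
#                   P(rain tomorrow | dry today)  = .1.
P = [[0.6, 0.4],
     [0.1, 0.9]]

def step(dist, P):
    """One step of the chain: multiply the row distribution vector by P."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# Iterating from any starting distribution converges to the stationary one.
dist = [1.0, 0.0]
for _ in range(200):
    dist = step(dist, P)
print([round(x, 3) for x in dist])  # approximately [0.2, 0.8]
```

The fixed point [0.2, 0.8] can be verified directly: 0.2(.6) + 0.8(.1) = 0.2, so in the long run the restaurant is closed about one day in five.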

    Some questions about a Markov chain.

    Category: Statistics > Simple Regression Subject: birth-death process Details: Let a Markov chain have transition probabilities Pi,i+1 = 1/(i+1) and Pi,0 = i/(i+1) for i=0,1,2, . . . . i) Is the chain positive recurrent? Or transient or null recurrent? ii) Does a stationary distribution exist? If it does, what is it? S
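The stated transition probabilities can also be explored by simulation (a numerical sanity check of my own, not the requested proof). If the chain is positive recurrent, its long-run occupation frequencies approximate the stationary distribution; note that P(an excursion from 0 climbs to height k without resetting) = 1/k!, so the mean return time to 0 is the sum of 1/k!, which is e, suggesting state 0's frequency should settle near 1/e.

```python
import math
import random

def step(i):
    """From state i: move to i+1 with probability 1/(i+1), else reset to 0."""
    return i + 1 if random.random() < 1.0 / (i + 1) else 0

random.seed(1)
counts = {}
state, n = 0, 200_000
for _ in range(n):
    state = step(state)
    counts[state] = counts.get(state, 0) + 1

# Long-run occupation frequencies approximate the stationary distribution
# (when one exists); compare state 0's frequency with 1/e ~= 0.368.
print(round(counts.get(0, 0) / n, 3), round(1 / math.e, 3))
```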

    Markov Chain

    Please see the attached file for the fully formatted problems. At the beginning of each year, my car is in good, fair, or broken-down condition. A good car will be good at the beginning of the next year with probability 0.85, fair with probability 0.10, or broken-down with probability 0.05. A fair car will be