Solutions to *Stochastic Processes*, 2nd Edition, by Sheldon M. Ross.

Since there is no official solution manual for this classic textbook, I worked out these solutions myself while studying it. Some solutions draw on material found on the web; the sources are numerous and often unclear, so they cannot all be credited individually. Many thanks to those authors! I hope these solutions are helpful, but **no correctness or accuracy is guaranteed.** Comments and corrections are welcome. Excerpts and links may be used, provided that full and clear credit is given.

**5.1** A population of organisms consists of both male and female members. In a small colony any particular male is likely to mate with any particular female in any time interval of length \(h\), with probability \(\lambda h + o(h)\). Each mating immediately produces one offspring, equally likely to be male or female. Let \(N_1(t)\) and \(N_2(t)\) denote the number of males and females in the population at time \(t\). Derive the parameters of the continuous-time Markov chain \(\{N_1(t), N_2(t)\}\).

With \(n\) males and \(m\) females there are \(nm\) male-female pairs, each mating at rate \(\lambda\), and each offspring is equally likely to be of either sex, so

$$q[(n, m), (n+1, m)] = q[(n,m),(n, m+1)] = \frac{\lambda mn}{2}$$

**5.2** Suppose that a one-celled organism can be in one of two states–either \(A\) or \(B\). An individual in state \(A\) will change to state \(B\) at an exponential rate \(\alpha\); an individual in state \(B\) divides into two new individuals of type \(A\) at an exponential rate \(\beta\). Define an appropriate continuous-time Markov chain for a population of such organisms and determine the appropriate parameters for this model.

Let the state be \((n, m)\), the numbers of type-\(A\) and type-\(B\) individuals. Then

$$\begin{align}

q[(n, m), (n-1, m+1)] &= \alpha n\\

q[(n, m), (n+2, m-1)] &= \beta m

\end{align}$$

**5.4** For a pure birth process with birth parameters \(\lambda_n, n \geq 0\), compute the mean, variance, and moment generating function of the time it takes the population to go from size \(1\) to size \(N\).

Let \(T_i\) denote the time the population spends at size \(i\). Then the \(T_i\) are independent with \(T_i \sim \text{Exponential}(\lambda_i)\), and the time to go from size \(1\) to size \(N\) is \(\sum_{i=1}^{N-1}T_i\).

$$
E\left[\sum_{i=1}^{N-1}T_i\right] = \sum_{i=1}^{N-1} \frac{1}{\lambda_i} \\

\text{Var}\left(\sum_{i=1}^{N-1}T_i\right) = \sum_{i=1}^{N-1} \frac{1}{\lambda_i^2} \\

\phi(t) = \prod_{i=1}^{N-1}\frac{\lambda_i}{\lambda_i - t}$$
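As a quick Monte Carlo sanity check of the mean formula (a sketch only, using the arbitrarily chosen Yule rates \(\lambda_n = n\lambda\), which are not specified in the problem):

```python
import random

# Monte Carlo check of E[time from size 1 to N] = sum 1/lambda_i,
# with the assumed concrete rates lambda_n = n * lam (Yule process).
random.seed(42)
lam, N = 1.0, 6
n_runs = 200_000

total = 0.0
for _ in range(n_runs):
    t = 0.0
    for i in range(1, N):          # T_i ~ Exponential(lambda_i), independent
        t += random.expovariate(i * lam)
    total += t
sim_mean = total / n_runs

exact_mean = sum(1.0 / (i * lam) for i in range(1, N))  # sum of 1/lambda_i
print(sim_mean, exact_mean)
```

With these rates the exact mean is the harmonic sum \(1 + 1/2 + \dots + 1/(N-1)\), and the simulated value should agree to two decimal places.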

**5.6** Verify the formula

$$A(t) = a_0 + \int_0^t X(s)ds,$$

given in Example 5.3(B).

Regard this as a renewal reward process in which every individual earns reward at rate 1 per unit time; the total reward earned by time \(t\) then equals the total age of the population. Since the initial individual has age \(a_0 + t\) and each individual born at time \(s \leq t\) has age \(t - s\),

$$A(t) = a_0 + t + \int_0^t (X(s) - 1) ds = a_0 + \int_0^t X(s)ds$$
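The identity can be checked path by path on a simulated pure birth population (a sketch; the rate, horizon, and initial age below are arbitrary):

```python
import random

# Path-by-path check of A(t) = a_0 + int_0^t X(s) ds for a pure birth
# (Yule) population with no deaths; a0 is the initial individual's age.
random.seed(7)
lam, t_end, a0 = 1.0, 3.0, 0.5

birth_times = []
t, pop = 0.0, 1
while True:
    t += random.expovariate(pop * lam)   # next birth at total rate pop*lam
    if t >= t_end:
        break
    birth_times.append(t)
    pop += 1

# Total age at t_end: the initial member plus everyone born in (0, t_end)
A = (a0 + t_end) + sum(t_end - s for s in birth_times)

# int_0^t X(s) ds computed from the piecewise-constant population path
times = [0.0] + birth_times + [t_end]
integral = 0.0
pop_level = 1
for a, b in zip(times, times[1:]):
    integral += pop_level * (b - a)
    pop_level += 1

print(A, a0 + integral)
```

The two quantities agree exactly on every sample path, which is the content of the identity.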

**5.7** Consider a Yule process starting with a single individual and suppose that with probability \(P(s)\) an individual born at time \(s\) will be robust. Compute the distribution of the number of robust individuals born in \((0, t)\).

Let \(N(t)\) denote the number of individuals born in \((0, t)\), and let \(R(t)\) be the number of those that are robust. Then

$$\begin{align}

&P\{R(t) = k\} \\

=& \sum_{n=k}^{\infty}P\{R(t) = k|N(t) = n\}P\{N(t) = n\}\\

=& \sum_{n=k}^{\infty}{n \choose k}\left[\int_0^t f(s)P(s)ds\right]^k\left[ 1- \int_0^t f(s)P(s)ds\right]^{n-k} e^{-\lambda t}(1 - e^{-\lambda t})^{n}

\end{align}$$

where \(f(s) = \frac{\lambda e^{-\lambda(t-s)}}{1 - e^{-\lambda t}}\) is the conditional density of each birth time given \(N(t) = n\).

**5.9** Prove Lemma 5.4.2.

$$\begin{align}

P_{ij}(t+s) &= P\{X(t+s) = j | X(0) = i\} \\

&= \sum_{k=0}^{\infty} P\{X(t+s) = j, X(t) = k | X(0) = i\} \\

&= \sum_{k=0}^{\infty} P\{X(t+s) = j| X(t) = k, X(0) = i\}P\{X(t) = k | X(0) = i\}\\

&= \sum_{k=0}^{\infty} P\{X(t+s) = j| X(t) = k\}P\{X(t) = k | X(0) = i\}\\

&= \sum_{k=0}^{\infty} P_{kj}(s)P_{ik}(t)\\

\end{align}$$
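The lemma (Chapman-Kolmogorov) can be spot-checked numerically for the two-state chain with rates \(v_0 = \lambda\), \(v_1 = \mu\), whose transition function has the standard closed form used below (parameter values are arbitrary; a sketch only):

```python
import math

# Numeric check of Chapman-Kolmogorov: P(t+s) = P(t) P(s) for the
# two-state chain with rates v_0 = lam, v_1 = mu.
lam, mu = 2.0, 3.0

def P(t):
    """Transition matrix of the two-state chain at time t (closed form)."""
    r = lam + mu
    p00 = mu / r + lam / r * math.exp(-r * t)
    p10 = mu / r - mu / r * math.exp(-r * t)
    return [[p00, 1 - p00], [p10, 1 - p10]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

t, s = 0.7, 1.3
lhs, rhs = P(t + s), matmul(P(t), P(s))
err = max(abs(lhs[i][j] - rhs[i][j]) for i in range(2) for j in range(2))
print(err)
```

The maximal entrywise discrepancy is at the level of floating-point roundoff.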

**5.10** Let \(P(t) = P_{00}(t)\).**(a)** Find \(\lim_{t \to 0}\frac{1 – P(t)}{t}\).**(b)** Show that \(P(t)P(s) \leq P(t+s) \leq 1 – P(s) + P(s)P(t)\).**(c)** Show \(|P(t) – P(s)| \leq 1 – P(t-s), \quad s < t\), and conclude that \(P\) is continuous.

**(a)** \(\lim_{t \to 0}\frac{1 - P(t)}{t} = v_0\), the rate at which the chain leaves state 0. **(b)** $$\begin{align}

P(t)P(s) &\leq \sum_{k=0}^{\infty}P_{0k}(t)P_{k0}(s) = P(t+s)\\

&= \sum_{k=1}^{\infty}P_{0k}(s)P_{k0}(t) + P_{00}(t)P_{00}(s) \\

&\leq \sum_{k=1}^{\infty}P_{0k}(s) + P(s)P(t) \\

&= 1 - P(s) + P(s)P(t)

\end{align}$$**(c)** From (b) we have

$$P(s)P(t-s) \leq P(t) \leq 1 - P(t-s) + P(t-s)P(s)\\

P(s)(P(t-s) - 1) \leq P(t) - P(s) \leq (1 - P(s))(1 - P(t-s)) \\

P(t-s) - 1 \leq P(t) - P(s) \leq 1 - P(t-s) $$

From the inequality, we have

$$\lim_{t \to s}|P(t) - P(s)| \leq \lim_{t \to s} 1 - P(t-s) = 1 - P(0) = 0$$
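A quick numeric spot-check of the part (b) inequalities, using the closed-form \(P_{00}(t)\) of a two-state chain (the rates below are arbitrary; a sketch only):

```python
import math

# Spot-check of P(t)P(s) <= P(t+s) <= 1 - P(s) + P(s)P(t) for
# P(t) = P_00(t) of the two-state chain with rates lam, mu.
lam, mu = 1.5, 0.8
r = lam + mu

def P(t):
    """P_00(t) for the two-state chain (standard closed form)."""
    return mu / r + lam / r * math.exp(-r * t)

ok = True
for t in [0.1, 0.5, 1.0, 2.0]:
    for s in [0.2, 0.7, 1.5]:
        lower = P(t) * P(s)
        upper = 1 - P(s) + P(s) * P(t)
        ok = ok and (lower - 1e-12 <= P(t + s) <= upper + 1e-12)
print(ok)
```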

**5.11** For the Yule process**(a)** Verify that

$$P_{ij}(t) = {j – 1 \choose i – 1}e^{-i\lambda t}(1 – e^{-\lambda t})^{j-i}$$

satisfies the forward and backward equations.**(b)** Suppose that \(X(0) = 1\) and that at time \(T\) the process stops and is replaced by an emigration process in which departures occur in a Poisson process of rate \(\mu\). Let \(\tau\) denote the time taken after \(T\) for the population to vanish. Find the density function of \(\tau\) and show that \(E[\tau] = e^{\lambda T}/\mu\).

**(a)** $$\begin{align}

P_{ij}^{\prime}(t) &= {j-1 \choose i-1} \lambda e^{-i\lambda t} [-i(1 - e^{-\lambda t})^{j-i} + (j-i)e^{-\lambda t}(1 - e^{-\lambda t})^{j-i-1}] \\

&= {j-1 \choose i-1} \lambda e^{-i\lambda t}(1 - e^{-\lambda t})^{j-i-1}(je^{-\lambda t} - i)\\

&\sum_{k \neq i} q_{ik}P_{kj}(t) - v_iP_{ij}(t) \\

=&q_{i,i+1}P_{i+1, j}(t) - v_iP_{ij}(t) \\

=&i\lambda[{j - 1 \choose i}e^{-i\lambda t}e^{-\lambda t}(1 - e^{-\lambda t})^{j-i -1} - {j - 1 \choose i - 1}e^{-i\lambda t}(1 - e^{-\lambda t})^{j-i}]\\

=&i\lambda e^{-i\lambda t}(1 - e^{-\lambda t})^{j-i-1} [{j-1 \choose i}e^{-\lambda t} - {j-1 \choose i - 1}(1 - e^{-\lambda t})] \\

=&{j-1 \choose i-1} \lambda e^{-i\lambda t}(1 - e^{-\lambda t})^{j-i-1}(je^{-\lambda t} - i)\\

&\sum_{k \neq j} q_{kj}P_{ik}(t) - v_jP_{ij}(t) \\

=&q_{j-1,j}P_{i, j-1}(t) - v_jP_{ij}(t) \\

=&\lambda[(j-1){j - 2 \choose i - 1}e^{-i\lambda t}(1 - e^{-\lambda t})^{j-1-i} - j{j - 1 \choose i - 1}e^{-i\lambda t}(1 - e^{-\lambda t})^{j-i}]\\

=&\lambda e^{-i\lambda t}(1 - e^{-\lambda t})^{j-i-1} [(j-1){j-2 \choose i - 1} - j{j-1 \choose i - 1}(1 - e^{-\lambda t})] \\

=&{j-1 \choose i-1} \lambda e^{-i\lambda t}(1 - e^{-\lambda t})^{j-i-1}(je^{-\lambda t} - i)\\

\end{align}$$**(b)** Given \(X(T) = n\), \(\tau\) is the time of the \(n\)th event of a Poisson process with rate \(\mu\) and so has a gamma\((n, \mu)\) density. Thus the density of \(\tau\) is $$\begin{align}

f_{\tau}(t) &= \sum_{n=1}^{\infty} f_{\tau|X(T)}(t \mid n)P\{X(T) = n\} \\

&= \sum_{n=1}^{\infty} \mu e^{-\mu t}\frac{(\mu t)^{n-1}}{(n-1)!}e^{-\lambda T}(1 - e^{-\lambda T})^{n-1} \end{align}$$

Let \(t_i\) denote the time between the \((i-1)\)th and \(i\)th departures, so the \(t_i\) are i.i.d. Exponential(\(\mu\)). Since \(X(T)\) is independent of the departure process, Wald's equation gives

$$E[\tau] = E\left[\sum_{i=1}^{X(T)} t_i\right] = E[X(T)]E[t_1] = e^{\lambda T}/\mu $$
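A Monte Carlo sketch of \(E[\tau] = e^{\lambda T}/\mu\) (the parameter values below are arbitrary):

```python
import math
import random

# Monte Carlo check of E[tau] = exp(lam*T)/mu: grow a Yule process
# from X(0) = 1 until time T, then let the population depart in a
# Poisson process of rate mu and measure the time to extinction.
random.seed(1)
lam, mu, T = 1.0, 2.0, 1.0
n_runs = 100_000

total = 0.0
for _ in range(n_runs):
    t, pop = 0.0, 1
    while True:                          # births until time T
        t += random.expovariate(pop * lam)
        if t >= T:
            break
        pop += 1
    # pop departures, each an Exponential(mu) inter-departure time
    total += sum(random.expovariate(mu) for _ in range(pop))
sim = total / n_runs

exact = math.exp(lam * T) / mu
print(sim, exact)
```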

**5.12** Suppose that the “state” of the system can be modeled as a two-state continuous-time Markov chain with transition rates \(v_0 = \lambda, v_1 = \mu\). When the state of the system is \(i\), “events” occur in accordance with a Poisson process with rate \(\alpha_i, i = 0, 1\). Let \(N(t)\) denote the number of events in \((0, t)\).**(a)** Find \(\lim_{t \to \infty}N(t)/t\).**(b)** If the initial state is state 0, find \(E[N(t)]\).

**(a)** Define a renewal reward process as follows: a renewal occurs each time the process enters state 0, and the reward earned in a cycle is the number of events in that cycle. The length \(X_n\) of the \(n\)th cycle is the sum of the times spent in states 0 and 1, say \(X_{0n}\) and \(X_{1n}\) respectively. Thus \(E[X_n] = E[X_{0n}] + E[X_{1n}] = \lambda^{-1} + \mu^{-1}\). Further, if \(R_n\) is the reward in the \(n\)th cycle, with \(R_{0n}\) and \(R_{1n}\) earned in states 0 and 1 respectively,

$$\begin{align}

E[R_n] &= E[R_{0n}] + E[R_{1n}] = E[E[R_{0n}| X_{0n}] + E[R_{1n}|X_{1n}]] \\

&= E[\alpha_0X_{0n}] + E[\alpha_1X_{1n}]\\

&= \alpha_0/\lambda + \alpha_1/\mu.

\end{align}$$Thus

$$\lim_{t \to \infty} \frac{N(t)}{t} = \lim_{t \to \infty} \frac{R(t)}{t} = \frac{E[R]}{E[X]} = \frac{\alpha_0\mu + \alpha_1\lambda}{\lambda + \mu}$$**(b)** Let \(T_i(t)\) denote the time spent in state \(i\) by the time \(t\).

$$\begin{align}

E[N(t)] &= \alpha_0E[T_0(t)] + \alpha_1(t - E[T_0(t)])\\

&= (\alpha_0 - \alpha_1)E[T_0(t)] + \alpha_1t \\

&= \alpha_1t + (\alpha_0 - \alpha_1)\int_0^t P_{00}(s)ds

\end{align}$$

By the forward equation, \(P_{00}^{\prime}(t) = -(\lambda + \mu)P_{00}(t) + \mu\), with initial condition \(P_{00}(0) = 1\). Solving this gives

$$\begin{align}

P_{00}(t) &= \frac{\mu}{\lambda + \mu} + \frac{\lambda}{\lambda + \mu}e^{-(\lambda + \mu)t} \\

E[N(t)] &= \frac{\alpha_0\mu + \alpha_1\lambda}{\lambda + \mu}t + \frac{\alpha_1 - \alpha_0}{(\lambda + \mu)^2}\lambda(e^{-(\lambda + \mu)t} - 1).

\end{align}$$
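The closed form for \(E[N(t)]\) can be checked against a direct numerical integration of \(P_{00}(s)\) (a sketch; parameter values arbitrary):

```python
import math

# Consistency check for part (b): integrate P_00(s) numerically and
# compare alpha_1*t + (alpha_0 - alpha_1) * integral with the closed form.
lam, mu, a0, a1 = 1.0, 2.0, 3.0, 5.0
r = lam + mu
t = 2.0

def P00(s):
    return mu / r + lam / r * math.exp(-r * s)

# trapezoidal rule on a fine grid over [0, t]
n = 100_000
h = t / n
integral = h * (0.5 * P00(0) + sum(P00(k * h) for k in range(1, n)) + 0.5 * P00(t))

numeric = a1 * t + (a0 - a1) * integral
closed = (a0 * mu + a1 * lam) / r * t + (a1 - a0) * lam / r**2 * (math.exp(-r * t) - 1)
print(numeric, closed)
```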

**5.13** Consider a birth and death process with birth rates \(\{\lambda_n\}\) and death rates \(\{\mu_n\}\). Starting in state \(i\), find the probability that the first \(k\) events are all births.

$$\prod_{n=i}^{i+k-1}\frac{\lambda_n}{\lambda_n + \mu_n}$$
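A simulation sketch of this product formula, assuming the concrete linear rates \(\lambda_n = n\lambda\), \(\mu_n = n\mu\) (the problem itself leaves the rates general):

```python
import random

# Monte Carlo check of P{first k events are all births} starting in
# state i, with the assumed linear rates lambda_n = n*lam, mu_n = n*mu.
random.seed(3)
lam, mu, i, k = 1.0, 0.5, 2, 3
n_runs = 200_000

hits = 0
for _ in range(n_runs):
    n = i
    all_births = True
    for _ in range(k):
        # next event is a birth with probability lambda_n/(lambda_n + mu_n)
        if random.random() < (n * lam) / (n * lam + n * mu):
            n += 1
        else:
            all_births = False
            break
    hits += all_births
sim = hits / n_runs

exact = 1.0
for n in range(i, i + k):
    exact *= (n * lam) / (n * lam + n * mu)
print(sim, exact)
```

With linear rates the per-event birth probability is constant, \(\lambda/(\lambda+\mu)\), so the product reduces to \((\lambda/(\lambda+\mu))^k\).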

**5.15** Consider a population in which each individual independently gives birth at an exponential rate \(\lambda\) and dies at an exponential rate \(\mu\). In addition, new members enter the population in accordance with a Poisson process with rate \(\theta\). Let \(X(t)\) denote the population size at time \(t\).**(a)** What type of process is \(\{X(t), t \geq 0\}\)?**(b)** What are its parameters?**(c)** Find \(E[X(t)|X(0)=i]\).

**(a)(b)** This is a linear birth/death process with immigration, having parameters \(\mu_n = n\mu\) and \(\lambda_n = n\lambda + \theta\).**(c)** Note that

$$E[X(t+h)|X(0)] = E[X(t)|X(0)] + (\lambda - \mu)E[X(t)|X(0)]h + \theta h + o(h)$$

Defining \(M(t) \equiv E[X(t)|X(0)]\) we get the differential equation:

$$M^{\prime}(t) = (\lambda - \mu)M(t) + \theta$$Solving with \(M(0) = i\) gives

$$M(t) = \left\{\begin{array}{ll}

\theta t + i, & \lambda = \mu \\

\left(i+\frac{\theta}{\lambda - \mu}\right)e^{(\lambda-\mu)t}-\frac{\theta}{\lambda-\mu}, & \lambda \neq \mu \\ \end{array}\right. $$
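A quick numeric check that the \(\lambda \neq \mu\) branch satisfies \(M^{\prime}(t) = (\lambda - \mu)M(t) + \theta\) with \(M(0) = i\) (parameter values arbitrary):

```python
import math

# Verify the closed-form M(t) against the ODE M'(t) = (lam-mu)M(t) + theta
# using a central finite difference for the derivative.
lam, mu, theta, i = 2.0, 1.0, 0.5, 3

def M(t):
    c = theta / (lam - mu)
    return (i + c) * math.exp((lam - mu) * t) - c

h = 1e-6
t = 0.8
deriv = (M(t + h) - M(t - h)) / (2 * h)      # central difference
rhs = (lam - mu) * M(t) + theta
print(deriv, rhs, M(0.0))
```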

**5.16** In Example 5.4(D), find the variance of the number of males in the population at time \(t\).

$$\begin{align}

&E[Y^2(t+h)|X(t), Y(t)]\\

=& [Y(t) + 1]^2\lambda(1-p)X(t)h + [Y(t) - 1]^2vY(t)h\\ +& Y^2(t)[1 -\lambda(1-p)X(t)h - vY(t)h] + o(h)\\

=& Y^2(t) - 2vhY^2(t) + 2\lambda(1-p)hY(t)X(t) + vY(t)h + \lambda(1-p)X(t)h + o(h)

\end{align}$$

Let \(M_{Y2}(t)\) denote \(E[Y^2(t)]\). Subtracting \(Y^2(t)\), dividing by \(h\), taking expectations, and letting \(h \to 0\) gives

$$M_{Y2}^{\prime}(t) = 2\lambda(1-p)M_{XY}(t) + \lambda(1-p)M_X(t) - 2vM_{Y2}(t) + vM_Y(t)$$

where \(M_X(t) = E[X(t)]\), \(M_Y(t) = E[Y(t)]\), and \(M_{XY}(t) = E[X(t)Y(t)]\).

**5.17** Let \(A\) be a specified set of states of a continuous-time Markov chain and \(T_i(t)\) denote the amount of time spent in \(A\) during the time interval \([0,t]\) given that the chain begins in state \(i\). Let \(Y_1, \dots, Y_n\) be independent exponential random variables with mean \(\lambda\). Suppose the \(Y_i\) are independent of the Markov chain, and set \(t_i(n) = E[T_i(Y_1 + \dots + Y_n)]\).**(a)** Derive a set of linear equations for \(t_i(1), i \geq 0\).**(b)** Derive a set of linear equations for \(t_i(n)\) in terms of the other \(t_j(n)\) and \(t_i(n-1)\).**(c)** When \(n\) is large, for what value of \(\lambda\) is \(t_i(n)\) a good approximation of \(E[T_i(t)]\).

**5.18** Consider a continuous-time Markov chain with \(X(0) = 0\). Let \(A\) denote a set of states that does not include 0 and set \(T = \min\{t > 0: X(t) \in A\}\). Suppose that \(T\) is finite with probability 1. Set \(q_i = \sum_{j \in A}q_{ij}\), and consider the random variable \(H = \int_0^Tq_{X(t)}dt\), called the random hazard.**(a)** Find the hazard rate function of \(H\). That is, find \(\lim_{h \to 0} P\{s < H < s+h|H>s\}/h\) (*Hint: Condition on the state of the chain at the time \(\tau\) when \(\int_0^{\tau}q_{X(t)}dt = s\)*).**(b)** What is the distribution of \(H\)?

**5.19** Consider a continuous-time Markov chain with stationary probabilities \(\{P_i, i \geq 0\}\), and let \(T\) denote the first time that the chain has been in state 0 for \(t\) consecutive time units. Find \(E[T|X(0)=0]\).

**5.20** Each individual in a biological population is assumed to give birth at an exponential rate \(\lambda\) and to die at an exponential rate \(\mu\). In addition, there is an exponential rate of increase \(\theta\) due to immigration. However, immigration is not allowed when the population size is \(N\) or larger.**(a)** Set this up as a birth and death model.**(b)** If \(N = 3\), \(\theta = \lambda = 1\), \(\mu = 2\), determine the proportion of time that immigration is restricted.

**5.21** A small barbershop, operated by a single barber, has room for at most two customers. Potential customers arrive at a Poisson rate of three per hour, and the successive service times are independent exponential random variables with mean 1/4 hour.**(a)** What is the average number of customers in the shop?**(b)** What is the proportion of potential customers that enter the shop?**(c)** If the barber could work twice as fast, how much more business would she do?

**5.22** Find the limiting probabilities for the M/M/s system and determine the condition needed for these to exist.

**5.23** If \(\{X(t), t \geq 0\}\) and \(\{Y(t), t \geq 0\}\) are independent time-reversible Markov chains, show that the process \(\{X(t), Y(t), t \geq 0\}\) is also.

**5.24** Consider two M/M/1 queues with respective parameters \(\lambda_i, \mu_i, i=1,2\). Suppose they both share the same waiting room, which has finite capacity \(N\). (That is, whenever this room is full all potential arrivals to either queue are lost.) Compute the limiting probability that there will be \(n\) people at the first queue (1 being served and \(n-1\) in the waiting room when \(n > 0\)) and \(m\) at the second. *(Hint: Use the result of Problem 5.23.)*

**5.25** What can you say about the departure process of the stationary M/M/1 queue having finite capacity.

**5.26** In the stochastic population model of Section 5.6.2:**(a)** Show that \(P(\underline{n})q(\underline{n},D_j\underline{n}) = P(D_j\underline{n})q(D_j\underline{n},\underline{n})\), where \(P(\underline{n})\) is given by (5.6.4) with \(\alpha_j = (\lambda/jv)(v/\mu)^j\).**(b)** Let \(D(t)\) denote the number of families that die out in \((0,t)\). Assuming that the process is in steady state 0 at time \(t = 0\), what type of stochastic process is \(\{D(t), t \geq 0\}\)? What if the population is initially empty at \(t = 0\)?

**5.27** Complete the proof of the conjecture in the queueing network model of Section 5.7.1

**5.28** \(N\) customers move about among \(r\) servers. The service times at server \(i\) are exponential at rate \(\mu_i\), and when a customer leaves server \(i\) it joins the queue (if there are any waiting; or else it enters service) at server \(j, j \neq i\), with probability \(1/(r-1)\). Let the state be \((n_1, \dots, n_r)\) when there are \(n_i\) customers at server \(i, i = 1, \dots, r\). Show that the corresponding continuous-time Markov chain is time reversible and find the limiting probabilities.

**5.29** Consider a time-reversible continuous-time Markov chain having parameters \(v_i, P_{ij}\) and having limiting probabilities \(P_j, j \geq 0\). Choose some state–say state 0–and consider the new Markov chain, which makes state 0 an absorbing state. That is, reset \(v_0\) to equal 0. Suppose now at time points chosen according to a Poisson process with rate \(\lambda\), Markov chains–all of the above type (having 0 as an absorbing state)–are started with the initial states chosen to be \(j\) with probability \(P_{0j}\). All the existing chains are assumed to be independent. Let \(N_j(t)\) denote the number of chains in state \(j, j > 0\), at time \(t\).**(a)** Argue that if there are no initial chains, then \(N_j(t), j > 0\), are independent Poisson random variables.**(b)** In steady state argue that the vector process \(\{(N_1(t), N_2(t), \dots)\}\) is time reversible with stationary probabilities

$$P(\underline{n}) = \prod_{j=1}^{\infty}e^{-\alpha_j}\frac{\alpha_j^{n_j}}{n_j!}, \quad \text{for } \underline{n} = (n_1, n_2, \dots),$$

where \(\alpha_j = \lambda P_j/(v_0 P_0)\).

**5.30** Consider an \(M/M/\infty\) queue with channels (servers) numbered \(1, 2, \dots\) On arrival, a customer will choose the lowest numbered channel that is free. Thus, we can think of all arrivals as occurring at channel 1. Those who find channel 1 busy overflow and become arrivals at channel 2. Those finding both channels 1 and 2 busy overflow channel 2 and become arrivals at channel 3, and so on.**(a)** What fraction of time is channel 1 busy?**(b)** By considering the corresponding M/M/2 loss system, determine the fraction of time that channel 2 is busy.**(c)** Write down an expression for the fraction of time channel \(c\) is busy for arbitrary \(c\).**(d)** What is the overflow rate from channel \(c\) to channel \(c+1\)? Is the corresponding overflow process a Poisson process? A renewal process? Explain briefly.**(e)** If the service distribution were general rather than exponential, which (if any) of your answers to (a)-(d) would change? Explain briefly.

**5.31** Prove Theorem 5.7.1

**5.32** **(a)** Prove that a stationary Markov process is reversible if, and only if, its transition rates satisfy

$$q(j_1, j_2)q(j_2, j_3)\dots q(j_{n-1}, j_n)q(j_n, j_1)=q(j_1, j_n)q(j_n, j_{n-1})\dots q(j_3, j_2)q(j_2, j_1) $$for any finite sequence of states \(j_1, j_2, \dots, j_n\).**(b)** Argue that it suffices to verify that the equality in (a) holds for sequences of distinct states.**(c)** Suppose that the stream of customers arriving at a queue forms a Poisson process of rate \(v\) and that there are two servers who possibly differ in efficiency. Specifically, suppose that a customer’s service time at server \(i\) is exponentially distributed with rate \(\mu_i\), for \(i=1, 2\), where \(\mu_1 + \mu_2 > v\). If a customer arrives to find both servers free, he is equally likely to be allocated to either server. Define an appropriate continuous-time Markov chain for this model, show that it is time reversible, and determine the limiting probabilities.

**5.33** The work in a queueing system at any time is defined as the sum of the remaining service times of all customers in the system at that time. For the M/G/1 queue in steady state, compute the mean and variance of the work in the system.

**5.36** Consider the two-state Markov chain of Example 5.8(A), with \(X(0) = 0\).**(a)** Compute \(Cov(X(s), X(y))\).**(b)** Let \(S_0(t)\) denote the occupation time of state 0 by \(t\). Use (a) and (5.8.4) to compute \(Var(S_0(t))\).

**5.37** Let \(Y_i = X_{(i)} – X_{(i-1)}, i = 1, \dots, n+1\) where \(X_{(0)} = 0\), \(X_{(n+1)} = t\), and \(X_{(1)} \leq X_{(2)} \leq \dots \leq X_{(n)}\) are the ordered values of a set of \(n\) independent uniform \((0,t)\) random variables. Argue that \(P\{Y_i \leq y_i, i=1, \dots, n+1\}\) is a symmetric function of \(y_1, \dots, y_{n+1}\).