Why Is the Bus I'm Waiting for Always Late?

Mathematical Proof

\begin{align} P\{X_{N(t)+1} \geq x\} &= \int_0^{\infty} P\{X_{N(t)+1} \geq x | S_{N(t)} = s \}dF_{S_{N(t)}}(s)\\ &= \int_0^{\infty} P\{X_{N(t)+1} \geq x | X_{N(t) + 1} > t - s \}dF_{S_{N(t)}}(s) \\ &= \int_0^{\infty} \frac{P\{X_{N(t)+1} \geq x, X_{N(t) + 1} > t - s \}}{P\{X_{N(t) + 1} > t - s \}}dF_{S_{N(t)}}(s) \\ &= \int_0^{\infty} \frac{1 - F(\max\{x, t-s\})}{1 - F(t-s)}dF_{S_{N(t)}}(s) \\ &= \int_0^{\infty} \min\left\{\frac{1 - F(x)}{1 - F(t-s)}, \frac{1 - F(t-s)}{1 - F(t-s)}\right\}dF_{S_{N(t)}}(s) \\ &= \int_0^{\infty} \min\left\{\frac{1 - F(x)}{1 - F(t-s)}, 1\right\}dF_{S_{N(t)}}(s) \\ &\geq \int_0^{\infty} (1 - F(x))\,dF_{S_{N(t)}}(s) \\ &= \bar{F}(x) \end{align}

In particular, suppose the interarrival times are exponential (the Poisson process): $$F(x) = 1 - e^{-\lambda x}$$. Then

Here $$dF_{S_{N(t)}}(s)$$ has an atom $$e^{-\lambda t}$$ at $$s = 0$$ and density $$\bar{F}(t-s)\,dm(s) = \lambda e^{-\lambda(t-s)}\,ds$$ on $$(0, t)$$, where $$m(s) = \lambda s$$ is the renewal function. For $$0 \leq x \leq t$$,
\begin{align} P\{X_{N(t)+1} \geq x\} &= \int_0^{t-x} dF_{S_{N(t)}}(s) + \int_{t-x}^{t} \frac{e^{-\lambda x}}{e^{-\lambda(t-s)}}dF_{S_{N(t)}}(s) \\ &= e^{-\lambda t} + \int_0^{t-x}\lambda e^{-\lambda(t-s)}ds + \int_{t-x}^{t} e^{-\lambda x}dm(s) \\ &= e^{-\lambda t} + (e^{-\lambda x} - e^{-\lambda t}) + \lambda x e^{-\lambda x} \\ &= (1 + \lambda x)e^{-\lambda x} > e^{-\lambda x} = \bar{F}(x) \end{align}
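The inequality can be checked numerically. Below is a minimal Monte Carlo sketch (the rate $$\lambda = 1$$, observation time $$t = 20$$, and trial count are arbitrary choices): the interarrival interval that happens to contain $$t$$ averages about $$2/\lambda$$, twice an ordinary interarrival time, which is exactly the waiting passenger's bias.

```python
# Monte Carlo check of the inspection paradox for a rate-1 Poisson process:
# the interarrival interval covering a fixed time t has mean close to
# 2/lam, twice the mean 1/lam of an ordinary interarrival time.
import random

random.seed(42)
lam, t, trials = 1.0, 20.0, 20000
total = 0.0
for _ in range(trials):
    prev = 0.0
    arrival = random.expovariate(lam)
    while arrival <= t:               # walk arrivals until we pass time t
        prev = arrival
        arrival += random.expovariate(lam)
    total += arrival - prev           # length of the interval containing t
mean_covering = total / trials        # ~ 2/lam, versus E[X] = 1/lam
```

With these parameters the estimate lands near 2, not 1: the bus rider samples intervals with a length bias.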

The Economics Dilemma

Reference:

Ross, S. M. (1996). Stochastic Processes (2nd ed.). New York: Wiley.

Errata for Stochastic Processes (Chinese translation)

p9, Example 1.3(C): a random variable takes at least one value as large as its mean

p30, Problem 1.17: among the ith smallest of …

p31, Problem 1.22: $$Var(X) = E[Var(X|Y)] + Var(E[X|Y])$$

p31, Problem 1.23: let a denote the particle …

p33, Problem 1.43: this problem is missing the condition $$t \geq 0$$; although neither this translation nor the original book states it, for $$t < 0$$ it is easy to find a counterexample for which the inequality fails

p40, last paragraph: … the kth smallest value in …

p80, Example 3.5(C), line 5: until two tails appear in the tosses

p82, line 4: where the initial distribution is the distribution of $$Y_D(t)$$

p109, 4th line from the bottom: letting $$n$$ tend to 0 and then letting $$M$$ tend to $$\infty$$ yields …

p117, 9th line from the bottom: and N is a … stopping time

p128, line 7: then summing over $$j$$ yields

p130, line 2: the probability of moving to its leaves

p131, Theorem 4.7.2, line 2: the redundant $$i_1, i_2$$ here should be deleted

p146, Example 5.3(A): each individual in the population is assumed to give birth at an exponential rate $$\lambda$$

p156, Section 5.5, line 4: then the limiting probabilities are $$P_j = \lim_{t \to \infty}P_{ij}(t)$$

p185, further examples of martingales (4): then as shown in Section 1.9

p192, Example 6.3(B): ends its empty state

p215, line 4: $$P\{$$ever crosses $$A\} \leq e^{-\theta A}$$

p215, 8th line from the bottom: $$X_{n+1} + \sum_{i=1}^{n-1}(Y_i - X_{i+1})$$

p217, line 5: $$S_n = \sum_{i=1}^{n}(Y_i - cX_i)$$

p223, line 18: independent of all values of the process before time $$t$$

p302, answer to 3.17, line 3: $$g = h + h * F = (h + g*F)*F_2$$

p305: the answer printed for 4.13 should be the answer to Problem 3.33

p305, answer to 4.13, line 5: $$\lim_{k \to \infty}\frac{\text{number of visits to }j\text{ by time }N_k+m}{n}\frac{n}{N_k + m}$$

$$\lim_{n \to \infty}\frac{\text{number of visits to } j\text{ by time }N_n + m}{n}\frac{n}{N_n + m}$$

p308, answer to 5.3: $$P\{N(t) \geq n\} \leq \sum_{j=n}^{\infty}e^{-Mt}\frac{(Mt)^j}{j!}$$

p309, answer to 5.4, last line: $$P_{ij}(t) = v_iP_{ij}t + o(t)$$

Solutions to Stochastic Processes Ch.8

Solutions to Stochastic Processes, Sheldon M. Ross, Second Edition (pdf)
Since there is no official solution manual for this book, I handcrafted the solutions myself. Some solutions were adapted from the web; their original authors are often unknown and cannot be credited individually. Many thanks to those authors! I hope these solutions are helpful, but no correctness or accuracy is guaranteed. Comments are welcome. Excerpts and links may be used, provided that full and clear credit is given.

In Problems 8.1, 8.2, and 8.3, let $$\{X(t), t \geq 0\}$$ denote a Brownian motion process.

8.1 Let $$Y(t) = tX(1/t)$$.
(a) What is the distribution of $$Y(t)$$?
(b) Compute $$Cov(Y(s), Y(t))$$.
(c) Argue that $$\{Y(t), t \geq 0\}$$ is also Brownian motion.
(d) Let $$T = \inf\{t>0: X(t)=0\}$$. Using (c) present an argument that $$P\{T = 0\} = 1$$.

(a) $$X(1/t) \sim N(0, 1/t)$$ then $$Y(t) = tX(1/t) \sim N(0, t)$$
(b) \begin{align} Cov(Y(s), Y(t)) &= Cov(sX(1/s), tX(1/t)) \\ &= st\cdot Cov(X(1/s), X(1/t)) \\ &= st\cdot min(1/s, 1/t) = min(s, t) \end{align}
(c) Since $$\{X(t)\}$$ is a Gaussian process so is $$\{Y(t)\}$$. Further from parts (a) and (b) above $$\{Y(t)\}$$ is a Brownian motion.
(d) Since $$\{Y(t)\}$$ is Brownian motion, $$T_1 \equiv \sup\{t: Y(t) = 0\} = \infty$$ with probability 1. The zeros of $$Y$$ at time $$t$$ correspond to zeros of $$X$$ at time $$1/t$$, so $$\{T = 0\} = \{T_1 = \infty\}$$. Thus, $$P\{T = 0\} = 1$$.
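Parts (a) and (b) can be sanity-checked by simulation. A sketch (s = 1, t = 2 and the sample size are arbitrary choices; the pair $$(X(1/t), X(1/s))$$ is drawn from independent Brownian increments):

```python
# Check that Cov(Y(s), Y(t)) = min(s, t) for Y(u) = u * X(1/u), with
# s = 1, t = 2, so the target covariance is 1.
import random

random.seed(0)
s, t, n = 1.0, 2.0, 100000
acc = 0.0
for _ in range(n):
    x_half = random.gauss(0.0, (1.0 / t) ** 0.5)   # X(1/2) ~ N(0, 1/2)
    x_one = x_half + random.gauss(0.0, (1.0 / s - 1.0 / t) ** 0.5)  # X(1)
    acc += (s * x_one) * (t * x_half)              # Y(1) * Y(2), both mean 0
cov_est = acc / n                                  # ~ min(s, t) = 1
```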

8.2 Let $$W(t) = X(a^2t)/a$$ for $$a > 0$$. Verify that $$W(t)$$ is also Brownian motion.

$$W(0) = X(0)/a = 0$$. Non-overlapping increments of $$W(t)$$ map to non-overlapping increments of $$X(t)$$. Thus increments of $$W(t)$$ are independent. Further, for $$s < t$$,
$$W(t) - W(s) = \frac{X(a^2t) - X(a^2s)}{a} \sim N(0, t-s)$$
Thus $$W(t)$$ has stationary increments with required distribution. Therefore, $$W(t)$$ is a Brownian motion.

8.5 A stochastic process $$\{X(t), t \geq 0\}$$ is said to be stationary if $$X(t_1), \dots, X(t_n)$$ has the same joint distribution as $$X(t_1+a), \dots, X(t_n +a)$$ for all $$n, a, t_1, \dots, t_n$$.
(a) Prove that a necessary and sufficient condition for a Gaussian process to be stationary is that $$Cov(X(s), X(t))$$ depends only on $$t-s, s \leq t$$, and $$E[X(t)] = c$$.
(b) Let $$\{X(t), t \geq 0\}$$ be Brownian motion and define
$$V(t) = e^{-\alpha t/2}X(\alpha e^{\alpha t})$$
Show that $$\{V(t), t \geq 0\}$$ is a stationary Gaussian process. It is called the Ornstein-Uhlenbeck process.

(a) If the Gaussian process is stationary then for $$t > s, (X(t), X(s))$$ and $$(X(t-s), X(0))$$ have the same distribution. Thus, $$E[X(s)] = E[X(0)]$$ for all $$s$$ and $$Cov(X(t), X(s)) = Cov(X(t-s), X(0))$$ for all $$t > s$$. Conversely, assume $$E[X(t)] = c$$ and $$Cov(X(t), X(s)) = h(t-s)$$. For any $$T = (t_1, \dots, t_k)$$ define the vector $$X_T \equiv (X(t_1), \dots, X(t_k))$$, and let $$\tilde{T} = (t_1-a, \dots, t_k -a)$$. If $$\{X(t)\}$$ is a Gaussian process then both $$X_T$$ and $$X_{\tilde{T}}$$ are multivariate normal, so it suffices to show that they have the same mean and covariance. This follows directly from the fact that they have the same element-wise mean $$c$$ and equal pairwise covariances, $$Cov(X(t_i-a), X(t_j -a)) = h(t_i-t_j) = Cov(X(t_i), X(t_j))$$.
(b) Since all finite dimensional distributions of $$\{V(t)\}$$ are normal, it is a Gaussian process. Thus from part (a) it suffices to show the following:
(i) $$E[V(t)] = e^{-\alpha t/2}E[X(\alpha e^{\alpha t})] = 0$$. Thus $$E[V(t)]$$ is constant.
(ii) For $$s \leq t$$, \begin{align} Cov(V(s), V(t)) &= e^{-\alpha(t+s)/2}Cov(X(\alpha e^{\alpha s}), X(\alpha e^{\alpha t}))\\ &= e^{-\alpha(t+s)/2}\alpha e^{\alpha s} = \alpha e^{-\alpha(t-s)/2} \end{align}
which depends only on $$t-s$$.
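The stationary covariance can be checked numerically. A sketch (α = 1, s = 0.5, t = 1 are arbitrary choices; the Brownian values at the two inner times are sampled from increments):

```python
# Check Cov(V(s), V(t)) = alpha * exp(-alpha (t - s) / 2) for the
# Ornstein-Uhlenbeck construction V(u) = exp(-alpha u / 2) X(alpha e^{alpha u}).
# With alpha = 1, s = 0.5, t = 1 the target is e^{-1/4} ~= 0.7788.
import math
import random

random.seed(1)
alpha, s, t, n = 1.0, 0.5, 1.0, 100000
t1 = alpha * math.exp(alpha * s)     # inner time for V(s)
t2 = alpha * math.exp(alpha * t)     # inner time for V(t), t2 > t1
acc = 0.0
for _ in range(n):
    x1 = random.gauss(0.0, math.sqrt(t1))             # X(t1)
    x2 = x1 + random.gauss(0.0, math.sqrt(t2 - t1))   # X(t2)
    acc += (math.exp(-alpha * s / 2) * x1) * (math.exp(-alpha * t / 2) * x2)
cov_est = acc / n
target = alpha * math.exp(-alpha * (t - s) / 2)       # e^{-1/4}
```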

8.8 Suppose $$X(1) = B$$. Characterize, in the manner of Proposition 8.1.1, $$\{X(t), 0 \leq t \leq 1\}$$ given that $$X(1) = B$$.

Conditional on $$X(1) = B$$, $$X(t) \sim N(Bt, t(1-t))$$; thus $$Z(t) \equiv X(t) - Bt \sim N(0, t(1-t))$$, so $$\{X(t), 0 \leq t \leq 1\}$$ is a Brownian bridge plus the drift line $$Bt$$.

8.9 Let $$M(t) = \max_{0 \leq s \leq t} X(s)$$ and show that
$$P\{M(t) > a| M(t) = X(t)\} = e^{-a^2/2t}, \quad a > 0$$

From Section 8.3.1, we get
$$P\{M(t) > y, X(t) < x\} = \int_{2y-x}^{\infty}\frac{1}{\sqrt{2\pi t}}e^{-u^2/2t}du$$
By the Jacobian (change-of-variables) formula, we can derive the joint density of $$M(t)$$ and $$W(t) = M(t) - X(t)$$, which we denote by $$f_{MW}$$. Thus
$$f_W(w) = 2\int_0^{\infty}f_{MW}(m, w)dm \\ P\{M(t) > a | W(t) = 0\} = 1 - \int_0^a \frac{f_{MW}(m, 0)}{f_W(0)}dm$$
The last expression can be computed explicitly and equals $$e^{-a^2/2t}$$.

8.10 Compute the density function of $$T_x$$, the time until Brownian motion hits $$x$$.

\begin{align} f_{T_x}(t) &= F_{T_x}^{\prime}(t) = \left(\frac{2}{\sqrt{2\pi}}\int_{|x|/\sqrt{t}}^{\infty}e^{-y^2/2}dy\right)^{\prime} \\ &= -\frac{2}{\sqrt{2\pi}} e^{-x^2/2t} \cdot \left(-\frac{|x|}{2}t^{-3/2}\right)\\ &= \frac{|x|}{\sqrt{2\pi}}t^{-3/2}e^{-x^2/2t} \end{align}
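A deterministic check: integrating this density over $$(0, T]$$ should reproduce the reflection-principle probability $$P\{T_x \leq T\} = 2P\{X(T) \geq |x|\} = \operatorname{erfc}(|x|/\sqrt{2T})$$. The values x = 1, T = 2 and the step size are arbitrary choices.

```python
# Trapezoid-rule integration of the hitting-time density against the
# reflection-principle CDF; target erfc(1/2) ~= 0.4795.
import math

x, T = 1.0, 2.0

def f(t):
    # |x| / sqrt(2 pi) * t^(-3/2) * exp(-x^2 / (2 t))
    return abs(x) / math.sqrt(2 * math.pi) * t ** -1.5 * math.exp(-x * x / (2 * t))

h, eps = 1e-4, 1e-9                      # eps avoids evaluating at t = 0
ts = [eps + i * h for i in range(int((T - eps) / h) + 1)]
integral = sum((f(a) + f(b)) / 2 * (b - a) for a, b in zip(ts, ts[1:]))
target = math.erfc(abs(x) / math.sqrt(2 * T))
```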

8.11 Let $$T_1$$ denote the largest zero of $$X(t)$$ that is less than $$t$$ and let $$T_2$$ be the smallest zero greater than $$t$$. Show that
(a) $$P\{T_2 < s\} = (2/\pi)\arccos\sqrt{t/s}, s> t$$.
(b) $$P\{T_1 < s, T_2 > y\} = (2/\pi)\arcsin\sqrt{s/y}, s < t< y$$.

(a) \begin{align} P\{T_2 < s\} &= 1 - P\{\text{no zeros in } (t, s)\} \\ &= 1 - \frac{2}{\pi}\arcsin\sqrt{t/s} \\ &= (2/\pi)\arccos\sqrt{t/s} \end{align}
(b) $$P\{T_1 < s, T_2 > y\} = P\{\text{no zeros in } (s, y)\} = \frac{2}{\pi}\arcsin\sqrt{s/y}$$

8.12 Verify the formulas given in (8.3.4) for the mean and variance of $$|X(t)|$$.

$$f_Z(y) = \left(\frac{2}{\sqrt{2\pi t}}\int_{-\infty}^y e^{-x^2/2t}dx - 1\right)^{\prime} = \frac{2}{\sqrt{2\pi t}}e^{-y^2/2t}\\ E[Z(t)] = \int_{0}^{\infty}yf_Z(y)dy = -\frac{2t}{\sqrt{2\pi t}}e^{-y^2/2t}\Big|_0^{\infty} = \sqrt{2t/\pi} \\ Var(Z(t)) = E[Z^2(t)] - E^2[Z(t)] = E[X^2(t)] - E^2[Z(t)] = \left(1 - \frac{2}{\pi}\right)t$$
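The two formulas can be verified by direct numerical integration of the half-normal density. A sketch (t = 3 and the quadrature grid are arbitrary choices):

```python
# Trapezoid quadrature of y f_Z(y) and y^2 f_Z(y) against the closed forms
# sqrt(2 t / pi) and E[Z^2] = t, hence Var = (1 - 2/pi) t.
import math

t = 3.0

def f_Z(y):
    return 2.0 / math.sqrt(2 * math.pi * t) * math.exp(-y * y / (2 * t))

h = 1e-3
ys = [i * h for i in range(int(30.0 / h) + 1)]   # density is ~0 past y = 30
mean = sum((a * f_Z(a) + b * f_Z(b)) / 2 * h for a, b in zip(ys, ys[1:]))
second = sum((a * a * f_Z(a) + b * b * f_Z(b)) / 2 * h for a, b in zip(ys, ys[1:]))
var = second - mean * mean
target_mean = math.sqrt(2 * t / math.pi)         # ~ 1.3820
target_var = (1 - 2 / math.pi) * t               # ~ 1.0901
```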

8.13 For Brownian motion with drift coefficient $$\mu$$, show that for $$x>0$$
$$P\{\max_{0 \leq s \leq h} |X(s)| > x\} = o(h).$$

8.18 Let $$\{X(t), t \geq 0\}$$ be a Brownian motion with drift coefficient $$\mu, \mu < 0$$, which is not allowed to become negative. Find the limiting distribution of $$X(t)$$.

8.19 Consider Brownian motion with reflecting barriers of $$-B$$ and $$A, A >0, B > 0$$. Let $$p_t(x)$$ denote the density function of $$X_t$$.
(a) Compute a differential equation satisfied by $$p_t(x)$$.
(b) Obtain $$p(x) = \lim_{t \to \infty} p_t(x)$$.

8.20 Prove that, with probability 1, for Brownian motion with drift $$\mu$$.
$$\frac{X(t)}{t} \to \mu, \quad \text{ as } t \to \infty$$

8.21 Verify that if $$\{B(t), t \geq 0\}$$ is standard Brownian motion then $$\{Y(t), t \geq 0\}$$ is a martingale with mean 1, where $$Y(t) = \exp\{cB(t) - c^2t/2\}$$.

\begin{align} E[Y(t)] &= \int_{-\infty}^{\infty} \exp\{cx - c^2t/2\}\frac{1}{\sqrt{2\pi t}}\exp\{-x^2/2t\}dx\\ &= \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi t}}\exp\{-(x-ct)^2/2t\}dx = 1 \end{align} \begin{align} E[Y(t)|Y(u), 0 \leq u \leq s] &= E[Y(s)\exp\{c(B(t) - B(s)) - c^2(t-s)/2\}|Y(u), 0 \leq u \leq s]\\ &= Y(s) \cdot E[\exp\{c(B(t) - B(s)) - c^2(t-s)/2\}] \\ &= Y(s) \cdot E[Y(t-s)] = Y(s) \end{align}
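The mean-1 property is easy to test by Monte Carlo. A sketch (c = 0.5, t = 1 and the sample size are arbitrary choices):

```python
# Average exp{c B(t) - c^2 t / 2} over draws of B(t) ~ N(0, t);
# the exponential martingale should have mean 1 at every t.
import math
import random

random.seed(7)
c, t, n = 0.5, 1.0, 200000
acc = 0.0
for _ in range(n):
    b = random.gauss(0.0, math.sqrt(t))
    acc += math.exp(c * b - c * c * t / 2)
mean_Y = acc / n                          # should be close to 1
```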

8.22 In Problem 8.16, find $$Var(T_a)$$ by using a martingale argument.

8.23 Show that
$$p(x,t;y) \equiv \frac{1}{\sqrt{2\pi t}}e^{-(x – y – \mu t)^2/2t}$$
satisfies the backward and forward diffusion equations, (8.5.1) and (8.5.2).

Just do it : )

8.24 Verify Equation (8.7.2)

Let $$f(s, y) = \phi(se^{-\alpha y}) - 1$$, so that $$E[\exp\{sX(t)\}] = \exp\{\lambda\int_0^t f(s, y)dy\}$$; then
\begin{align} E[X(t)] &=\frac{d}{ds}E[\exp\{sX(t)\}]\Big|_{s=0} \\ &= \exp\Big\{\lambda\int_0^t f(0, y)dy\Big\} \lambda \int_0^t \frac{d}{ds}f(s, y)\Big|_{s=0} dy \\ &= \lambda E[X](1 - e^{-\alpha t})/\alpha\\ Var(X(t)) &= E[X^2(t)] - E^2[X(t)] \\ &= \frac{d^2}{ds^2}E[\exp\{sX(t)\}]\Big|_{s=0} - E^2[X(t)] \\ &= \lambda E[X^2](1 - e^{-2\alpha t})/2\alpha \end{align}

8.25 Verify that $$\{X(t) = N(t + L) – N(t), t \geq 0\}$$ is stationary when $$\{N(t)\}$$ is a Poisson process.

By stationary increments, $$X(t) = N(t + L) - N(t)$$ has the same distribution as $$N(L)$$ for every $$t$$, and the joint distributions of $$(X(t_1), \dots, X(t_n))$$ are likewise invariant under time shifts. In particular,
$$E[X(t)] = E[N(L)] = \lambda L, \qquad Cov(X(t), X(t+s)) = \lambda (L-s)^+$$
since the two counting windows overlap in an interval of length $$(L-s)^+$$ and the counts over disjoint intervals are independent.
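The covariance structure can be checked by simulation. A sketch (λ = 2, L = 1, s = 0.4, t = 1 are arbitrary choices): the covariance of the two window counts should be the rate times the overlap, $$\lambda(L-s) = 1.2$$.

```python
# Simulate Poisson arrivals and compare Cov(N(t, t+L], N(t+s, t+s+L])
# to lam * (L - s), the rate times the window overlap.
import random

random.seed(3)
lam, L, s, t, n = 2.0, 1.0, 0.4, 1.0, 50000
sum1 = sum2 = sum12 = 0.0
for _ in range(n):
    arrivals, a = [], random.expovariate(lam)
    while a < t + s + L:                  # Poisson arrivals on [0, t+s+L)
        arrivals.append(a)
        a += random.expovariate(lam)
    n1 = sum(1 for u in arrivals if t < u <= t + L)          # X(t)
    n2 = sum(1 for u in arrivals if t + s < u <= t + s + L)  # X(t+s)
    sum1 += n1
    sum2 += n2
    sum12 += n1 * n2
cov_est = sum12 / n - (sum1 / n) * (sum2 / n)   # ~ lam * (L - s) = 1.2
```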

8.26 Let $$U$$ be uniformly distributed over $$(-\pi, \pi)$$, and let $$X_n = cos(nU)$$. By using trigonometric identity
$$\cos x \cos y = \frac{1}{2} [\cos(x+y) + \cos(x-y)]$$
verify that $$\{X_n, n \geq 1\}$$ is a second-order stationary process.

\begin{align} E[X_n] &= \frac{1}{2\pi}\int_{-\pi}^{\pi} \cos(nu)\,du = 0\\ Cov(X_{n+L}, X_n) &= E[X_{n+L}X_n] - E[X_{n+L}]E[X_n] \\ &= \frac{1}{2}E[X_{2n+L} + X_L] = \frac{1}{2}E[X_L] \end{align}
which equals $$1/2$$ for $$L = 0$$ and $$0$$ for $$L \geq 1$$. Since the mean is constant and the covariance depends only on $$L$$, the process is second-order stationary.

8.27 Show that
$$\sum_{i=1}^n \frac{R(i)}{n} \to 0 \quad \text{implies} \quad {\sum\sum}_{i < j < n}\frac{R(j-i)}{n^2} \to 0$$
thus completing the proof of Proposition 8.8.1

8.28 Prove the Cauchy-Schwarz inequality:
$$(E[XY])^2 \leq E[X^2]E[Y^2]$$
(Hint: Start with the inequality $$2|xy| \leq x^2 + y^2$$ and then substitute $$X/\sqrt{E[X^2]}$$ for $$x$$ and $$Y/\sqrt{E[Y^2]}$$ for $$y$$)

Since $$2xy \leq x^2 + y^2$$, then
\begin{align} 2\frac{X}{\sqrt{E[X^2]}}\frac{Y}{\sqrt{E[Y^2]}} &\leq \frac{X^2}{E[X^2]} + \frac{Y^2}{E[Y^2]} \\ E[2\frac{X}{\sqrt{E[X^2]}}\frac{Y}{\sqrt{E[Y^2]}}] &\leq E[\frac{X^2}{E[X^2]} + \frac{Y^2}{E[Y^2]}]\\ 2\frac{E[XY]}{\sqrt{E[X^2]E[Y^2]}} &\leq 2\\ (E[XY])^2 &\leq E[X^2]E[Y^2] \end{align}

8.29 For a second-order stationary process with mean $$\mu$$ for which $$\sum_{i=0}^{n-1}R(i)/n \to 0$$, show that for any $$\varepsilon > 0$$
$$P\{|\bar{X}_n - \mu| > \varepsilon \} \to 0 \quad \text{as } n \to \infty$$

Solutions to Stochastic Processes Ch.7


7.1 Consider the following model for the flow of water in and out of a dam. Suppose that, during day $$n, Y_n$$ units of water flow into the dam from outside sources such as rainfall and river flow. At the end of each day, water is released from the dam according to the following rule: If the water content of the dam is greater than $$a$$, then the amount of $$a$$ is released. If it is less than or equal to $$a$$, then the total contents of the dam are released. The capacity of the dam is $$C$$, and once at capacity any additional water that attempts to enter the dam is assumed lost. Thus, for instance, if the water level at the beginning of day $$n$$ is $$x$$, then the level at the end of the day (before any water is released) is $$min(x + Y_n, C)$$. Let $$S_n$$ denote the amount of water in the dam immediately after the water has been released at the end of day $$n$$. Assuming that the $$Y_n, n \geq 1$$, are independent and identically distributed, show that $$\{S_n, n \geq 1\}$$ is a random walk with reflecting barriers at 0 and $$C-a$$.

7.2 Let $$X_1, \dots, X_n$$ be equally likely to be any of the $$n!$$ permutations of $$(1,2,\dots, n)$$. Argue that,
$$P\{\sum_{j=1}^njX_j \leq a\} = P\{\sum_{j=1}^njX_j\geq n(n+1)^2/2 -a\}$$

Write $$S_i = \sum_{k=1}^i X_k$$ and note that $$S_n = n(n+1)/2$$ and $$\sum_{i=1}^n S_i = \sum_{j=1}^n (n+1-j)X_j$$, which by exchangeability has the same distribution as $$\sum_{j=1}^n jX_j$$. Then
\begin{align} P\{\sum_{j=1}^njX_j \leq a \} &= P\{nS_n - \sum_{i=1}^{n-1}S_i \leq a\} \\ &= P\{\sum_{i=1}^nS_i \geq (n+1)S_n - a\} \\ &= P\{\sum_{j=1}^n(n+1-j)X_j \geq n(n+1)^2/2 - a\} \\ &= P\{\sum_{j=1}^njX_j\geq n(n+1)^2/2 -a\} \end{align}

7.3 For the simple random walk compute the expected number of visits to state $$k$$.

Suppose $$p > 1/2$$ and the walk starts at state $$i$$. Starting from $$k$$, the probability of ever returning to $$k$$ is $$f_{kk} = 2(1-p)$$, so once the walk reaches $$k$$ the total number of visits is geometric with mean $$1/(1 - f_{kk}) = 1/(2p-1)$$. When $$i \leq k$$ the walk reaches $$k$$ with probability $$p_{ik} = 1$$, so
$$E[\text{number of visits to } k] = \frac{1}{2p-1}$$
When $$i > k$$,
$$p_{ik} = \Big(\frac{1-p}{p}\Big)^{i-k}, \qquad E[\text{number of visits to } k] = \frac{p_{ik}}{2p-1}$$
(For $$p = 1/2$$ the walk is recurrent and the expected number of visits is infinite; the case $$p < 1/2$$ follows by symmetry.)
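A quick simulation sketch (p = 0.7, k = 2, start at 0, and the step cap are arbitrary choices): the estimated mean number of visits should be close to $$1/(2p-1) = 2.5$$ here.

```python
# Count visits to state k = 2 for a simple random walk with p = 0.7
# started at 0; after ~800 steps the upward drift makes returns negligible.
import random

random.seed(5)
p, k, walks, steps = 0.7, 2, 4000, 800
total_visits = 0
for _ in range(walks):
    pos = 0
    for _ in range(steps):
        pos += 1 if random.random() < p else -1
        if pos == k:
            total_visits += 1             # count every time step spent at k
mean_visits = total_visits / walks        # ~ 1 / (2p - 1) = 2.5
```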

7.4 Let $$X_1, X_2, \dots, X_n$$ be exchangeable. Compute $$E[X_1|X_{(1)}, X_{(2)}, \dots, X_{(n)}]$$, where $$X_{(1)} \leq X_{(2)} \leq \dots \leq X_{(n)}$$ are the $$X_i$$ in ordered arrangement.

Since $$X_i$$ is exchangeable, $$X_1$$ can be any of $$X_{(i)}$$ with equal probability. Thus,
$$E[X_1|X_{(1)}, X_{(2)}, \dots, X_{(n)}] = \frac{1}{n}\sum_{i=1}^nX_{(i)}$$

7.6 An ordinary deck of cards is randomly shuffled and then the cards are exposed one at a time. At some time before all the cards have been exposed you must say “next”, and if the next card exposed is a spade then you win and if not then you lose. For any strategy, show that at the moment you call “next” the conditional probability that you win is equal to the conditional probability that the last card is spade. Conclude from this that the probability of winning is 1/4 for all strategies.

Let $$X_n$$ indicate whether the nth card is a spade and let $$Z_n$$ be the proportion of spades among the cards remaining after the nth card. Then $$E|Z_n| < \infty$$ and
$$E[Z_{n+1}|Z_1, \dots , Z_n] = \frac{(52 -n)Z_n - 1}{52 -n-1}Z_n + \frac{(52-n)Z_n}{52-n-1}(1 - Z_n) = Z_n$$
Hence $$Z_n$$ is a martingale.
Note that $$X_{52} = Z_{51}$$. Thus
\begin{align} E[X_{n+1}|X_1, \dots, X_n]&= E[X_{n+1}|Z_1, \dots, Z_n] = Z_n\\ &= E[Z_{51}|Z_1, \dots, Z_n] = E[X_{52}|X_1, \dots, X_n] \end{align}
Finally, let $$N$$ be the stopping time corresponding to saying “next” for a given strategy.
\begin{align} P\{\text{Win}\} &= E[X_{N+1}] = E[E[X_{N+1}|N]] \\ &= E[Z_N] = E[Z_1] = 1/4 \end{align}
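The strategy-independence is easy to see empirically. A sketch simulating one concrete (and arbitrary) rule: call "next" the first moment the remaining deck is more than one quarter spades, and if that never happens, call on the last card. The win rate should still be 1/4.

```python
# Simulate the "next card is a spade" game under a threshold strategy;
# by the martingale argument every strategy wins with probability 1/4.
import random

random.seed(11)
wins, trials = 0, 50000
for _ in range(trials):
    deck = [1] * 13 + [0] * 39            # 1 = spade, 0 = other suit
    random.shuffle(deck)
    spades, remaining = 13, 52
    for i in range(52):
        if spades / remaining > 0.25 or remaining == 1:
            wins += deck[i]               # we called "next" before card i
            break
        spades -= deck[i]                 # otherwise expose card i and go on
        remaining -= 1
win_rate = wins / trials                  # ~ 0.25
```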

7.7 Argue that the random walk for which $$X_i$$ only assumes the values $$0, \pm 1, \dots, \pm M$$ and $$E[X_i] = 0$$ is null recurrent.

7.8 Let $$S_n, n \geq 0$$ denote a random walk for which
$$\mu = E[S_{n+1} - S_n] \neq 0$$
Let, for $$A >0, B > 0$$,
$$N = min\{n: S_n \geq A \text{ or } S_n \leq -B\}$$
Show that $$E[N] < \infty$$. (Hint: Argue that there exists a value $$k$$ such that $$P\{S_k > A +B\} > 0$$. Then show that $$E[N] \leq kE[G]$$, where $$G$$ is an appropriately defined geometric random variable.)

Suppose $$\mu > 0$$, and let $$k > (A+B)/\mu$$, then
\begin{align} k\mu – A -B &= E[S_k – A – B] \\ &= E[S_k – A – B|S_k > A + B]P\{S_k > A+B\} \\&+ E[S_k – A – B|S_k \leq A + B]P\{S_k \leq A+B\}\\ &\leq E[S_k – A – B|S_k > A + B]P\{S_k > A+B\} \end{align} Thus, there exists $$k > (A+B)/\mu, p = P\{S_k > A +B\} > 0$$. Let $$Y_i = \sum_{j = ik+1}^{(i+1)k} X_j$$, then $$P\{Y_i > A + B\} = p$$. And it’s obviously that if any of $$Y_i$$ exceeds $$A+B$$, $$S_N$$ occurs. Hence, $$E[N] \leq k/p$$

7.10 In the insurance ruin problem of Section 7.4 explain why the company will eventually be ruined with probability 1 if $$E[Y] \geq cE[X]$$.

7.11 In the ruin problem of Section 7.4 let $$F$$ denote the interarrival distribution of claims and let $$G$$ be the distribution of the size of a claim. Show that $$p(A)$$, the probability that a company starting with $$A$$ units of assets is ever ruined, satisfies
$$p(A) = \int_0^{\infty}\int_0^{A + ct}p(A + ct -x)dG(x)dF(t) + \int_0^{\infty}\bar{G}(A+ct)dF(t)$$

Condition on the first claim, then
\begin{align} p(A) &= P\{\text{ruined at first claim}\} + P\{\text{ruined after first claim}\} \\ &= \int_0^{\infty}\bar{G}(A+ct)dF(t) + \int_0^{\infty}\int_0^{A + ct}p(A + ct -x)dG(x)dF(t) \end{align}

7.12 For a random walk with $$\mu = E[X] > 0$$ argue that, with probability 1,
$$\frac{u(t)}{t} \to \frac{1}{\mu} \quad \text{as } t \to \infty$$
where $$u(t)$$ equals the number of $$n$$ for which $$0 \leq S_n \leq t$$.

7.13 Let $$S_n = \sum_{i=1}^n X_i$$ be a random walk and let $$\lambda_i, i > 0$$, denote the probability that a ladder height equals $$i$$ — that is, $$\lambda_i = P\{\text{first positive value of } S_n \text{ equals } i\}$$.
(a) Show that if
$$P\{X_i = j\} = \left\{\begin{array}{ll} q \quad j = -1 \\ \alpha_j \quad j \geq 1 \\ \end{array}\right. \\ q + \sum_{j=1}^{\infty} \alpha_j = 1$$
then $$\lambda_i$$ satisfies
$$\lambda_i = \alpha_i + q(\lambda_{i+1} + \lambda_1\lambda_i) \quad i > 0$$
(b) If $$P\{X_i = j\} = 1/5, j = -2,-1,0,1,2$$, show that
$$\lambda_1 = \frac{1+\sqrt{5}}{3+\sqrt{5}} \quad \lambda_2 = \frac{2}{3+\sqrt{5}}$$
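Part (b) can be sanity-checked by simulation (a sketch; the 5000-step cap is an arbitrary cutoff that censors a small fraction of walks). Steps uniform on $$\{-2,-1,0,1,2\}$$ have mean 0, so the walk reaches a positive value with probability 1, and the first positive value is 1 or 2 with $$\lambda_1 = (1+\sqrt{5})/(3+\sqrt{5}) \approx 0.618$$.

```python
# Estimate the ladder-height probability lambda_1 for steps uniform on
# {-2, -1, 0, 1, 2}: run each walk until it first becomes positive.
import random

random.seed(13)
ones = done = 0
for _ in range(10000):
    pos, steps = 0, 0
    while pos <= 0 and steps < 5000:      # cap: a few slow walks are dropped
        pos += random.randint(-2, 2)
        steps += 1
    if pos > 0:
        done += 1
        if pos == 1:                      # first positive value is 1 or 2
            ones += 1
lambda1_est = ones / done                 # ~ 0.618
```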

7.14 Let $$S_n, n\geq 0$$, denote a random walk in which $$X_i$$ has distribution $$F$$. Let $$G(t,s)$$ denote the probability that the first value of $$S_n$$ that exceeds $$t$$ is less than or equal to $$t+s$$. That is,
$$G(t,s) = P\{\text{first sum exceeding } t \text{ is } \leq t+s\}$$
Show that
$$G(t, s) = F(t + s) – F(t) + \int_{-\infty}^t G(t-y, s)dF(y)$$

$$S_n|X_1$$ is distributed as $$X_1 + S_{n-1}$$. Thus if $$A$$={first sum exceeding $$t$$ is $$\leq t + s$$},
\begin{align} G(t,s) &\equiv P\{A\} = E[P\{A|X_1\}] \\ &= F(t+s) – F(t) + \int_{-\infty}^t G(t-y, s)dF(y) \end{align}

Solutions to Stochastic Processes Ch.6


6.1 If $$\{Z_n, n \geq 1 \}$$ is a martingale show that, for $$1 \leq k < n,$$
$$E[Z_n|Z_1, \dots, Z_k] = Z_k$$

\begin{align} E[Z_n|Z_1, \dots, Z_k] &= E[E[Z_n|Z_1, \dots, Z_{n-1}]|Z_1, \dots, Z_k]\\ &= E[Z_{n-1}|Z_1, \dots, Z_k]\\ &\dots\\ &=E[Z_{k+1}|Z_1, \dots, Z_k] = Z_k \end{align}

6.3 Verify that $$X_n/m^n, n\geq 1$$, is a martingale when $$X_n$$ is the size of the nth generation of a branching process whose mean number of offspring per individual is $$m$$

\begin{align} E[\frac{X_{n+1}}{m^{n+1}}|\frac{X_1}{m}, \dots, \frac{X_n}{m^n}]&=E[\frac{\sum_{i=1}^{X_n}Z_i}{m^{n+1}}|\frac{X_1}{m}, \dots, \frac{X_n}{m^n}] \\ &= E[\frac{mX_n}{m^{n+1}}|\frac{X_1}{m}, \dots, \frac{X_n}{m^n}] \\ &= \frac{X_n}{m^n} \end{align}

6.4 Consider the Markov chain which at each transition either goes up 1 with probability $$p$$ or down with probability $$q = 1 – p$$. Argue that $$(q/p)^{S_n}, n \geq 1$$, is a martingale.

\begin{align} E[(q/p)^{S_{n+1}}|(q/p)^{S_1}, \dots, (q/p)^{S_n}]&= E[(q/p)^{S_n}(q/p)^{X_{n+1}}|(q/p)^{S_1}, \dots, (q/p)^{S_n}] \\ &= (q/p)^{S_n} E[(q/p)^{X_{n+1}}] \\ &= (q/p)^{S_n} \end{align}

6.5 Consider a Markov chain $$\{X_n, n \geq 0\}$$ with $$p_{NN} = 1$$. Let $$P(i)$$ denote the probability that this chain eventually enters state $$N$$ given that it starts in state $$i$$. Show that $$\{P(X_n), n \geq 0\}$$ is a martingale.

\begin{align} E[P(X_{n+1})|P(X_0), \dots, P(X_n)]&= \sum_k P_{X_n, k}P(k) \\ &=P(X_n) \end{align}

6.6 Let $$X(n)$$ denote the size of the nth generation of a branching process, and let $$\pi_0$$ denote the probability that such a process, starting with a single individual, eventually goes extinct. Show that $$\{\pi_0^{X_n}, n \geq 0\}$$ is a martingale.

\begin{align} E[\pi_0^{X_{n+1}}|\pi_0^{X_1}, \dots, \pi_0^{X_n}]&= \prod_{i=1}^{X_n}E[\pi_0^{Z_i}|\pi_0^{X_1}, \dots, \pi_0^{X_n}] \\ &=\pi_0^{X_n} \end{align}
since $$E[\pi_0^{Z}] = \phi(\pi_0) = \pi_0$$, the extinction probability being a fixed point of the offspring generating function $$\phi$$.

6.9 A process $$\{Z_n, n \geq 1\}$$ is said to be a reverse, or backwards, martingale if $$E|Z_n| < \infty$$ for all $$n$$ and
$$E[Z_n|Z_{n+1}, Z_{n+2}, \dots] = Z_{n+1}$$
Show that if $$X_i, i \geq 1$$, are independent and identically distributed random variables with finite expectation, then $$Z_n = (X_1 + \dots + X_n)/n, n \geq 1$$, is a reverse martingale.

\begin{align} &E[Z_n|Z_{n+1}, Z_{n+2}, \dots]\\ =& E[\frac{(n+1)Z_{n+1} - X_{n+1}}{n}|Z_{n+1}, Z_{n+2}, \dots ] \\ =& E[Z_{n+1}|Z_{n+1}, Z_{n+2}, \dots] + \frac{1}{n}E[\frac{\sum_{i=1}^{n+1}X_i}{n+1} - X_{n+1}| Z_{n+1}, Z_{n+2}, \dots] \\ =& Z_{n+1} \end{align}
where the last step uses exchangeability: $$E[X_{n+1}|Z_{n+1}, Z_{n+2}, \dots] = Z_{n+1}$$, so the second term vanishes.

6.10 Consider successive flips of a coin having probability $$p$$ of landing heads. Use a martingale argument to compute the expected number of flips until the following sequences appear:
(a) HHTTHHT
(b) HTHTHTH

(a) Imagine a fresh gambler arriving before each flip and betting fairly on the successive letters of HHTTHHT. At the time $$N$$ when the pattern is completed, the only gamblers with nonzero fortunes are the one who bet on the entire pattern, holding $$p^{-4}q^{-3}$$, and the one whose flips so far, HHT, match the tail of the pattern, holding $$p^{-2}q^{-1}$$. Since the total stake paid in is $$N$$ and each bet is fair, the martingale stopping theorem gives
$$E[N] = p^{-4}q^{-3} + p^{-2}q^{-1}$$
(b) $$E[N] = p^{-4}q^{-3} + p^{-3}q^{-2} + p^{-2}q^{-1} + p^{-1}$$
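For a fair coin these formulas are easy to test by simulation (a sketch; p = q = 1/2 and the trial count are arbitrary choices): $$E[N] = 2^7 + 2^3 = 136$$ for HHTTHHT and $$2^7 + 2^5 + 2^3 + 2 = 170$$ for HTHTHTH.

```python
# Estimate mean waiting times until each pattern first appears in fair
# coin flips; overlaps make HTHTHTH slower than HHTTHHT.
import random

random.seed(17)

def mean_wait(pattern, trials=8000):
    total = 0
    for _ in range(trials):
        recent, n = "", 0
        while recent != pattern:
            recent = (recent + random.choice("HT"))[-len(pattern):]
            n += 1
        total += n
    return total / trials

m1 = mean_wait("HHTTHHT")    # ~ 136
m2 = mean_wait("HTHTHTH")    # ~ 170
```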

6.11 Consider a gambler who at each gamble is equally likely to either win or lose 1 unit. Suppose the gambler will quit playing when his winnings are either $$A$$ or $$-B, A > 0, B > 0$$. Use an appropriate martingale to show that the expected number of bets is $$AB$$.

Similar to Example 6.2(C): imagine two players holding $$A$$ and $$B$$ coins, and each time the gambler wins or loses a bet, one coin passes between them. A player is wiped out exactly when the gambler's winnings reach $$A$$ or $$-B$$, and the argument of that example gives an expected duration of $$AB$$ bets.

6.12 In Example 6.2(C), find the expected number of stages until one of the players is eliminated.

It is easy to check that $$M_n = X_nY_nZ_n + ns/3$$ is a martingale. At time $$T$$ one player has been eliminated, so $$X_TY_TZ_T = 0$$ and $$M_T = Ts/3$$. By the martingale stopping theorem,
$$\frac{s}{3}E[T] = E[M_T] = E[M_0] = xyz$$
hence $$E[T] = 3xyz/s$$.

6.13 Let $$Z_n = \prod_{i=1}^n X_i$$, where $$X_i, i \geq 1$$, are independent random variables with $$P\{X_i = 2\} = P\{X_i = 0\} = 1/2$$.
Let $$N = \min\{n: Z_n = 0\}$$. Is the martingale stopping theorem applicable? If so, what would you conclude? If not, why not?

Not applicable: $$N < \infty$$ with probability 1 and $$Z_N = 0$$, so $$E[Z_N] = 0 \neq 1 = E[Z_1]$$. The hypotheses of the stopping theorem fail; the stopped martingale is not uniformly integrable, since $$Z_n = 2^n$$ on $$\{N > n\}$$.

6.14 Show that the equation
$$e^{\beta} - e^{-\beta} = 2\beta e^{\beta^2/2}$$
has no solution when $$\beta \neq 0$$.
(Hint: Expand in a power series.)

\begin{align} e^{\beta} - e^{-\beta} - 2\beta e^{\beta^2/2} = 2\sum_{i=0}^{\infty} \beta^{2i+1}\left(\frac{1}{(2i+1)!} - \frac{1}{i!\,2^i}\right) \end{align}
The $$i = 0$$ coefficient vanishes, and for $$i \geq 1$$ we have $$(2i+1)! > i!\,2^i$$, so every remaining coefficient is strictly negative. Hence the expression equals zero iff $$\beta = 0$$.

6.15 Let $$X$$ denote the number of heads in $$n$$ independent flips of a fair coin. Show that:
(a) $$P\{X - n/2 \geq a\} \leq \exp\{-2a^2/n\}$$
(b) $$P\{X - n/2 \leq -a\} \leq \exp\{-2a^2/n\}$$

$$Z_i = X_i - i/2$$, where $$X_i$$ is the number of heads in the first $$i$$ flips, is a martingale with mean 0, and $$-1/2 \leq Z_i - Z_{i-1} \leq 1/2$$. From Azuma's inequality we get the result.
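The Azuma bound can be compared against the exact binomial tail (a deterministic sketch; n = 100, a = 10 are arbitrary choices):

```python
# Exact tail P{X - n/2 >= a} for X ~ Binomial(n, 1/2) versus the Azuma
# bound exp(-2 a^2 / n); the bound holds but is not tight.
import math

n, a = 100, 10
exact = sum(math.comb(n, k) for k in range(n // 2 + a, n + 1)) / 2 ** n
bound = math.exp(-2 * a * a / n)          # e^{-2} ~= 0.1353
```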

6.16 Let $$X$$ denote the number of successes in $$n$$ independent Bernoulli trials, with trial $$i$$ resulting in a success with probability $$p_i$$. Give an upper bound for $$P\{|X - \sum_{i=1}^n p_i| \geq a\}$$.

$$Z_i = X_i - \sum_{k=1}^ip_k$$, where $$X_i$$ is the number of successes in the first $$i$$ trials, is a martingale with mean 0, and $$-p_i \leq Z_i - Z_{i-1} \leq 1 - p_i$$, an interval of length 1. From Azuma's inequality,
$$P\{X - \sum_{i=1}^n p_i \geq a\} \leq \exp\{-2a^2/n\}, \qquad P\{X - \sum_{i=1}^n p_i \leq -a\} \leq \exp\{-2a^2/n\}$$
so $$P\{|X - \sum_{i=1}^n p_i| \geq a\} \leq 2\exp\{-2a^2/n\}$$.

6.17 Suppose that 100 balls are to be randomly distributed among 20 urns. Let $$X$$ denote the number of urns that contains at least five balls. Derive an upper bound for $$P\{X \geq 15\}$$.

$$E[X] = 20[1 – \sum_{k=0}^4{100 \choose k}(\frac{1}{20})^k(\frac{19}{20})^{100-k}]$$
Viewing $$X = h(\text{ball placements})$$, if two placements differ in the position of at most one ball then $$|h(x) - h(y)| \leq 1$$. Thus, by Azuma's inequality,
$$P\{X - E[X] \geq a\} \leq \exp\{-a^2/200\}$$
Setting $$a = 15 - E[X]$$ gives the result.
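The numbers can be evaluated directly (a deterministic sketch): compute $$E[X]$$ exactly from the binomial expression, then the resulting bound on $$P\{X \geq 15\}$$.

```python
# Evaluate E[X] = 20 * P{Binomial(100, 1/20) >= 5} exactly, then the
# Azuma bound exp(-a^2/200) with a = 15 - E[X].
import math

p = 1 / 20
tail = 1 - sum(math.comb(100, k) * p ** k * (1 - p) ** (100 - k) for k in range(5))
EX = 20 * tail                            # expected number of urns with >= 5 balls
a = 15 - EX
bound = math.exp(-a * a / 200)            # P{X >= 15} <= bound
```

The bound is close to 1 here, which illustrates that Azuma's inequality is weak at this scale rather than that the event is likely.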

6.18 Let $$p$$ denote the probability that a random selection of 88 people will contain at least three with the same birthday. Use Azuma’s inequality to obtain an upper bound on $$p$$. (It can be shown that $$p \approx 0.50$$).

As in Problem 6.17,
$$p \leq \exp\{-a^2/176\}, \qquad a = 1 - E[X], \qquad E[X] = 365\left[1 - \sum_{i=0}^2{88 \choose i}\left(\frac{1}{365}\right)^i\left(\frac{364}{365}\right)^{88-i}\right]$$
where $$X$$ is the number of days that are the birthday of at least three people.

6.19 For binary n-vectors $$\boldsymbol x$$ and $$\boldsymbol y$$ (meaning that each coordinate of these vectors is either 0 or 1) define the distance between them by
$$\rho(\boldsymbol x, \boldsymbol y) = \sum_{i=1}^n |x_i – y_i|$$
(This is called the Hamming distance). Let $$A$$ be a finite set of such vectors, and let $$X_1, \dots, X_n$$ be independent random variables that are each equally likely to be either 0 or 1. Set $$D = min_{y \in A} \rho(\boldsymbol X, \boldsymbol y)$$
and let $$\mu = E[D]$$. In terms of $$\mu$$, find an upper bound for $$P\{D \geq b\}$$ when $$b > \mu$$

It is easy to see that $$|h(x) - h(y)| \leq 1$$ when $$x$$ and $$y$$ differ in a single coordinate, thus
$$P\{D - E[D] \geq a\} \leq \exp\{-a^2/2n\}$$
Setting $$a + E[D] = b$$ gives
$$P\{D \geq b\} \leq \exp\{-(b - \mu)^2/2n\}$$

6.21 A group of $$2n$$ people, consisting of $$n$$ men and $$n$$ women, are to be independently distributed among $$m$$ rooms. Each woman chooses room $$j$$ with probability $$p_j$$ while each man chooses it with probability $$q_j, j = 1, \dots, m$$. Let $$X$$ denote the number of rooms that will contain exactly one man and one woman.
(a) Find $$\mu = E[X]$$
(b) Bound $$P\{|X – \mu| > b\}$$ for $$b > 0$$

(a) $$E[X] = \sum_{i=1}^m{n \choose 1}p_i(1-p_i)^{n-1}{n \choose 1}q_i(1-q_i)^{n-1}$$
(b) Changing any one person's room choice changes $$X$$ by at most 2, so applying Azuma's inequality to the $$2n$$ successive choices, $$P\{|X - \mu| > b\} \leq 2\exp\{-b^2/16n\}$$

6.22 Let $$\{X_n, n \geq 0\}$$ be a Markov process for which $$X_0$$ is uniform on (0, 1) and, conditional on $$X_n$$,
$$X_{n+1} = \left\{\begin{array}{ll} \alpha X_n + 1 – \alpha \quad \text{with probability } X_n \\ \alpha X_n \quad \text{with probability } 1 – X_n \\ \end{array}\right.$$
where $$0 < \alpha < 1$$. Discuss the limiting properties of the sequence $$X_n, n \geq 1$$.

$$E[X_{n+1}|X_n] = X_n(\alpha X_n + 1 - \alpha) + (1 - X_n)\alpha X_n = X_n$$
Thus $$X_n$$ is a martingale, and since $$0 \leq X_n \leq 1$$ it is bounded, so $$X_n$$ converges with probability 1. Moreover $$X_{n+1} - X_n$$ equals $$(1-\alpha)(1 - X_n)$$ with probability $$X_n$$ and $$-(1-\alpha)X_n$$ with probability $$1 - X_n$$; the realized jumps can tend to 0 only if $$X_n$$ tends to 0 or 1. Hence $$X_n$$ converges almost surely to a Bernoulli limit $$X_\infty$$ with $$P\{X_\infty = 1\} = E[X_0] = 1/2$$.

6.23 An urn initially contains one white and one black ball. At each stage a ball is drawn and is then replaced in the urn along with another ball of the same color. Let $$Z_n$$ denote the fraction of balls in the urn that are white after the nth replication.
(a) Show that $$\{Z_n, n \geq 1\}$$ is a martingale.
(b) Show that the probability that the fraction of white balls in the urn is ever as large as 3/4 is at most 2/3.

(a) \begin{align} &E[Z_{n+1}|Z_1, \dots, Z_n] \\ =&Z_n\frac{(n+2)Z_n + 1}{n+3} + (1-Z_n)\frac{(n+2)Z_n}{n+3}\\ =&Z_n\end{align}
(b) By the maximal inequality for nonnegative martingales, $$P\{\max(Z_1, \dots, Z_n) \geq 3/4\} \leq \frac{E[Z_n]}{3/4} = \frac{4E[Z_1]}{3} = \frac{2}{3}$$ for every $$n$$; letting $$n \to \infty$$ gives the bound.
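Both parts can be illustrated by simulating the urn (a sketch; 800 draws stand in for "ever", and trial counts are arbitrary): the white fraction should average 1/2, and the chance of ever reaching 3/4 should respect the bound 2/3.

```python
# Polya urn starting with one white and one black ball: track the white
# fraction, its mean (martingale, = 1/2), and whether it ever hits 3/4.
import random

random.seed(19)
hits, trials, steps = 0, 4000, 800
final_sum = 0.0
for _ in range(trials):
    white, total = 1, 2
    hit = False
    for _ in range(steps):
        if random.random() < white / total:
            white += 1                    # drew white: add another white
        total += 1                        # either way the urn grows by one
        if white / total >= 0.75:
            hit = True
    hits += hit
    final_sum += white / total
hit_rate = hits / trials                  # should be <= 2/3
mean_frac = final_sum / trials            # ~ 1/2 (martingale mean)
```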

6.24 Consider a sequence of independent tosses of a coin and let $$P\{head\}$$ be the probability of a head on any toss. Let $$A$$ be the hypothesis that $$P\{head\} = a$$ and let $$B$$ be the hypothesis that $$P\{head\} = b, 0 < a,b < 1$$. Let $$X_i$$ denote the outcome of the ith toss and let
$$Z_n = \frac{P\{X_1, \dots, X_n|A\}}{P\{X_1, \dots, X_n|B\}}$$
Show that if $$B$$ is true, then:
(a) $$Z_n$$ is a martingale, and
(b) $$\lim_{n \to \infty} Z_n$$ exists with probability 1.
(c) If $$b \neq a$$, what is $$\lim_{n \to \infty} Z_n$$?

(a) Let $$X_i=1$$ if the outcome of the ith toss is head, $$X_i = 0$$ if it is tail. From independence,
$$Z_n = \frac{P\{X_1, \dots, X_n|A\}}{P\{X_1, \dots, X_n|B\}} = Z_{n-1}\frac{P\{X_n|A\}}{P\{X_n|B\}}$$ Now,
$$E[\frac{P\{X_n|A\}}{P\{X_n|B\}}] = E[\frac{a^{X_n}(1-a)^{1-X_n}}{b^{X_n}(1-b)^{1-X_n}}] = \frac{a}{b}b + \frac{1-a}{1-b}(1-b) = 1$$
(b) Since $$Z_n$$ is a nonnegative martingale, from Corollary 6.4.7, the limit exists.
(c) From (a), it is clear that $$Z_n$$ can have a finite (random or constant) non-zero limit only if
$$\lim_{n \to \infty}\frac{P\{X_n|A\}}{P\{X_n|B\}} = 1$$
with probability 1. However, for $$a \neq b$$ this is impossible. Thus the limit is 0.

6.25 Let $$Z_n, n \geq 1$$, be a sequence of random variables such that $$Z_1 \equiv 1$$ and given $$Z_1, \dots, Z_{n-1}, Z_n$$ is a Poisson random variable with mean $$Z_{n-1}, n > 1$$. What can we say about $$Z_n$$ for $$n$$ large?

$$Z_n$$ is a nonnegative martingale with $$E[Z_n] = E[Z_1] = 1$$, so $$Z_n$$ converges with probability 1 to some limit $$Z_\infty$$. Given $$Z_{n-1} = m$$, $$Z_n$$ is Poisson with mean $$m$$ and hence has standard deviation $$\sqrt{m}$$; the path can settle down only where this fluctuation vanishes, so $$Z_\infty = 0$$ with probability 1 (even though $$E[Z_n] = 1$$ for all $$n$$; the convergence is not in $$L^1$$).

6.26 Let $$X_1, X_2, \dots$$ be independent and such that
$$P\{X_i = -1\} = 1 - 1/2^i\\ P\{X_i = 2^i - 1\} = 1/2^i, \quad i \geq 1$$
Use this sequence to construct a zero mean martingale $$Z_n$$ such that $$\lim_{n \to \infty}Z_n = -\infty$$ with probability 1. (Hint: Make use of the Borel-Cantelli lemma)

It is easy to see that $$Z_n = \sum_{i=1}^nX_i$$ is a martingale with mean 0. And
$$\sum_{i=1}^{\infty} P\{X_i = 2^i - 1\} = \sum_{i=1}^{\infty} 2^{-i} = 1 < \infty$$
From the Borel-Cantelli lemma,
$$P\{X_i = 2^i - 1 \text{ for infinitely many } i\} = 0$$
so with probability 1, $$X_i = -1$$ for all sufficiently large $$i$$. Thus $$\lim_{n \to \infty}Z_n = -\infty$$ with probability 1.

A continuous-time process $$\{X(t), t \geq 0\}$$ is said to be a martingale if $$E[|X(t)|] < \infty$$ for all $$t$$ and, for all $$s < t$$,
$$E[X(t)|X(u), 0 \leq u \leq s] = X(s)$$

6.27 Let $$\{X(t), t \geq 0\}$$ be a continuous-time Markov chain with infinitesimal transition rates $$q_{ij}, i \neq j$$. Give the conditions on the $$q_{ij}$$ that result in $$\{X(t), t \geq 0\}$$ being a continuous-time martingale.

Do Problems 6.28-6.30 under the assumptions that (a) the continuous-time analogue of the martingale stopping theorem is valid, and (b) any needed regularity conditions are satisfied.

6.28 Let $$\{X(t), t \geq 0\}$$ be a nonhomogeneous Poisson process with intensity function $$\lambda(t), t \geq 0$$. Let $$T$$ denote the time at which the nth event occurs. Show that
$$n = E[\int_0^T \lambda(t)dt]$$

6.29 Let $$\{X(t), t \geq 0\}$$ be a continuous-time Markov chain that will, in finite expected time, enter an absorbing state $$N$$. Suppose that $$X(0) = 0$$ and let $$m_i$$ denote the expected time the chain is in state $$i$$. Show that for $$j \neq 0, j\neq N$$:
(a) $$E[$$number of times the chain leaves state $$j]=v_j m_j$$, where $$1/v_j$$ is the mean time the chain spends in $$j$$ during a visit.
(b) $$E[$$number of times it enters state $$j] = \sum_{i \neq j}m_i q_{ij}$$.
(c) Argue that
$$v_jm_j = \sum_{i \neq j}m_iq_{ij}, \quad j \neq 0 \\ v_0m_0 = 1 + \sum_{i \neq 0}m_iq_{i0}$$

6.30 Let $$\{X(t), t \geq 0\}$$ be a compound Poisson process with Poisson rate $$\lambda$$ and component distribution $$F$$. Define a continuous-time martingale related to this process.