### Econometrics of Financial High-Frequency Data: Theory and Empirics (Part 1)

References:
Box, G. E., Jenkins, G. M., Reinsel, G. C., & Ljung, G. M. (2015). Time series analysis: forecasting and control. John Wiley & Sons.
Engle, R. F. (2000). The econometrics of ultra-high-frequency data. Econometrica, 68(1), 1-22.
Hamilton, J. D. (1994). Time series analysis. Princeton University Press.
Hautsch, N. (2011). Econometrics of financial high-frequency data. Springer Science & Business Media.

### Errata for the Chinese Edition of *Stochastic Processes*

p9, Example 1.3(C): a random variable takes at least one value as large as its mean

p30, Problem 1.17: among the $$i$$th smallest of ...

p31, Problem 1.22: $$Var(X) = E[Var(X|Y)] + Var(E[X|Y])$$

p31, Problem 1.23: let $$a$$ denote the particle ...

p33, Problem 1.43: this problem is missing the condition $$t \geq 0$$; neither this book nor the original states it, but for $$t < 0$$ it is easy to construct a counterexample to the inequality

p40, last paragraph: ... the $$k$$th smallest value among ...

p80, Example 3.5(C), line 5: until two tails appear in the tosses

p82, line 4: where the initial distribution is the distribution of $$Y_D(t)$$

p109, 4th line from the bottom: letting $$n$$ tend to 0 and then letting $$M$$ tend to $$\infty$$ yields ...

p117, 9th line from the bottom: and $$N$$ is a ... stopping time

p128, line 7: then summing over $$j$$ yields

p130, line 2: the probability of moving to its leaf

p131, Theorem 4.7.2, line 2: the superfluous $$i_1, i_2$$ here should be deleted

p146, Example 5.3(A): each individual in the population is assumed to give birth at exponential rate $$\lambda$$

p156, Section 5.5, line 4: the limiting probabilities are $$P_j = \lim_{t \to \infty}P_{ij}^t$$

p185, further examples of martingales (4): then, as shown in Section 1.9

p192, Example 6.3(B): ends its empty state

p215, line 4: $$P\{$$ever crosses $$A\} \leq e^{-\theta A}$$

p215, 8th line from the bottom: $$X_{n+1} + \sum_{i=1}^{n-1}(Y_i - X_{i+1})$$

p217, line 5: $$S_n = \sum_{i=1}^{n}(Y_i - cX_i)$$

p223, line 18: independent of all values of the process before time $$t$$

p302, answer to 3.17, line 3: $$g = h + h * F = (h + g*F)*F_2$$

p305: the answer given for 4.13 should be the answer to Problem 3.33

p305, answer to 4.13, line 5: $$\lim_{k \to \infty}\frac{\text{number of visits to }j\text{ by time }N_k+m}{n}\frac{n}{N_k + m}$$

$$\lim_{n \to \infty}\frac{\text{number of visits to } j\text{ by time }N_n + m}{n}\frac{n}{N_n + m}$$

p308, answer to 5.3: $$P\{N(t) \geq n\} \leq \sum_{j=n}^{\infty}e^{-Mt}\frac{(Mt)^j}{j!}$$

p309, answer to 5.4, last line: $$P_{ij}(t) = v_iP_{ij}t + o(t)$$

### Solutions to Stochastic Processes Ch.8

Solutions to *Stochastic Processes*, Sheldon M. Ross, Second Edition (pdf)
Since there is no official solution manual for this book, I wrote the solutions myself. Some solutions were adapted from the web, and their original authors often cannot be credited explicitly. Many thanks to those authors! I hope these solutions are helpful, but no correctness or accuracy is guaranteed. Comments are welcome. Excerpts and links may be used, provided that full and clear credit is given.

In Problem 8.1, 8.2 and 8.3, let $$\{X(t), t \geq 0\}$$ denote a Brownian motion process.

8.1 Let $$Y(t) = tX(1/t)$$.
(a) What is the distribution of $$Y(t)$$?
(b) Compute $$Cov(Y(s), Y(t))$$.
(c) Argue that $$\{Y(t), t \geq 0\}$$ is also Brownian motion.
(d) Let $$T = \inf\{t>0: X(t)=0\}$$. Using (c) present an argument that $$P\{T = 0\} = 1$$.

(a) $$X(1/t) \sim N(0, 1/t)$$, so $$Y(t) = tX(1/t) \sim N(0, t)$$.
(b) \begin{align} Cov(Y(s), Y(t)) &= Cov(sX(1/s), tX(1/t)) \\ &= st\cdot Cov(X(1/s), X(1/t)) \\ &= st\cdot \min(1/s, 1/t) = \min(s, t) \end{align}
(c) Since $$\{X(t)\}$$ is a Gaussian process, so is $$\{Y(t)\}$$. Together with parts (a) and (b), $$\{Y(t)\}$$ has the mean and covariance functions of Brownian motion, so it is one.
(d) Since $$\{Y(t)\}$$ is Brownian motion, $$T_1 \equiv \sup\{t: Y(t) = 0\} = \infty$$ with probability 1, by recurrence. Because $$Y(t) = tX(1/t)$$, zeros of $$Y$$ at arbitrarily large times correspond to zeros of $$X$$ at times arbitrarily close to 0, so $$\{T = 0\} = \{T_1 = \infty\}$$. Thus $$P\{T = 0\} = 1$$.
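The covariance in part (b) is easy to check numerically. A minimal Monte Carlo sketch (the parameter choices $$s = 2, t = 3$$ are mine, not from the text):

```python
import numpy as np

# Monte Carlo check of part (b): Y(t) = t * X(1/t) should satisfy
# Cov(Y(s), Y(t)) = min(s, t).  Parameters s = 2, t = 3 are arbitrary.
rng = np.random.default_rng(0)
n_paths = 200_000
s, t = 2.0, 3.0

# Sample (X(1/t), X(1/s)) jointly: since 1/t < 1/s, write X(1/s) as
# X(1/t) plus an independent increment of variance 1/s - 1/t.
x_inv_t = rng.normal(0.0, np.sqrt(1.0 / t), n_paths)
x_inv_s = x_inv_t + rng.normal(0.0, np.sqrt(1.0 / s - 1.0 / t), n_paths)

y_s, y_t = s * x_inv_s, t * x_inv_t
cov_est = float(np.mean(y_s * y_t))   # both coordinates have mean 0
var_est = float(np.var(y_t))          # should be close to t
```

With these parameters `cov_est` should be close to $$\min(s, t) = 2$$ and `var_est` close to $$t = 3$$, matching parts (a) and (b).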

8.2 Let $$W(t) = X(a^2t)/a$$ for $$a > 0$$. Verify that $$W(t)$$ is also Brownian motion.

$$W(0) = X(0)/a = 0$$. Non-overlapping increments of $$W(t)$$ map to non-overlapping increments of $$X(t)$$. Thus increments of $$W(t)$$ are independent. Further, for $$s < t$$,
$$W(t) - W(s) = \frac{X(a^2t) - X(a^2s)}{a} \sim N(0, t-s)$$
Thus $$W(t)$$ has stationary increments with required distribution. Therefore, $$W(t)$$ is a Brownian motion.

8.5 A stochastic process $$\{X(t), t \geq 0\}$$ is said to be stationary if $$X(t_1), \dots, X(t_n)$$ has the same joint distribution as $$X(t_1+a), \dots, X(t_n +a)$$ for all $$n, a, t_1, \dots, t_n$$.
(a) Prove that a necessary and sufficient condition for a Gaussian process to be stationary is that $$Cov(X(s), X(t))$$ depends only on $$t-s, s \leq t$$, and $$E[X(t)] = c$$.
(b) Let $$\{X(t), t \geq 0\}$$ be Brownian motion and define
$$V(t) = e^{-\alpha t/2}X(\alpha e^{\alpha t})$$
Show that $$\{V(t), t \geq 0\}$$ is a stationary Gaussian process. It is called Ornstein-Uhlenbeck process.

(a) If the Gaussian process is stationary then for $$t > s, (X(t), X(s))$$ and $$(X(t-s), X(0))$$ have the same distribution. Thus, $$E[X(s)] = E[X(0)]$$ for all $$s$$ and $$Cov(X(t), X(s)) = Cov(X(t-s), X(0))$$ for all $$t > s$$. Now, assume $$E[X(t)] = c$$ and $$Cov(X(t), X(s)) = h(t-s)$$. For any $$T = (t_1, \dots, t_k)$$ define vector $$X_T \equiv (X(t_1), \dots, X(t_k))$$. Let $$\tilde{T} = (t_1-a, \dots, t_k -a)$$. If $$\{X(t)\}$$ is a Gaussian process then both $$X_T$$ and $$X_{\tilde{T}}$$ are multivariate normal and it suffices to show that they have the same mean and covariance. This follows directly from the fact that they have the same element-wise mean $$c$$ and equal pair-wise covariances, $$Cov(X(t_i-a), X(t_j -a)) = h(t_i-t_j) = Cov(X(t_i), X(t_j))$$
(b) Since all finite-dimensional distributions of $$\{V(t)\}$$ are normal, it is a Gaussian process. Thus from part (a) it suffices to show the following:
(i) $$E[V(t)] = e^{-\alpha t/2}E[X(\alpha e^{\alpha t})] = 0$$. Thus $$E[V(t)]$$ is constant.
(ii) For $$s \leq t$$, \begin{align} Cov(V(s), V(t)) &= e^{-\alpha(t+s)/2}Cov(X(\alpha e^{\alpha s}), X(\alpha e^{\alpha t}))\\ &= e^{-\alpha(t+s)/2}\alpha e^{\alpha s} = \alpha e^{-\alpha(t-s)/2} \end{align}
which depends only on $$t-s$$.
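The stationary covariance in (ii) can likewise be checked by simulation. A sketch with assumed values $$\alpha = 0.5, s = 1, t = 2$$ (my choices):

```python
import numpy as np

# Monte Carlo check of part (b): with V(t) = exp(-a*t/2) * X(a*exp(a*t)),
# Cov(V(s), V(t)) should equal a * exp(-a*(t-s)/2).  The values of a, s, t
# below are arbitrary choices for the check.
rng = np.random.default_rng(1)
n_paths = 200_000
a, s, t = 0.5, 1.0, 2.0

u1, u2 = a * np.exp(a * s), a * np.exp(a * t)   # time indices, u1 < u2
x1 = rng.normal(0.0, np.sqrt(u1), n_paths)
x2 = x1 + rng.normal(0.0, np.sqrt(u2 - u1), n_paths)  # add independent increment

v_s, v_t = np.exp(-a * s / 2) * x1, np.exp(-a * t / 2) * x2
cov_est = float(np.mean(v_s * v_t))
cov_target = float(a * np.exp(-a * (t - s) / 2))
```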

8.8 Suppose $$X(1) = B$$. Characterize, in the manner of Proposition 8.1.1, $$\{X(t), 0 \leq t \leq 1\}$$ given that $$X(1) = B$$.

Conditional on $$X(1) = B$$, $$X(t) \sim N(Bt, t(1-t))$$. Hence $$Z(t) \equiv X(t) - Bt$$ satisfies $$Z(t) \sim N(0, t(1-t))$$; that is, $$\{Z(t), 0 \leq t \leq 1\}$$ is a Brownian bridge and $$X(t) = Z(t) + Bt$$.

8.9 Let $$M(t) = \max_{0 \leq s \leq t} X(s)$$ and show that
$$P\{M(t) > a| M(t) = X(t)\} = e^{-a^2/2t}, \quad a > 0$$

From Section 8.3.1, we get
$$P\{M(t) > y, X(t) < x\} = \int_{2y-x}^{\infty}\frac{1}{\sqrt{2\pi t}}e^{-u^2/2t}du$$
By using the Jacobian formula, we can derive the joint density of $$M(t)$$ and $$W(t) = M(t) - X(t)$$, which we denote by $$f_{MW}$$. Thus
$$f_W(w) = 2\int_0^{\infty}f_{MW}(m, w)dm \\ P\{M(t) > a | W(t) = 0\} = 1 - \int_0^a \frac{f_{MW}(m, 0)}{f_W(0)}dm$$
The last expression can be computed and equals $$e^{-a^2/2t}$$.

8.10 Compute the density function of $$T_x$$, the time until Brownian motion hits $$x$$.

\begin{align} f_{T_x}(t) &= F_{T_x}^{\prime}(t) = (\frac{2}{\sqrt{2\pi}}\int_{|x|/\sqrt{t}}^{\infty}e^{-y^2/2}dy)^{\prime} \\ &= \frac{2}{\sqrt{2\pi}} \cdot e^{-x^2/2t} \cdot \frac{|x|}{2}t^{-3/2}\\ &= \frac{|x|}{\sqrt{2\pi}}t^{-3/2}e^{-x^2/2t} \end{align}
(The minus sign from differentiating the lower limit $$|x|t^{-1/2}$$ cancels the minus sign from the lower limit of integration, so the density is positive, as it must be.)
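As a deterministic sanity check, integrating the first-passage density $$f_{T_x}(t) = \frac{|x|}{\sqrt{2\pi t^3}}e^{-x^2/2t}$$ over $$(0, T]$$ should recover the reflection-principle value $$P\{T_x \leq T\} = 2P\{X(T) > |x|\}$$ (the values of $$x$$ and $$T$$ below are arbitrary):

```python
import math

# Numerical check: integrating f(t) = |x| / sqrt(2*pi*t^3) * exp(-x^2/(2t))
# over (0, T] should give P{T_x <= T} = 2 * P{X(T) > |x|} = erfc(|x|/sqrt(2T)).
x, T = 1.0, 4.0
n = 100_000
dt = T / n

cdf_from_density = 0.0
for i in range(n):
    t = (i + 0.5) * dt   # midpoint rule; the integrand vanishes as t -> 0+
    f = abs(x) / math.sqrt(2 * math.pi * t ** 3) * math.exp(-x * x / (2 * t))
    cdf_from_density += f * dt

cdf_reflection = math.erfc(abs(x) / math.sqrt(2 * T))
```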

8.11 Let $$T_1$$ denote the largest zero of $$X(t)$$ that is less than $$t$$ and let $$T_2$$ be the smallest zero greater than $$t$$. Show that
(a) $$P\{T_2 < s\} = (2/\pi)\arccos\sqrt{t/s}, s> t$$.
(b) $$P\{T_1 < s, T_2 > y\} = (2/\pi)\arcsin\sqrt{s/y}, s < t< y$$.

(a) \begin{align} P\{T_2 < s\} &= 1 - P\{\text{no zeros in } (t, s)\} \\ &= 1 - \frac{2}{\pi}\arcsin\sqrt{t/s} \\ &= (2/\pi)\arccos\sqrt{t/s} \end{align}
(b) $$P\{T_1 < s, T_2 > y\} = P\{\text{no zeros in } (s, y)\} = \frac{2}{\pi}\arcsin\sqrt{s/y}$$

8.12 Verify the formulas given in (8.3.4) for the mean and variance of $$|X(t)|$$.

$$f_Z(y) = (\frac{2}{\sqrt{2\pi t}}\int_{-\infty}^y e^{-x^2/2t}dx - 1)^{\prime} = \frac{2}{\sqrt{2\pi t}}e^{-y^2/2t}\\ E[Z(t)] = \int_{0}^{\infty}yf_Z(y)dy = -\frac{2t}{\sqrt{2\pi t}}e^{-y^2/2t}\Big|_0^{\infty} = \sqrt{2t/\pi} \\ Var(Z(t)) = E[Z^2(t)] - E^2[Z(t)] = E[X^2(t)] - E^2[Z(t)] = (1 - \frac{2}{\pi})t$$
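Both moments can be checked by simulating $$|X(t)|$$ directly; a Monte Carlo sketch with the arbitrary choice $$t = 2$$:

```python
import numpy as np

# Monte Carlo check of (8.3.4): for Z(t) = |X(t)|,
# E[Z(t)] = sqrt(2t/pi) and Var(Z(t)) = (1 - 2/pi) * t.  t = 2 is arbitrary.
rng = np.random.default_rng(2)
t = 2.0
z = np.abs(rng.normal(0.0, np.sqrt(t), 500_000))

mean_est, var_est = float(z.mean()), float(z.var())
mean_target = float(np.sqrt(2 * t / np.pi))
var_target = float((1 - 2 / np.pi) * t)
```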

8.13 For Brownian motion with drift coefficient $$\mu$$, show that for $$x>0$$
$$P\{\max_{0 \leq s \leq h} |X(s)| > x\} = o(h).$$

8.18 Let $$\{X(t), t \geq 0\}$$ be a Brownian motion with drift coefficient $$\mu, \mu < 0$$, which is not allowed to become negative. Find the limiting distribution of $$X(t)$$.

8.19 Consider Brownian motion with reflecting barriers of $$-B$$ and $$A, A >0, B > 0$$. Let $$p_t(x)$$ denote the density function of $$X_t$$.
(a) Compute a differential equation satisfied by $$p_t(x)$$.
(b) Obtain $$p(x) = \lim_{t \to \infty} p_t(x)$$.

8.20 Prove that, with probability 1, for Brownian motion with drift $$\mu$$.
$$\frac{X(t)}{t} \to \mu, \quad \text{ as } t \to \infty$$

8.21 Verify that if $$\{B(t), t \geq 0\}$$ is standard Brownian motion then $$\{Y(t), t \geq 0\}$$ is a martingale with mean 1, where $$Y(t) = \exp\{cB(t) - c^2t/2\}$$.

\begin{align} E[Y(t)] &= \int_{-\infty}^{\infty} \exp\{cx - c^2t/2\}\frac{1}{\sqrt{2\pi t}}\exp\{-x^2/2t\}dx\\ &= \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi t}}\exp\{-(x-ct)^2/2t\}dx = 1 \end{align} \begin{align} E[Y(t)|Y(u), 0 \leq u \leq s] &= E[Y(s)\exp\{c(B(t) - B(s)) - c^2(t-s)/2\}|Y(u), 0 \leq u \leq s]\\ &= Y(s) \cdot E[\exp\{c(B(t) - B(s)) - c^2(t-s)/2\}] \\ &= Y(s) \cdot E[Y(t-s)] = Y(s) \end{align}
where the second equality uses that the increment $$B(t) - B(s)$$ is independent of the past.
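A quick Monte Carlo check that $$E[Y(t)] = 1$$, with the arbitrary choice $$c = t = 1$$:

```python
import numpy as np

# Monte Carlo check that E[exp{c*B(t) - c^2*t/2}] = 1.  Since B(t) ~ N(0, t),
# exp{c*B(t) - c^2*t/2} is lognormal with mean exactly 1.  c = t = 1 is
# an arbitrary choice.
rng = np.random.default_rng(3)
c, t = 1.0, 1.0
b_t = rng.normal(0.0, np.sqrt(t), 1_000_000)
mean_est = float(np.mean(np.exp(c * b_t - c * c * t / 2)))
```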

8.22 In Problem 8.16, find $$Var(T_a)$$ by using a martingale argument.

8.23 Show that
$$p(x,t;y) \equiv \frac{1}{\sqrt{2\pi t}}e^{-(x – y – \mu t)^2/2t}$$
satisfies the backward and forward diffusion equations (8.5.1) and (8.5.2).

Just do it : )

8.24 Verify Equation (8.7.2)

Let $$f(s, y) = \phi(se^{-\alpha y}) - 1$$, then
\begin{align} E[X(t)] &=\frac{d}{ds}E[\exp\{sX(t)\}]\Big|_{s=0} \\ &= \exp\{\lambda\int_0^t f(0, y)dy\} \lambda \int_0^t \frac{d}{ds}f(s, y)\Big|_{s=0} dy \\ &= \lambda E[X](1 - e^{-\alpha t})/\alpha\\ Var(X(t)) &= E[X^2(t)] - E^2[X(t)] \\ &= \frac{d^2}{ds^2}E[\exp\{sX(t)\}]\Big|_{s=0} - E^2[X(t)] \\ &= \lambda E[X^2](1 - e^{-2\alpha t})/2\alpha \end{align}

8.25 Verify that $$\{X(t) = N(t + L) – N(t), t \geq 0\}$$ is stationary when $$\{N(t)\}$$ is a Poisson process.

For each $$t$$, $$X(t) = N(t + L) - N(t)$$ has the same distribution as $$N(L)$$, by stationary increments. Thus
$$E[X(t)] = E[N(L)] = \lambda L$$
and for $$s \geq 0$$ the two increments overlap on an interval of length $$(L-s)^+ = \max(L-s, 0)$$, so
$$Cov(X(t), X(t+s)) = \lambda(L-s)^+$$
which depends only on $$s$$. Stationarity of all joint distributions follows in the same way from the stationary independent increments of $$\{N(t)\}$$.

8.26 Let $$U$$ be uniformly distributed over $$(-\pi, \pi)$$, and let $$X_n = cos(nU)$$. By using trigonometric identity
$$\cos x \cos y = \frac{1}{2} [\cos(x+y) + \cos(x-y)]$$
verify that $$\{X_n, n \geq 1\}$$ is a second-order stationary process.

\begin{align} E[X_n] &= \frac{1}{2\pi}\int_{-\pi}^{\pi} \cos(nu)du = 0\\ Cov(X_{n+L}, X_n) &= E[X_{n+L}X_n] - E[X_{n+L}]E[X_n] \\ &= \frac{1}{2}E[X_{2n+L} + X_L] = \frac{1}{2}E[X_L] \end{align}
which equals $$0$$ for every lag $$L \geq 1$$ and $$1/2$$ for $$L = 0$$ (where $$X_0 \equiv 1$$). Since the mean is constant and the covariance depends only on the lag, the process is second-order stationary.
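These moments are easy to verify by simulation: the mean is 0, every positive lag gives covariance 0, and the lag-0 covariance (the variance) is $$1/2$$:

```python
import numpy as np

# Monte Carlo check: for X_n = cos(n*U) with U ~ Uniform(-pi, pi),
# E[X_n] = 0, Cov(X_{n+L}, X_n) = 0 for every lag L >= 1, and
# Var(X_n) = 1/2, so the covariance depends only on the lag.
rng = np.random.default_rng(4)
u = rng.uniform(-np.pi, np.pi, 500_000)

x1, x2, x3 = np.cos(u), np.cos(2 * u), np.cos(3 * u)
mean_est = float(x1.mean())
cov_lag1 = float(np.mean(x2 * x1) - x2.mean() * x1.mean())
cov_lag2 = float(np.mean(x3 * x1) - x3.mean() * x1.mean())
var_est = float(x2.var())
```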

8.27 Show that
$$\sum_{i=1}^n \frac{R(i)}{n} \to 0 \quad \text{implies} \quad {\sum\sum}_{i < j < n}\frac{R(j-i)}{n^2} \to 0$$
thus completing the proof of Proposition 8.8.1.

8.28 Prove the Cauchy-Schwarz inequality:
$$(E[XY])^2 \leq E[X^2]E[Y^2]$$
(Hint: Start with the inequality $$2|xy| \leq x^2 + y^2$$ and then substitute $$X/\sqrt{E[X^2]}$$ for $$x$$ and $$Y/\sqrt{E[Y^2]}$$ for $$y$$)

Since $$2xy \leq x^2 + y^2$$, then
\begin{align} 2\frac{X}{\sqrt{E[X^2]}}\frac{Y}{\sqrt{E[Y^2]}} &\leq \frac{X^2}{E[X^2]} + \frac{Y^2}{E[Y^2]} \\ E[2\frac{X}{\sqrt{E[X^2]}}\frac{Y}{\sqrt{E[Y^2]}}] &\leq E[\frac{X^2}{E[X^2]} + \frac{Y^2}{E[Y^2]}]\\ 2\frac{E[XY]}{\sqrt{E[X^2]E[Y^2]}} &\leq 2\\ (E[XY])^2 &\leq E[X^2]E[Y^2] \end{align}
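The same argument can be sanity-checked numerically on a finite sample space with equal weights (the sample values below are arbitrary):

```python
import numpy as np

# Deterministic check of the Cauchy-Schwarz inequality
# (E[XY])^2 <= E[X^2] E[Y^2] on a 5-point sample space with equal weights.
x = np.array([1.0, -2.0, 3.5, 0.5, -1.2])
y = np.array([0.3, 1.1, -0.7, 2.0, 0.9])

lhs = float(np.mean(x * y) ** 2)                  # (E[XY])^2
rhs = float(np.mean(x ** 2) * np.mean(y ** 2))    # E[X^2] * E[Y^2]
```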

8.29 For a second-order stationary process with mean $$\mu$$ for which $$\sum_{i=0}^{n-1}R(i)/n \to 0$$, show that for any $$\varepsilon > 0$$
$$P\{|\bar{X}_n - \mu| > \varepsilon \} \to 0 \quad \text{as } n \to \infty$$

### Solutions to Stochastic Processes Ch.7

Solutions to *Stochastic Processes*, Sheldon M. Ross, Second Edition (pdf)

7.1 Consider the following model for the flow of water in and out of a dam. Suppose that, during day $$n$$, $$Y_n$$ units of water flow into the dam from outside sources such as rainfall and river flow. At the end of each day, water is released from the dam according to the following rule: If the water content of the dam is greater than $$a$$, then the amount $$a$$ is released. If it is less than or equal to $$a$$, then the total contents of the dam are released. The capacity of the dam is $$C$$, and once at capacity any additional water that attempts to enter the dam is assumed lost. Thus, for instance, if the water level at the beginning of day $$n$$ is $$x$$, then the level at the end of the day (before any water is released) is $$\min(x + Y_n, C)$$. Let $$S_n$$ denote the amount of water in the dam immediately after the water has been released at the end of day $$n$$. Assuming that the $$Y_n, n \geq 1$$, are independent and identically distributed, show that $$\{S_n, n \geq 1\}$$ is a random walk with reflecting barriers at 0 and $$C-a$$.

7.2 Let $$X_1, \dots, X_n$$ be equally likely to be any of the $$n!$$ permutations of $$(1,2,\dots, n)$$. Argue that
$$P\{\sum_{j=1}^njX_j \leq a\} = P\{\sum_{j=1}^njX_j\geq n(n+1)^2/2 -a\}$$

\begin{align} P\{\sum_{j=1}^njX_j \leq a \} &= P\{nS_n – \sum_{i=1}^{n-1}S_i \leq a\} \\ &= P\{\sum_{i=1}^nS_i \geq (n+1)S_n – a\} \\ &= P\{\sum_{j=1}^njX_j\geq n(n+1)^2/2 -a\} \end{align}
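For small $$n$$ the identity can be verified by brute-force enumeration over all $$n!$$ permutations; a sketch with $$n = 4$$:

```python
from itertools import permutations

# Exhaustive check of the symmetry for n = 4: the number of permutations
# with sum_j j*X_j <= a equals the number with sum_j j*X_j >= n(n+1)^2/2 - a.
n = 4
total = n * (n + 1) ** 2 // 2   # n(n+1)^2/2 = 50 for n = 4
values = [sum(j * x for j, x in zip(range(1, n + 1), p))
          for p in permutations(range(1, n + 1))]

ok = all(sum(v <= a for v in values) == sum(v >= total - a for v in values)
         for a in range(min(values), max(values) + 1))
```

The underlying bijection is $$X_j \mapsto n + 1 - X_j$$, which sends $$\sum_j jX_j$$ to $$n(n+1)^2/2 - \sum_j jX_j$$.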

7.3 For the simple random walk compute the expected number of visits to state $$k$$.

Suppose $$p > 1/2$$ and the walk starts at state $$i$$ (for $$p = 1/2$$ the walk is recurrent, so the expected number of visits is infinite). When $$i \leq k$$, the walk reaches $$k$$ with probability
$$p_{ik} = 1$$
and the probability of returning to $$k$$ after a visit is $$f_{kk} = 2 - 2p$$, so the expected number of visits is
$$E = \frac{1}{1 - f_{kk}} = \frac{1}{2p - 1}$$
When $$i > k$$,
$$p_{ik} = (\frac{1-p}{p})^{i - k}, \quad E = p_{ik}\cdot\frac{1}{2p-1}$$
The case $$p < 1/2$$ follows by symmetry.
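A Monte Carlo sketch with assumed parameters $$p = 0.7$$ and $$i = k = 0$$: counting the visit at time 0, the expected number of visits to 0 is $$1/(1 - f_{00}) = 1/(2p - 1) = 2.5$$.

```python
import numpy as np

# Monte Carlo check with p = 0.7, starting at i = k = 0: counting the visit
# at time 0, the expected number of visits to 0 is 1/(2p - 1) = 2.5.
# Truncating paths at 1000 steps loses only a negligible tail, because the
# positive drift carries the walk away from 0.
rng = np.random.default_rng(5)
p, n_steps, n_walks = 0.7, 1000, 5000

steps = np.where(rng.random((n_walks, n_steps)) < p, 1, -1)
paths = np.cumsum(steps, axis=1)
mean_visits = float(np.mean(1 + np.sum(paths == 0, axis=1)))  # +1 for time 0
```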

7.4 Let $$X_1, X_2, \dots, X_n$$ be exchangeable. Compute $$E[X_1|X_{(1)}, X_{(2)}, \dots, X_{(n)}]$$, where $$X_{(1)} \leq X_{(2)} \leq \dots \leq X_{(n)}$$ are the $$X_i$$ in ordered arrangement.

Since $$X_i$$ is exchangeable, $$X_1$$ can be any of $$X_{(i)}$$ with equal probability. Thus,
$$E[X_1|X_{(1)}, X_{(2)}, \dots, X_{(n)}] = \frac{1}{n}\sum_{i=1}^nX_{(i)}$$
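The answer can be checked exactly by realizing the exchangeable vector as a uniformly random arrangement of a fixed set of values (the values below are arbitrary):

```python
from itertools import permutations

# Exact check: condition on the order statistics by averaging X_1 over all
# equally likely arrangements of fixed values.  Since X_1 is each order
# statistic with probability 1/n, the conditional mean is their average.
values = (1.0, 4.0, 9.0, 16.0)
perms = list(permutations(values))

e_x1 = sum(p[0] for p in perms) / len(perms)
avg_order_stats = sum(values) / len(values)
```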

7.6 An ordinary deck of cards is randomly shuffled and then the cards are exposed one at a time. At some time before all the cards have been exposed you must say “next”, and if the next card exposed is a spade then you win and if not then you lose. For any strategy, show that at the moment you call “next” the conditional probability that you win is equal to the conditional probability that the last card is spade. Conclude from this that the probability of winning is 1/4 for all strategies.

Let $$X_n$$ indicate whether the $$n$$th card is a spade and let $$Z_n$$ be the proportion of spades among the cards remaining after the $$n$$th card. Then $$E|Z_n| < \infty$$ and
$$E[Z_{n+1}|Z_1, \dots , Z_n] = \frac{(52 -n)Z_n – 1}{52 -n-1}Z_n + \frac{(52-n)Z_n}{52-n-1}(1 – Z_n) = Z_n$$
Hence $$Z_n$$ is a martingale.
Note that $$X_{52} = Z_{51}$$. Thus
\begin{align} E[X_{n+1}|X_1, \dots, X_n]&= E[X_{n+1}|Z_1, \dots, Z_n] = Z_n\\ &= E[Z_{51}|Z_1, \dots, Z_n] = E[X_{52}|X_1, \dots, X_n] \end{align}
Finally, let $$N$$ be the stopping time corresponding to saying “next” for a given strategy.
\begin{align} P\{\text{Win}\} &= E[X_{N+1}] = E[E[X_{N+1}|N]] \\ &= E[Z_N] = E[Z_1] = 1/4 \end{align}
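A Monte Carlo sketch with one particular (hypothetical) strategy: call "next" as soon as the spade proportion of the remaining deck exceeds 30%, and otherwise before the last card. By the martingale argument the win probability should still be $$1/4$$.

```python
import random

# Monte Carlo check: under the (arbitrary) strategy "call 'next' once the
# spade proportion of the remaining deck exceeds 30%, else before the last
# card", the win probability should still be 13/52 = 1/4.
random.seed(6)
n_games = 100_000
wins = 0
for _ in range(n_games):
    deck = [1] * 13 + [0] * 39          # 1 marks a spade
    random.shuffle(deck)
    spades_left = 13
    for i in range(52):
        if spades_left / (52 - i) > 0.30 or i == 51:
            wins += deck[i]             # the card exposed after calling
            break
        spades_left -= deck[i]

win_rate = wins / n_games
```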

7.7 Argue that the random walk for which $$X_i$$ only assumes the values $$0, \pm 1, \dots, \pm M$$ and $$E[X_i] = 0$$ is null recurrent.

7.8 Let $$S_n, n \geq 0$$ denote a random walk for which
$$\mu = E[S_{n+1} – S_n] \neq 0$$
Let, for $$A >0, B > 0$$,
$$N = min\{n: S_n \geq A \text{ or } S_n \leq -B\}$$
Show that $$E[N] < \infty$$. (Hint: Argue that there exists a value $$k$$ such that $$P\{S_k > A +B\} > 0$$. Then show that $$E[N] \leq kE[G]$$, where $$G$$ is an appropriately defined geometric random variable.)

Suppose $$\mu > 0$$, and let $$k > (A+B)/\mu$$, then
\begin{align} k\mu - A - B &= E[S_k - A - B] \\ &= E[S_k - A - B|S_k > A + B]P\{S_k > A+B\} \\&+ E[S_k - A - B|S_k \leq A + B]P\{S_k \leq A+B\}\\ &\leq E[S_k - A - B|S_k > A + B]P\{S_k > A+B\} \end{align} Thus, for any $$k > (A+B)/\mu$$ we have $$p \equiv P\{S_k > A +B\} > 0$$. Let $$Y_i = \sum_{j = ik+1}^{(i+1)k} X_j$$; the $$Y_i$$ are i.i.d. with $$P\{Y_i > A + B\} = p$$, and if any $$Y_i$$ exceeds $$A+B$$ then the walk must have left $$(-B, A)$$ by time $$(i+1)k$$. Hence $$N \leq kG$$ where $$G$$ is geometric with success probability $$p$$, and $$E[N] \leq kE[G] = k/p < \infty$$.

7.10 In the insurance ruin problem of Section 7.4 explain why the company will eventually be ruined with probability 1 if $$E[Y] \geq cE[X]$$.

7.11 In the ruin problem of Section 7.4 let $$F$$ denote the interarrival distribution of claims and let $$G$$ be the distribution of the size of a claim. Show that $$p(A)$$, the probability that a company starting with $$A$$ units of assets is ever ruined, satisfies
$$p(A) = \int_0^{\infty}\int_0^{A + ct}p(A + ct -x)dG(x)dF(t) + \int_0^{\infty}\bar{G}(A+ct)dF(t)$$

Condition on the first claim, then
\begin{align} p(A) &= P\{\text{ruined at first claim}\} + P\{\text{ruined after first claim}\} \\ &= \int_0^{\infty}\bar{G}(A+ct)dF(t) + \int_0^{\infty}\int_0^{A + ct}p(A + ct -x)dG(x)dF(t) \end{align}

7.12 For a random walk with $$\mu = E[X] > 0$$ argue that, with probability 1,
$$\frac{u(t)}{t} \to \frac{1}{\mu} \quad \text{as } t \to \infty$$
where $$u(t)$$ equals the number of $$n$$ for which $$0 \leq S_n \leq t$$.

7.13 Let $$S_n = \sum_{i=1}^n X_i$$ be a random walk and let $$\lambda_i, i > 0$$, denote the probability that a ladder height equals $$i$$; that is, $$\lambda_i = P\{\text{first positive value of } S_n \text{ equals } i\}$$.
(a) Show that if
$$P\{X_i = j\} = \left\{\begin{array}{ll} q \quad j = -1 \\ \alpha_j \quad j \geq 1 \\ \end{array}\right. \\ q + \sum_{j=1}^{\infty} \alpha_j = 1$$
then $$\lambda_i$$ satisfies
$$\lambda_i = \alpha_i + q(\lambda_{i+1} + \lambda_1\lambda_i) \quad i > 0$$
(b) If $$P\{X_i = j\} = 1/5, j = -2,-1,0,1,2$$, show that
$$\lambda_1 = \frac{1+\sqrt{5}}{3+\sqrt{5}} \quad \lambda_2 = \frac{2}{3+\sqrt{5}}$$

7.14 Let $$S_n, n\geq 0$$, denote a random walk in which $$X_i$$ has distribution $$F$$. Let $$G(t,s)$$ denote the probability that the first value of $$S_n$$ that exceeds $$t$$ is less than or equal to $$t+s$$. That is,
$$G(t,s) = P\{\text{first sum exceeding } t \text{ is } \leq t+s\}$$
Show that
$$G(t, s) = F(t + s) – F(t) + \int_{-\infty}^t G(t-y, s)dF(y)$$

$$S_n|X_1$$ is distributed as $$X_1 + S_{n-1}$$. Thus if $$A = \{\text{first sum exceeding } t \text{ is } \leq t + s\}$$,
\begin{align} G(t,s) &\equiv P\{A\} = E[P\{A|X_1\}] \\ &= F(t+s) – F(t) + \int_{-\infty}^t G(t-y, s)dF(y) \end{align}