Walking randomly again

This is a follow-up to An exercise in walking randomly, but can be read independently.

Suppose I randomly and blindly walk back and forth along this hallway, which may or may not happen in reality. The safest place to start is in the middle, next to the department’s main office. How long will it take before I bump into a wall at one of the ends?

Carnegie Building

(Visitors are advised to not walk blindly near the ongoing construction.)

Formally, this is a random walk: I start at the midpoint of interval [-n,n] and move to the left or right with probability 1/2. The walk ends when I reach either n or -n. What is the probability that the walk will last exactly k steps?

The answer (found in the earlier post) is given by the coefficient of y^k in the power series for \mathcal T_n(1/y)^{-1}, where \mathcal T_n is the Chebyshev polynomial of degree n. The plot below shows the exit time distribution for n=10, with an extraneous but nice pattern.
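For readers who want the numbers without expanding the Chebyshev series, the distribution can also be computed directly by propagating probability mass step by step. A minimal sketch (the function name is my own choice), using exact rational arithmetic:

```python
from fractions import Fraction

def exit_time_distribution(n, kmax):
    """Exact P(exit at time k), k = 1..kmax, for a symmetric walk
    started at 0 on [-n, n], by propagating probability mass."""
    # probs[x] = probability of being at interior position x at the current time
    probs = {0: Fraction(1)}
    dist = {}
    for k in range(1, kmax + 1):
        new = {}
        exited = Fraction(0)
        for x, p in probs.items():
            for step in (-1, 1):
                y = x + step
                if y in (-n, n):
                    exited += p / 2      # this half of the mass hits a wall
                else:
                    new[y] = new.get(y, Fraction(0)) + p / 2
            # positions ±n are absorbing, so exited mass is never re-added
        dist[k] = exited
        probs = new
    return dist
```

For n=2 this reproduces the geometric distribution P(T=2m)=2^{-m}, which is a handy sanity check.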

Exit time distribution

In a comment, Lianxin He pointed out an alternative approach via the reflection principle: it gives the answer as a sum of binomial coefficients, which can be truncated to yield a simple and effective approximation. I describe his approach below.

To begin with, assume that k+n is even; otherwise the probability is zero. Out of 2^k possible walks, \displaystyle \binom{k}{(k+n)/2} take n more steps to the right than to the left. Dividing this binomial coefficient by 2^k and multiplying by two (to account for both exits), we conclude that \displaystyle p_0(k)=2^{1-k} \binom{k}{(k+n)/2} is the probability of reaching an end of the hallway at time k. The graph of p_0 is in red below:

One binomial coefficient
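A direct evaluation of p_0 makes the defect easy to see numerically. A sketch (function name is mine; `math.comb` is assumed available, i.e. Python 3.8+):

```python
from math import comb

def p0(k, n):
    """Crude first estimate: walks ending n steps to the right of the
    start, doubled for the symmetric exit; earlier wall hits ignored."""
    if (k + n) % 2:
        return 0.0  # parity mismatch: exit at time k is impossible
    return 2.0 ** (1 - k) * comb(k, (k + n) // 2)
```

For n=2 the true probability of exiting at time 4 is 1/4, but p0 returns 1/2: the walks that already hit a wall at time 2 are being counted again.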

This is not good at all. We neglected to exclude the walks that hit a wall before time k. How many of them are there? Let’s consider only the walks that end at n at time k. Formally, S_k=n, where S_j denotes the position at time j.

  1. The walks that hit -n before time k can be reflected about -n the first time they get there. The reflection puts them in bijection with the walks such that S_k=-3n. There are \displaystyle \binom{k}{(k+3n)/2} of those.
  2. If a walk hits n before time k, we flip a fair coin deciding whether to reflect it about n. As a result of our intervention, such walks will have S_{k-1}=n-1 or S_{k-1}=n+1 with equal probability. But on the other hand, our intervention did not change the distribution of walks. Hence, the number of walks hitting n before time k is twice the number of walks with S_{k-1}=n+1. Namely, \displaystyle 2\binom{k-1}{(k+n)/2}.
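The coin-flip count in 2) can be verified by brute force for small k: among all walks of length k ending at n, count those that visit n strictly before time k, and compare with 2\binom{k-1}{(k+n)/2}. A sketch (the function name is my own):

```python
from itertools import product
from math import comb

def count_early_hits(k, n):
    """Among walks of length k with S_k = n, count those that
    already visited n strictly before time k (brute force, 2^k walks)."""
    hits = 0
    for steps in product((-1, 1), repeat=k):
        pos, early = 0, False
        for j, s in enumerate(steps, start=1):
            pos += s
            if pos == n and j < k:
                early = True
        if pos == n and early:
            hits += 1
    return hits
```

For k=4, n=2 the two offending walks are (+1,+1,-1,+1) and (+1,+1,+1,-1), matching 2\binom{3}{3}=2.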

Excluding both types 1) and 2) counted above, we arrive at the new probability estimate

\displaystyle p_1(k)=2^{1-k} \left\{\binom{k}{(k+n)/2}-2\binom{k-1}{(k+n)/2}-\binom{k}{(k+3n)/2}\right\}
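In code, the refined estimate is one line longer than p_0. A sketch (my own naming; note that `math.comb` conveniently returns 0 when the lower index exceeds the upper one, so out-of-range terms vanish on their own):

```python
from math import comb

def p1(k, n):
    """Refined estimate: subtract walks that hit a wall before time k
    (reflection about -n, plus the coin-flip count at n)."""
    if (k + n) % 2:
        return 0.0
    return 2.0 ** (1 - k) * (comb(k, (k + n) // 2)
                             - 2 * comb(k - 1, (k + n) // 2)
                             - comb(k, (k + 3 * n) // 2))
```

For n=2, p1 gives the exact value 1/4 at k=4, but at k=8 it returns 0.046875 against the true 0.0625, anticipating the undercounting discussed next.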

Let’s see how it works…

Three binomial coefficients

Looks very good at the beginning, but for large k we are seriously undercounting. Indeed, the events 1) and 2) described above are not mutually exclusive. For example, a walk could hit n, then -n, and then arrive at n again at time k. At present we are excluding such a walk twice. Let’s fix this by adding them back.

  • The walks that hit n and then -n before time k can be reflected twice, putting them in bijection with the walks such that S_k=5n. There are \displaystyle \binom{k}{(k+5n)/2} of those.
  • If a walk hits -n and then n before time k, we reflect at -n and then flip a coin when the walk reaches -3n. Following the reasoning in 2) above, the count is \displaystyle 2\binom{k-1}{(k+3n)/2}.

Our new estimate, then, is \displaystyle p_2(k)=p_1(k)+2^{1-k} \left\{2\binom{k-1}{(k+3n)/2} + \binom{k}{(k+5n)/2}\right\}. Here it is.

Five binomial coefficients

Looks good! But of course now we overcount the walks that hit the walls three times before time k. Fortunately, the pattern is now clear: the next approximation will be \displaystyle p_3(k)=p_2(k)-2^{1-k} \left\{ 2\binom{k-1}{(k+5n)/2} + \binom{k}{(k+7n)/2}\right\}.
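The general r-th approximation follows the same pattern, and since `math.comb` returns 0 once the lower index exceeds the upper one, taking r large enough yields the exact probability. A sketch (function name is mine):

```python
from math import comb

def p_r(k, n, r):
    """r-th reflection approximation to the exit probability at time k.
    Binomial terms with an oversized lower index are 0, so for large
    enough r this is the exact (finite) alternating series."""
    if (k + n) % 2:
        return 0.0
    total = comb(k, (k + n) // 2)
    for j in range(1, r + 1):
        total += (-1) ** j * (2 * comb(k - 1, (k + (2 * j - 1) * n) // 2)
                              + comb(k, (k + (2 * j + 1) * n) // 2))
    return 2.0 ** (1 - k) * total
```

For n=2 and k=8 the truncation p_1 undercounts (0.046875), while p_3 already equals the exact value 2^{-4}=0.0625.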

Seven binomial coefficients

By now you can’t tell this approximation from the exact distribution. To summarize (pun intended), the exact probability of exiting (-n,n) at time k is

\displaystyle p(k)=2^{1-k}\left\{\binom{k}{(k+n)/2} +\sum_{r=1}^\infty (-1)^r \left[2\binom{k-1}{(k+(2r-1)n)/2}+\binom{k}{(k+(2r+1)n)/2}\right] \right\}

where the series is actually finite: the binomial coefficients vanish as soon as (2r-1)n exceeds k.

2 thoughts on “Walking randomly again”

  1. Thanks for this post! It explains everything very clearly.
    I just have a follow-up thought on your power series approach. You calculated the Laplace transform E(e^{-sT}) for s \ge 0; letting f(T) denote the probability density function,
    \displaystyle  E(e^{-sT}) = \int_0^\infty e^{-sT} f(T) dT = \frac{1}{T_N(e^s)}
    How hard is it to perform the reverse transformation on Chebyshev polynomials to get f(T) back?
    Also from the moment generating function for t\le 0,
    \displaystyle E(e^{tT}) = \frac{1}{T_N(e^{-t})}
    We get that the i-th moment equals
    \displaystyle \frac{ d^i E(e^{tT})}{d t^i} (0) = \frac{ d^i}{d t^i} |_{t=0} \frac{1}{\cosh( N \mathrm{arccosh}\, (e^{-t}))}
    For the first moment (mean),
    \displaystyle  \frac{ d E(e^{tT})}{d t} (0) =  \frac{-\sinh( N \mathrm{arccosh}\, e^0) N \frac{1}{\sinh (\mathrm{arccosh}\,(e^0))} (-e^0)}{[\cosh( N \mathrm{arccosh}\, (e^{-0}))]^2} = N^2
    The additional N comes from l’Hôpital’s rule in evaluating \displaystyle  \frac{-\sinh( N \mathrm{arccosh}\, e^0)}{\sinh (\mathrm{arccosh}\,(e^0))}=N. The resulting mean of N^2 agrees with our martingale argument.
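The mean E[T]=N^2 claimed in the comment can be double-checked without any transforms: the mean exit time E_x satisfies E_x = 1 + (E_{x-1}+E_{x+1})/2 on the interior, with E_{\pm N}=0, a tridiagonal linear system. A sketch solving it exactly (solver layout and names are my own):

```python
from fractions import Fraction

def mean_exit_time(n):
    """Solve E_x = 1 + (E_{x-1} + E_{x+1})/2 for x = -n+1..n-1 with
    E_{-n} = E_n = 0 (Thomas algorithm, exact rationals); return E_0."""
    m = 2 * n - 1                      # number of interior points
    # system: -E_{x-1}/2 + E_x - E_{x+1}/2 = 1
    a = [Fraction(-1, 2)] * m          # sub-diagonal
    b = [Fraction(1)] * m              # diagonal
    c = [Fraction(-1, 2)] * m          # super-diagonal
    d = [Fraction(1)] * m              # right-hand side
    for i in range(1, m):              # forward elimination
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    e = [Fraction(0)] * m              # back substitution
    e[-1] = d[-1] / b[-1]
    for i in range(m - 2, -1, -1):
        e[i] = (d[i] - c[i] * e[i + 1]) / b[i]
    return e[n - 1]                    # entry corresponding to x = 0
```

The exact solution of the recurrence is E_x = n^2 - x^2, so the solver should return n^2 from the midpoint.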
