Serieus Business


Everybody knows the Fibonacci series. It begins with a 0 and a 1, and each subsequent element is generated by summing the previous two. You can find the first few elements of the sequence easily, even by hand: 0, 1, 1, 2, 3, 5, 8, 13, 21, and so on. But what if you wanted the 100th element? Easy: write a computer program to do it for you. But then what of the thousandth element? Or the ten thousandth? Using this basic recursive definition, computing elements of the series way down the line becomes impossibly computationally intensive, even for computers. There must be a better way of finding elements of this, and other similar, series.
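To make the problem concrete, here is a quick Python sketch (the function names are mine, not from any library) comparing a direct translation of the recursive definition with a simple linear-time loop. The recursive version recomputes the same values over and over, so its runtime blows up exponentially:

```python
def fib_naive(n):
    # Direct translation of the recursive definition.
    # Each call spawns two more, most of which redo work
    # already done elsewhere, so runtime grows exponentially.
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)

def fib_iter(n):
    # Linear-time loop for comparison: keep only the last two values.
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print(fib_naive(20))   # 6765
print(fib_iter(1000))  # the 1000th element, instantly
```

Even `fib_naive(50)` is already painful on a fast machine; `fib_naive(100)` is hopeless, which is exactly the problem this post works around.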

The Process

Consider the following matrix:

A = \left[\begin{array}{cc} 1 & 1 \\ 1 & 0 \end{array} \right]

Not a very interesting matrix, on its own, but let’s pretend we have a column matrix, \mathbf{x}, composed of two consecutive members of the Fibonacci sequence, and then compute A\mathbf{x}.

A\mathbf{x} = \left[\begin{array}{cc} 1 & 1 \\ 1 & 0 \end{array}\right] \left[\begin{array}{c} F_{i+1} \\ F_{i} \end{array} \right] = \left[\begin{array}{c} F_{i+1} + F_{i} \\ F_{i+1} \end{array} \right] = \left[\begin{array}{c} F_{i+2} \\ F_{i+1} \end{array}\right]
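In code, one step of this shift is tiny. Here is a Python sketch (helper name mine) that stores the window newest-first and applies the multiplication repeatedly:

```python
# One step of the tape: multiply A = [[1, 1], [1, 0]] by the
# window x = (F_{i+1}, F_i), stored newest-first.
def shift(x):
    return (x[0] + x[1], x[0])

x = (1, 0)            # (F_1, F_0)
for _ in range(10):
    x = shift(x)
print(x)              # (F_11, F_10) = (89, 55)
```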

Interesting. Apparently, multiplying \mathbf{x} by the matrix A discards the earlier Fibonacci number and adds a new one. It’s like being presented with an infinite tape of Fibonacci numbers, but only having a small window through which you can see two numbers at a time. Multiplying by A shifts the tape over one place. You could multiply by A as many times as you wish to shift it over to any position you desire. You know what would be great? A simple formula for the k-th power of A, that is, the matrix that does the same thing as multiplying by A k times. Luckily, we can do this if we remember that if A is diagonalizable, P is the diagonalizing matrix and D is the corresponding diagonal matrix, then:

A^k = PD^kP^{-1}

If we want a formula for A^k all we need to do is diagonalize it. So, let’s solve the characteristic polynomial.

\begin{array}{lcl}\left|\lambda I - A \right| & = & 0 \\ \left| \begin{array}{cc} \lambda - 1 & -1 \\ -1 & \lambda \end{array} \right| & = & 0 \\ \lambda^2 - \lambda - 1 & = & 0 \end{array}

The solutions to the characteristic equation are \frac{1 \pm \sqrt{5}}{2}. The positive root I shall denote \phi, because it happens to be the golden ratio, and I shall denote the negative root \sigma because I like using Greek letters.

When you solve for corresponding eigenvectors, you’ll see that \left[\begin{array}{c} \phi \\ 1 \end{array}\right] is an eigenvector for \phi and \left[\begin{array}{c} \sigma \\ 1 \end{array}\right] is an eigenvector for \sigma . So, if we let P = \left[\begin{array}{cc} \phi & \sigma \\ 1 & 1 \end{array}\right] then we know that D = \left[\begin{array}{cc} \phi & 0 \\ 0 & \sigma \end{array} \right]
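As a sanity check, we can verify numerically that these really are eigenpairs of A. This is plain Python with no linear-algebra library; the `matvec` helper is my own name:

```python
import math

# The two eigenvalues of A = [[1, 1], [1, 0]]
phi = (1 + math.sqrt(5)) / 2
sigma = (1 - math.sqrt(5)) / 2

A = [[1, 1], [1, 0]]

def matvec(M, v):
    # Plain 2x2 matrix-vector product
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

for lam in (phi, sigma):
    u = [lam, 1]                      # claimed eigenvector
    Au = matvec(A, u)
    # A u should equal lam * u, up to floating-point error
    assert all(abs(a - lam * b) < 1e-9 for a, b in zip(Au, u))
print("both eigenpairs verified")
```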

Now, let’s go back to the equation from before. We know that A^k = PD^kP^{-1}. So if \mathbf{x} represents the first two elements (the 0th and the 1st) of the Fibonacci sequence, then we can run through our infinite tape to find the k-th and (k+1)-th elements by computing A^k \mathbf{x}, which equals PD^k P^{-1}\mathbf{x}.

After finding P^{-1} and simplifying the expression, you’ll find that
A^k \mathbf{x} = \left[\begin{array}{c} \frac{\phi^{k+1} - \sigma^{k+1}}{\phi - \sigma} \\ \frac{\phi^{k} - \sigma^{k}}{\phi - \sigma} \end{array}\right]

You can confirm this formula works on a calculator yourself. Now, it’s not hard to see that this expression is a bit redundant. We don’t need to calculate two Fibonacci numbers at once; we only need one. Note that if we pay attention to only the bottom element of the matrix, we’ll have an expression for the k-th element of the Fibonacci series. So we really only need that part, and we have a simple, easy-to-calculate expression for Fibonacci numbers. Computers do exponentiation with ease, so anyone could easily compute nearly any Fibonacci number they wanted.
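That bottom entry is exactly the Binet-style formula, and it is one line of Python (names mine). One caveat worth labeling: with double-precision floats, round-off should be expected to creep in somewhere past roughly the 70th element, so this sketch is for modest k:

```python
import math

PHI = (1 + math.sqrt(5)) / 2      # golden ratio
SIGMA = (1 - math.sqrt(5)) / 2    # the other root

def fib_closed(k):
    # Bottom entry of A^k x: (phi^k - sigma^k) / (phi - sigma).
    # Rounding cleans up the floating-point error for modest k.
    return round((PHI**k - SIGMA**k) / (PHI - SIGMA))

print([fib_closed(k) for k in range(10)])  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```

For exact huge values you would want integer matrix powers or exact arithmetic, but the closed form is the point here.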

But now, one is left to ask, what if it’s not Fibonacci numbers you want, but a different sequence? Is there a way to generalize this result? Well of course. This wouldn’t be nearly as fun if there wasn’t.


Consider a series defined like so:
s_n = a_1 s_{n-1} + a_2 s_{n-2} + \cdots + a_m s_{n-m} = \displaystyle \sum_{i=1}^{m} a_i s_{n-i}

The resulting series obviously depends on what the initial values from 0 to m − 1 are set to. Let \mathbf{x} be the column matrix composed of these values, and let’s make a matrix A such that:

A = \left[\begin{array}{ccccc} a_1 & a_2 & \cdots & a_{m-1} & a_m \\ 1 & 0 & \cdots & 0 & 0 \\ 0 & 1 & \cdots & 0 & 0 \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ 0 & 0 & \cdots & 1 & 0 \end{array} \right] = \left[\begin{array}{c|c} \multicolumn{2}{c}{\mathbf{a}} \\ \hline I & \mathbf{0} \end{array}\right]

That is, make it so A has a top row of all the coefficients, in order, used to calculate the next element of the series, has an (m − 1) × (m − 1) identity matrix in the bottom left, and has zeros everywhere else. If you multiply \mathbf{x} by this matrix, the result will have the next element of the sequence on top, all other elements will be shifted down, and the very bottom element will be discarded. Just like before, we are running through an infinite tape containing the elements of the series. Only this time, the window allows us to see not just 2 elements at a time but m elements at once.
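Building this companion-style matrix and stepping the window is straightforward in Python. Here is a sketch (function names mine) using the Tribonacci sequence, s_n = s_{n-1} + s_{n-2} + s_{n-3} starting 0, 0, 1, as the example:

```python
def companion(coeffs):
    # Top row holds the coefficients a_1 ... a_m; below it sits a
    # shifted identity that pushes each element down one slot.
    m = len(coeffs)
    M = [list(coeffs)]
    for i in range(m - 1):
        M.append([1 if j == i else 0 for j in range(m)])
    return M

def matvec(M, v):
    return [sum(a * b for a, b in zip(row, v)) for row in M]

# Tribonacci: s_n = s_{n-1} + s_{n-2} + s_{n-3}, starting 0, 0, 1
A = companion([1, 1, 1])
x = [1, 0, 0]            # window (s_2, s_1, s_0), newest on top
for _ in range(5):
    x = matvec(A, x)
print(x)                 # (s_7, s_6, s_5) = [13, 7, 4]
```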

Now, we have to diagonalize this general matrix. The characteristic equation for A is:

\begin{array}{rcl} \lambda^m - a_1\lambda^{m-1} - a_2\lambda^{m-2} - \cdots - a_m & = & 0 \\ \lambda^m & = & \displaystyle \sum_{i=1}^{m} a_i \lambda^{m-i} \end{array}

Although I won’t prove it here, the proof is easy and can be done with induction. Now all we need to do is find the eigenvectors corresponding to the eigenvalues. Let’s take an eigenvalue of A, \lambda, and make a column matrix \mathbf{u},

\mathbf{u} = \left[\begin{array}{c} \lambda^{m-1} \\ \lambda^{m-2} \\ \vdots \\ 1 \end{array} \right]

It is easy to see that
A \mathbf{u} = \left[ \begin{array}{c} a_1 \lambda^{m-1} + a_2 \lambda^{m-2} + \cdots + a_m \\ \lambda^{m-1} \\ \vdots \\ \lambda \end{array} \right] = \left[ \begin{array}{c} \lambda^{m} \\ \lambda^{m-1} \\ \vdots \\ \lambda \end{array} \right] = \lambda \mathbf{u}

which means that \mathbf{u} is an eigenvector corresponding to \lambda. Using this fact, if \lambda_i represents the i-th eigenvalue, we can create a diagonalizing matrix P (assuming the m eigenvalues are distinct, so that P is invertible),

P = \left[\begin{array}{cccc} \lambda_1^{m-1} & \lambda_2^{m-1} & \cdots & \lambda_m^{m-1} \\ \lambda_1^{m-2} & \lambda_2^{m-2} & \cdots & \lambda_m^{m-2} \\ \vdots & \vdots & \ddots & \vdots \\ 1 & 1 & \cdots & 1 \end{array}\right]

And the diagonal matrix D becomes
D = \left[ \begin{array}{ccc} \lambda_1 & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & \lambda_m \end{array} \right]
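Both matrices are mechanical to build given a list of eigenvalues. A small Python sketch (names mine; P is just a Vandermonde-style matrix in the eigenvalues):

```python
def vandermonde_P(lams):
    # Column j of P is the eigenvector (l_j^{m-1}, ..., l_j, 1)
    m = len(lams)
    return [[l ** (m - 1 - i) for l in lams] for i in range(m)]

def diag_D(lams):
    # Diagonal matrix with the eigenvalues on the diagonal
    return [[l if i == j else 0 for j, l in enumerate(lams)]
            for i in range(len(lams))]

print(vandermonde_P([2, 3]))  # [[2, 3], [1, 1]]
print(diag_D([2, 3]))         # [[2, 0], [0, 3]]
```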

Alright, we’re almost done. We just look back at the equation A^k \mathbf{x} = PD^k P^{-1} \mathbf{x}. Now, as before, this expression is redundant. We only need the bottom entry of the resulting matrix. But the bottom entry of PD^k P^{-1} \mathbf{x} is the same as the bottom row of PD^k times the result of P^{-1} \mathbf{x}. And to find the bottom row of PD^k we just multiply the bottom row of P by D^k, which evaluates to
\left[ \begin{array} {cccc} \lambda_1^{k} & \lambda_2^{k} & \cdots & \lambda_m^{k} \end{array} \right]

which I’ll call \mathbf{\Lambda}^{\mathbf{T}}. Finally, if we let \mathbf{S} = P^{-1}\mathbf{x}, then the k-th element of the series S can be found with
\mathbf{\Lambda}^{\mathbf{T}} \mathbf{S} = \mathbf{S} \cdot \mathbf{\Lambda}

And that’s it. Simply compute \mathbf{S} from the initial values and the matrix P, and you’ll have an easy expression for the k-th element of the series.
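To see the whole recipe end to end on a sequence other than Fibonacci, here is a Python sketch for the Pell numbers, s_n = 2 s_{n-1} + s_{n-2} starting 0, 1. All names are mine, and the 2×2 inverse for P^{-1} is written out by hand:

```python
import math

# Pell numbers: s_n = 2*s_{n-1} + s_{n-2}, with s_0 = 0, s_1 = 1.
# The characteristic equation l^2 = 2l + 1 has roots 1 +/- sqrt(2).
l1 = 1 + math.sqrt(2)
l2 = 1 - math.sqrt(2)

# P = [[l1, l2], [1, 1]] and x = (s_1, s_0) = (1, 0).
# S = P^{-1} x, using the explicit 2x2 inverse of P.
det = l1 - l2
S = [(1 * 1 - l2 * 0) / det,
     (-1 * 1 + l1 * 0) / det]

def pell_closed(k):
    # k-th element: Lambda^T S with Lambda = (l1^k, l2^k)
    return round(l1**k * S[0] + l2**k * S[1])

print([pell_closed(k) for k in range(8)])  # [0, 1, 2, 5, 12, 29, 70, 169]
```

The same float-precision caveat as before applies for large k, but for a calculator-sized check the formula works as advertised.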

One more thing

I’m sure some of you have been wondering, “What happens if A isn’t diagonalizable?” Well, then you’re pretty much screwed. Really, I’m not sure what to do when your matrix isn’t diagonalizable. If anyone thinks of anything, let me know.


About rioghasarig

Just a high school student with an interest in math, science, programming, philosophy, religion, that's pretty much it.