Square roots (Part VI)

I’ve discussed algorithms for computing square roots a couple of times already, and then some. While sorting notes, I came across something interesting: Archytas’ method for computing square roots.

Archytas’ method is basically the old Babylonian method, where you first set

\displaystyle a=n,\quad b=1,

and iterate

\displaystyle a'=\frac{a+b}{2},

\displaystyle b'=\frac{n}{a'}=\frac{2n}{a+b},

until desired precision is achieved (or the scribe is exhausted).

This procedure has a nice geometric interpretation. You start with a 1\times{n} rectangle (which obviously has area n). Then you draw (or merely imagine) another rectangle whose first side’s length is the average of the sides of the previous rectangle: \frac{1}{2}(n+1). To keep the area constant at n, the other side must be \frac{n}{\frac{1}{2}(n+1)}=\frac{2n}{n+1}. That’s still not a square, but the sides are becoming more nearly equal. If we repeat this a large number of times, both sides converge to the same length, \sqrt{n}.

Let’s see how this goes. Again, we start with

\displaystyle a_0=n,\quad b_0=1.
Then, if we iterate, we get

\displaystyle a_1=\frac{n+1}{2},

\displaystyle a_2=\frac{n^2+6n+1}{4n+4},

\displaystyle a_3=\frac{n^4+28n^3+70n^2+28n+1}{8n^3+56n^2+56n+8},


\displaystyle b_1=\frac{2n}{n+1},

\displaystyle b_2=\frac{4n^2+4n}{n^2+6n+1},

\displaystyle b_3=\frac{8n^4+56n^3+56n^2+8n}{n^4+28n^3+70n^2+28n+1},
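As a sanity check (not from the original post), the closed form for a_3 can be verified against the bare iteration using exact rational arithmetic; n=7 here is an arbitrary test value:

```python
from fractions import Fraction

def closed_form_a3(n):
    # a_3 as a rational function of n, copied from the formulas above
    return Fraction(n**4 + 28*n**3 + 70*n**2 + 28*n + 1,
                    8*n**3 + 56*n**2 + 56*n + 8)

def iterate(n, steps):
    # the plain iteration, starting from a_0 = n, b_0 = 1
    a, b = Fraction(n), Fraction(1)
    for _ in range(steps):
        a = (a + b) / 2
        b = Fraction(n) / a
    return a

print(iterate(7, 3) == closed_form_a3(7))  # True
```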


So let’s see how this method behaves in practice. First, we notice we don’t actually need the explicit closed forms of a_i and b_i (with their cumbersome rational functions); we merely iterate with a_i=\frac{1}{2}(a_{i-1}+b_{i-1}) and b_i=\frac{n}{a_i}. Let’s say we limit the number of iterations to 2:
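A minimal sketch of that experiment (the original listing isn’t shown here; I assume the starting values a_0=n, b_0=1, and `archytas` is a hypothetical name):

```python
import math

def archytas(n, iters):
    # Babylonian/Archytas iteration starting from a 1-by-n rectangle
    a, b = n, 1
    for _ in range(iters):
        a = (a + b) / 2   # average the two sides
        b = n / a         # keep the area equal to n
    return a, b

# Two iterations only: compare a, b, and their average against sqrt(n).
for n in (2, 10, 100, 1000):
    a, b = archytas(n, 2)
    print(n, a, b, (a + b) / 2, math.sqrt(n))
```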

With only two iterations, the as and the bs still disagree noticeably, but their average is not too bad. Clearly, two iterations are insufficient. What if we used an adaptive number of iterations to land in the right order of magnitude?

So having a number of iterations proportional to \log_2 n seems to be working. But is it that precise? Too precise?
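One way to sketch this (an assumption on my part, since the original counts aren’t shown: \lceil\log_2 n\rceil iterations):

```python
import math

def sqrt_adaptive(n):
    # Archytas/Babylonian iteration with an iteration count that
    # scales with log2(n) -- a hypothetical choice for illustration
    a, b = n, 1
    for _ in range(max(1, math.ceil(math.log2(n)))):
        a = (a + b) / 2
        b = n / a
    return (a + b) / 2

for n in (2, 100, 10_000, 1_000_000):
    print(n, sqrt_adaptive(n), math.sqrt(n))
```

For large n, this lands essentially at machine precision, which is the “too precise” behavior the experiment suggests.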

So maybe \log_2 n is a bit too much.

* *

This method is essentially equivalent to Newton’s method, and therefore converges at the same speed. Indeed, Newton’s method iterates

\displaystyle x_i=\frac{1}{2}\left(x_{i-1}+\frac{n}{x_{i-1}}\right),

and if we look at what happens with the a_i,

\displaystyle a_1=\frac{n+1}{2}=\frac{1}{2}\left(a_0+\frac{n}{a_0}\right),

\displaystyle a_2=\frac{n^2+6n+1}{4n+4}=\frac{1}{2}\left(\frac{n+1}{2}+\frac{n}{\frac{n+1}{2}}\right),
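Since b_0=\frac{n}{a_0} already holds for a_0=n, b_0=1, the a_i coincide exactly with Newton’s iterates started at x_0=n. A quick check with exact rationals (sketch, not from the original post):

```python
from fractions import Fraction

def archytas_steps(n, k):
    # the a_i from the Archytas/Babylonian iteration, a_0 = n, b_0 = 1
    a, b = Fraction(n), Fraction(1)
    seq = []
    for _ in range(k):
        a = (a + b) / 2
        b = Fraction(n) / a
        seq.append(a)
    return seq

def newton_steps(n, k, x0):
    # Newton's iteration for x^2 - n = 0
    x = Fraction(x0)
    seq = []
    for _ in range(k):
        x = (x + Fraction(n) / x) / 2
        seq.append(x)
    return seq

# With x_0 = a_0 = n, the two sequences are identical, term by term.
print(archytas_steps(7, 4) == newton_steps(7, 4, 7))  # True
```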


* *

The a_i reveal rational functions that could be used to kick-start Newton’s method with much better estimates than the usual \frac{1}{2}n.
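For instance, using the closed form for a_2 as the starting guess (a sketch of the idea; names and the choice of four iterations are mine):

```python
import math

def a2_estimate(n):
    # a_2 as a rational function of n, from the closed forms above
    return (n * n + 6 * n + 1) / (4 * n + 4)

def newton_sqrt(n, x0, iters):
    # plain Newton iteration for sqrt(n), from a given starting guess
    x = x0
    for _ in range(iters):
        x = 0.5 * (x + n / x)
    return x

n = 1000.0
naive = newton_sqrt(n, 0.5 * n, 4)          # usual n/2 start
primed = newton_sqrt(n, a2_estimate(n), 4)  # a_2 start
print(abs(naive - math.sqrt(n)), abs(primed - math.sqrt(n)))
```

The a_2 start is simply two iterations ahead of \frac{1}{2}n, so it reaches a given precision with correspondingly fewer Newton steps.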
