I was reading ZFS’s specs, and it struck me how Sun deliberately over-engineered the filesystem. To make sure that ZFS would be able to deal with just about any storage device (or pool), they made ZFS a 128-bit filesystem. Now, 2¹²⁸ blocks (2¹³⁷ bytes) is a truly enormous quantity of data. In fact, it is so enormous that Jeff Bonwick explains how populating such a capacious device would require more energy than is needed to boil all of Earth’s oceans, give or take a few cubic miles.
The quote, which I find thoroughly amusing, is reproduced here:
Although we’d all like Moore’s Law to continue forever, quantum mechanics imposes some fundamental limits on the computation rate and information capacity of any physical device. In particular, it has been shown that 1 kilogram of matter confined to 1 litre of space can perform at most 10⁵¹ operations per second on at most 10³¹ bits of information. A fully populated 128-bit storage pool would contain 2¹²⁸ blocks = 2¹³⁷ bytes = 2¹⁴⁰ bits; therefore the minimum mass required to hold the bits would be (2¹⁴⁰ bits) / (10³¹ bits/kg) = 136 billion kg. […] To operate at the 10³¹ bits/kg limit, however, the entire mass of the computer must be in the form of pure energy. By E=mc², the rest energy of 136 billion kg is 1.2×10²⁸ J. The mass of the oceans is about 1.4×10²¹ kg. It takes about 4,000 J to raise the temperature of 1 kg of water by 1 degree Celsius, and thus about 400,000 J to heat 1 kg of water from freezing to boiling. The latent heat of vaporization adds another 2 million J/kg. Thus the energy required to boil the oceans is about 2.4×10⁶ J/kg × 1.4×10²¹ kg = 3.4×10²⁷ J. Thus, fully populating a 128-bit storage pool would, literally, require more energy than boiling the oceans. [ref]
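Bonwick’s arithmetic is easy to redo. Here is a quick sketch in Python that retraces the calculation from the numbers in the quote (the mass comes out closer to 1.4×10¹¹ kg than the quoted 136 billion; the difference is just rounding):

```python
# Redo Bonwick's back-of-the-envelope calculation.
C = 3.0e8            # speed of light, m/s
BITS_PER_KG = 1e31   # Lloyd's bound: max bits storable in 1 kg

pool_bits = 2 ** 140                 # 2^128 blocks = 2^137 bytes = 2^140 bits
min_mass = pool_bits / BITS_PER_KG   # minimum mass to hold the bits, kg
rest_energy = min_mass * C ** 2      # E = mc^2, joules

# Energy to boil the oceans: heat from freezing to boiling, plus vaporization.
OCEAN_MASS = 1.4e21          # kg
ENERGY_PER_KG = 4e5 + 2e6    # J/kg: heating (~400,000 J) + latent heat (~2,000,000 J)
boil_energy = ENERGY_PER_KG * OCEAN_MASS

print(f"mass needed:    {min_mass:.2e} kg")    # ~1.4e11 kg
print(f"rest energy:    {rest_energy:.2e} J")  # ~1.3e28 J
print(f"boiling oceans: {boil_energy:.2e} J")  # ~3.4e27 J
print(rest_energy > boil_energy)               # True
```

The conclusion survives the rounding comfortably: the rest energy of the required mass exceeds the ocean-boiling energy by a factor of about four.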
Even though this quote reeks of hubris—although I’m not expecting vengeful bolts of lightning striking Bonwick any time soon—it got me thinking about the fundamental limits of computation, and how he could derive such a result.
After a bit of research, I found that Seth Lloyd answered the question in his Nature paper (Ultimate Physical Limits to Computation, Nature, vol 406, p. 1047–1054, doi:10.1038/35023282), where he discusses the fundamental limits of computation in terms of quantum physics. He discusses the ultimate laptop, the conveniently sized 1 L computer that computes at the limit of what is physically possible. However, to perform its 10⁵⁰ operations per second on at most 10³¹ bits, the ultimate computer runs hot. Not Intel P4 hot. Big Bang! hot. 10⁹ kelvin. From the calculations, we see that we can exchange speed (and heat) for more storage, so the ultimate computer is amenable to trade-offs, depending on whether you need more storage or more speed.
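The 10⁵⁰ figure follows from the Margolus–Levitin theorem, which bounds the rate of elementary operations at 2E/πħ for a system with average energy E. For the ultimate laptop, the whole kilogram’s rest energy is available, so a short sketch (constants rounded, and the result is an order-of-magnitude bound rather than an engineering spec) recovers Lloyd’s number:

```python
import math

HBAR = 1.0546e-34  # reduced Planck constant, J*s
C = 3.0e8          # speed of light, m/s

mass = 1.0               # kg: the "ultimate laptop"
energy = mass * C ** 2   # total rest energy, E = mc^2

# Margolus-Levitin bound: at most 2E / (pi * hbar) operations per second.
ops_per_second = 2 * energy / (math.pi * HBAR)
print(f"{ops_per_second:.2e} ops/s")  # ~5.4e50, i.e. Lloyd's ~1e50 order of magnitude
```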
But the ultimate computer is weird, quantum, and possibly amorphous; it is not a conventional computer at all. Although Lloyd describes it as a binary computer, he doesn’t quite describe how bits can be found, operated upon, and stored back, but one may conclude that this is included in the 10⁵⁰ operations. Indeed, one can understand reads and writes as logical operations, and this means that somehow, bits can be stored and retrieved deterministically.
But any practical computer must run much cooler than the 10⁹ K of Lloyd’s computer. Current computers run at about 3×10² K and perform about 10¹⁰ operations per second on much less than 10¹⁰ bits, so they are very far from the theoretical limits of computation, giving much hope for Moore’s law to run for a while yet before we run into real problems. Well, it is expected to run for another decade at least before we need to venture into the realm of quantum-scale computers.
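To put “a while” in perspective, a toy estimate: going from roughly 10¹⁰ operations per second to the ultimate laptop’s ~10⁵⁰ takes about 133 doublings, and at one doubling every 18 months (a conventional Moore’s-law assumption, not something from Lloyd’s paper) that is about two centuries:

```python
import math

current_ops = 1e10    # rough ops/s of a current machine
ultimate_ops = 1e50   # Lloyd's bound for the 1 kg / 1 L ultimate laptop

doublings = math.log2(ultimate_ops / current_ops)
years = doublings * 1.5   # assume one doubling every 18 months

print(f"{doublings:.0f} doublings, about {years:.0f} years")  # ~133 doublings, ~199 years
```

So even if the decade-scale prediction for conventional silicon proves right, the physical ceiling itself is nowhere near.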