# Really, really, really big numbers, I: Bremermann’s Limit

Wait, what was the question?

Bremermann’s Limit is the absolute theoretical limit to the computational ability of any physical object in the universe.  Using Einstein’s famous

$E=mc^{2}$

and the Heisenberg Uncertainty Principle, Bremermann computed this theoretical limit of computation.  Of course, any actual computing device falls far, far below this limit.  Its main importance is to give some sort of concrete number when you are talking about what’s potentially computable.  So if you’re making a secure code, you want breaking it to require computation above Bremermann’s Limit; if you’re trying to use a computer to solve a problem, it had better be way, way below Bremermann’s Limit.
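As a rough back-of-the-envelope sketch (my own illustration, not Bremermann’s original derivation): combining $E=mc^{2}$ with the quantum-mechanical idea that marking one bit costs a minimum energy of roughly $h\nu$ gives a maximum rate of about $c^{2}/h$ bits per second per gram:

```python
# Back-of-the-envelope estimate of Bremermann's per-gram limit.
# Assumption (my illustration): max bit rate per gram ~ c^2 / h,
# i.e., a gram's entire rest energy spent marking bits at the
# quantum minimum energy per bit.

c = 2.998e10   # speed of light, cm/s
h = 6.626e-27  # Planck's constant, erg*s

bits_per_sec_per_gram = c**2 / h
print(f"{bits_per_sec_per_gram:.2e}")  # ~1.4e47 -- same order as the 2e47 figure quoted below
```

The crude estimate lands within a factor of two of the $2 \times 10^{47}$ bits-per-second-per-gram figure usually quoted, which is all a limit like this is meant to deliver.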

Built by Cray?

This is of particular importance if you have built the Earth to be a large supercomputer in order to determine The Ultimate Question to which 42 is the answer.

Bremermann’s Limit is described by David Foster Wallace in *Everything and More*:

> H. Bremermann proved in 1962 that “No data processing system, whether artificial or living, can process more than $2 \times 10^{47}$ bits per second per gram of its mass,” which means that a hypothetical supercomputer the size of earth (= about $6 \times 10^{27}$ grams) grinding away for as long as the earth has existed (= about $10^{10}$ years) can have processed at most $2.56 \times 10^{92}$ bits, which number is known as Bremermann’s Limit.
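The arithmetic in the quote can be multiplied out directly (the year-to-second conversion is my own assumption; the product lands in the same $10^{92}$ ballpark as Wallace’s figure):

```python
# Multiply out the quantities from the quote (a sanity check,
# not Wallace's own calculation).
rate_per_gram = 2e47            # bits per second per gram
earth_mass_g  = 6e27            # grams
earth_age_s   = 1e10 * 3.156e7  # 10^10 years * ~3.156e7 seconds/year

total_bits = rate_per_gram * earth_mass_g * earth_age_s
print(f"{total_bits:.2e}")      # ~3.8e92 -- same order of magnitude as 2.56e92
```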

Problems requiring the processing of more than $2.56 \times 10^{92}$ bits are called transcomputational problems, meaning they’re not even theoretically doable; and there are plenty of such problems in statistical physics, complexity theory, fractals, etc.

*Everything and More* by David Foster Wallace