Wait, what was the question?

Bremermann’s Limit is the absolute theoretical limit on the computational ability of any physical object in the universe. Using Einstein’s famous mass-energy equivalence, E = mc²,

and the Heisenberg Uncertainty Principle, Bremermann computed this theoretical limit of computation. Of course, any actual computing device is far, far below this limit. Its main importance is to give some sort of concrete number when you are talking about what’s potentially computable. So if you’re making a secure code, you want breaking it to require computation above Bremermann’s Limit; if you’re trying to use a computer to solve a problem, it had better be way, way below Bremermann’s Limit.
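As a rough sketch of where the number comes from (my own back-of-the-envelope check, not from the post): combining mass-energy equivalence with the uncertainty principle gives a maximum processing rate of about mc²/h bits per second for a mass m, where h is Planck’s constant. For one gram that works out to roughly 1.36 × 10^47 bits per second:

```python
# Back-of-the-envelope estimate of Bremermann's limit: m * c^2 / h
# bits per second, evaluated for a mass of one gram.

c = 2.998e8    # speed of light, m/s
h = 6.626e-34  # Planck's constant, J*s

bits_per_sec_per_gram = (1e-3 * c**2) / h  # one gram = 1e-3 kg
print(f"{bits_per_sec_per_gram:.2e} bits/s/g")  # about 1.36e47
```

That 1.36 × 10^47 figure is what gets rounded up to the "2 × 10^47" in the quote below.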

Built by Cray?

This is of particular importance if you have built the Earth to be a large supercomputer in order to determine the Ultimate Question to which 42 is the answer.

Bremermann’s Limit is described by David Foster Wallace in *Everything and More*:

H. Bremermann proved in 1962 that “No data processing system, whether artificial or living, can process more than 2 x 10^{47} bits per second per gram of its mass,” which means that a hypothetical supercomputer the size of earth (= about 6 x 10^{27} grams) grinding away for as long as the earth has existed (= about 10^{10} years) can have processed at most 2.56 x 10^{92} bits, which number is known as Bremermann’s Limit.

Calculations involving numbers larger than 2.56 x 10^{92} are called transcomputational problems, meaning they’re not even theoretically doable; and there are plenty of such problems in statistical physics, complexity theory, fractals, etc.

– *Everything and More* by David Foster Wallace


Can any of you guys take a look at this, and see if it’s correct, or even close to being correct?

http://frothygirlz.com/2010/01/14/big-numbers-part-2/