How HotBits Works


Tunnelling to Freedom

Face it, folks: Nature is a lazy Mother. If there's any way at all a physical system (subatomic particle, nucleus, atom, molecule, star, or galaxy) can reduce its energy without violating a law of physics, quantum mechanics tells us it will. What it doesn't tell us is when. Why is this, and how can we exploit this physical principle to generate random numbers?

Nuclear Decay the Beta Way

Consider the atoms of Cæsium-137 that make up the HotBits radiation source. Due to details of how the atomic nucleus is structured, which we thankfully don't need to get into here, it turns out that if one of the neutrons in the nucleus were to turn into a proton, the resulting Barium-137 nucleus would have a lower energy (it is more tightly bound). Now a neutron just can't turn into a proton willy-nilly: that would violate the law of charge conservation, since a neutron with a charge of 0 would be changing into a proton with a charge of +1. Physicists believe charge conservation is never violated: even a black hole bears the net charge of all the particles it has devoured. But there's a way around this—if the neutron changes into a proton and simultaneously spits out an electron, the charge before and after is the same; before we had a neutron with a charge of 0, afterward a proton with a charge of +1 and an electron with a charge of −1. +1 + −1 = 0: the books balance! In the world of atomic physics, this is called beta decay, and the electron that flies out of the nucleus is called a beta particle.

(A beta particle is an electron, pure and simple, and all electrons are absolutely identical. The reason an electron which happens to begin its career by being shot out of an atomic nucleus, as opposed to, say, being boiled out of the hot metal filament at the other end of your computer monitor, is called a “beta particle” is historical. It took a while for physicists to figure out that “beta rays” and electrons were one and the same thing, and by that time the name had stuck.)

Anyway, we can write the formula for the beta decay of Cæsium-137 as:

^{137}\mathrm{Cs}\ \stackrel{\scriptscriptstyle 30.17\mathrm{y}}{\longrightarrow}\ ^{137\mathrm{m}}\mathrm{Ba}+\beta^-+\overline{\nu_e}\ \stackrel{\scriptscriptstyle 156\mathrm{s}}{\longrightarrow}\ ^{137}\mathrm{Ba}+\gamma

The Cæsium-137 nucleus (the 137 means there is a total of 137 protons and neutrons in the atom) spontaneously turns into a metastable nucleus of the element Barium which still has a sum of 137 protons and neutrons, and a beta particle (electron) flies out, resulting in no net difference in charge. Shortly thereafter (the half-life is just 156 seconds), the excited Barium nucleus emits a gamma ray and becomes the stable ground state of Barium-137. “Gamma rays” turn out to be nothing other than photons—particles of light, just carrying a lot more energy than visible light. They're called “gamma rays” instead of “photons” for the same reason beta particles aren't just called electrons. Nuclear reactions release a lot of energy: photons of visible light have an energy between 1 and 10 electron volts. The electrons in your computer monitor or television have energies between 10,000 and 20,000 electron volts (the high voltage needed to impart this energy to them is why it's a poor idea to stick your hand inside a television set). By comparison, the photon that flies out of the decaying Barium-137m nucleus has 661,660 electron volts (662 keV) of energy, and the electron from Cæsium-137 has a maximum energy of 1,176,000 electron volts (1.176 MeV). (Beta decay of a neutron into a proton emits both an electron and an electron antineutrino, which together carry away the total energy of 1.176 MeV; the antineutrino passes through the detector (and, for that matter, the entire Earth) without any interaction, so the energy of the electron, which is detected, varies depending upon what fraction of the total energy it happens to carry.) Particles with this kind of energy behave differently from the kind we usually encounter, which is why it took a while for physicists to figure out they really were just very energetic photons and electrons. This also explains why “nuclear radiation” is more dangerous than daylight and why nuclear bombs make so big a bang compared to the same amount of dynamite.
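For a sense of scale (a back-of-the-envelope conversion of my own, using only the standard value of the electron volt, not a figure from the HotBits apparatus):

1\,\mathrm{eV} = 1.602\times10^{-19}\,\mathrm{J}
\qquad\Longrightarrow\qquad
662\,\mathrm{keV} \approx 1.06\times10^{-13}\,\mathrm{J}

so the single gamma ray from one decaying Barium-137m nucleus carries roughly 300,000 times the energy of a photon of visible light (taking visible light at about 2 electron volts).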

First Uncertainty Bank: Energy Loans for Needy Nuclei

Now you'd think that given the chance to reduce its energy by more than a million electron volts, a Cæsium-137 nucleus would be just itching to heave that electron out the door. But there's a catch. Even though the final result of emitting the electron reduces the energy of the nucleus, the process of emitting it requires more energy than the nucleus has lying around. Think of the poor Cæsium nucleus as being trapped on a ledge partway up a hillside, penned in by a small bump of an energy barrier, with a long slope down into the valley of Barium on the other side.

If it manages somehow to get the energy to make it over the little bump, it can slide all the way down to the bottom and turn into Barium, but otherwise it's stuck where it is. If quantum mechanics did not govern the universe, the Cæsium-137 nucleus would be stable. But, of course, without quantum mechanics atoms wouldn't be stable, so neither you nor I nor anything else made of atoms would exist, so despite all its complexity, fuzziness, uncertainty, and spooky action-at-a-distance, quantum mechanics is probably a Good Thing. However, I must note that quantum mechanics also permits Microsoft Windows to exist.

What our Cæsium atom stuck on its energy ledge needs is a loan of energy to escape. Once over the hill, it will gladly repay its debt with the ample energy it releases as it skids down the slope into the valley of Barium. We could loan the energy to the nucleus by hitting it with a gamma ray, but thanks to the uncertainty principle of Heisenberg, that isn't necessary! The nucleus can, in effect, borrow the energy from the vacuum, momentarily violating the law of conservation of energy, and then, from the energy released by the decay, repay the loan before the conservation cops arrive.

Heisenberg's uncertainty principle provides, described in very broad brushstrokes to avoid getting bogged down in detail, that the more precisely you measure any given physical quantity (the position of a particle, for example), the more uncertain a complementary quantity (momentum, in the case of position) becomes. The same uncertainty relation applies to time and energy. You can measure the energy of a system as precisely as you like, but there is a minimum time required to measure its energy to a given precision. Conversely, the energy of a system can be said to fluctuate to an increasing extent as you observe it over shorter and shorter intervals.
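In symbols (this is just the standard textbook form of the relations, nothing specific to HotBits):

\Delta x\,\Delta p \ \gtrsim\ \frac{\hbar}{2}
\qquad\qquad
\Delta E\,\Delta t \ \gtrsim\ \frac{\hbar}{2}

where ħ is Planck's constant divided by 2π. The second relation is the bookkeeping behind the “energy loan”: the larger the borrowed energy ΔE, the shorter the interval Δt for which the books can remain unbalanced.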

On the scale of atoms and subatomic particles, the results of this uncertainty have profound effects. For the uncertainty of energy over very short time intervals means there is a nonzero probability that, at a given instant, the Cæsium-137 nucleus will have enough energy to surmount the hill that is confining it. Once the nucleus is pushed over the edge, the energy released pays back the uncertainty principle's “energy loan” in a time less than would be required to measure the momentary non-conservation of energy. One can also view the confined Cæsium nucleus as “tunnelling through” the barrier confining it—in fact, this process is called “quantum tunnelling”.

But even though the energy loan which triggers a beta decay is not detectable, the decay that results most definitely is and, being impossible in the absence of the uncertainty principle, demonstrates its essential role in nuclear and atomic physics. Note that once the Cæsium nucleus beta decays into Barium-137, it finds itself at the bottom of a valley with steep walls on either side. There is no place to tunnel to—it is trapped, since the energy it would need to jump back up to the Cæsium-137 level could be “borrowed” only for an interval less than the time needed to transform a proton into a neutron. As a consequence, Barium-137 is a stable isotope—it is not radioactive. We could, however, give it the energy required to jump it back onto the Cæsium-137 ledge. If we bombard it with energetic electrons (beta particles) in a particle accelerator or nuclear reactor, occasionally an electron will strike a Barium nucleus with sufficient energy to convert a proton into a neutron—reversing the arrow in the beta decay equation, transforming it back into Cæsium-137 through the process of inverse beta decay. Once transformed, it is, of course, doomed to eventually tunnel its way back to the bottom of the valley.

Physicists: please excuse my glossing over details such as the weak interaction, W bosons, u and d quarks, cross sections, etc. etc. and the very sloppy description of the uncertainty principle. I'm afraid if I go into any more detail, I'll lose the entire audience before we get to the good stuff—half life and the no-hidden-variables nature of quantum theory.

Get a (Half-)Life

Barium is stable because the energy valley that contains it is so deep it can't borrow the energy needed to tunnel out for long enough to complete the process. The barrier confining Cæsium-137 is sufficiently high that a given nucleus has only a 50% chance of tunnelling through in a period of 30.17 years—eternity on the time scale of most nuclear events. This is called its half-life, since if you start out with a given large number of Cæsium-137 nuclei, every 30.17 years you'll find that, on the average, half of the number present at the start of the period have decayed into Barium. What happens if the barrier is higher or lower, as is the case for other nuclei prone to beta decay? Well, if the barrier is lower, it means less energy needs to be borrowed to surmount it, and as a result the energy can be borrowed for a longer multiple of the time needed to “do the deal”. As a result, the probability of the nucleus decaying in a given period of time is increased or, in other words, the half-life is decreased. The nucleus with a lower barrier will be more radioactive. Sodium-35 perches precariously on a ledge with a tiny barrier compared to the one that confines Cæsium-137. As a result, its half-life is only 1.5 milliseconds—one and a half thousandths of a second. On the other hand, Indium-115 has an energy barrier so high that you have to wait 4.4×10¹⁴ years (440 million million years) for half the nuclei in a sample to decay. It kind of takes your breath away to discover a mundane physical process which occurs at rates varying over 24 orders of magnitude—from about a thousand times a second to tens of thousands of times the age of the universe, but many things about quantum mechanics take your breath away once you invest the effort to appreciate (if not understand) them.
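The statistics behind these wildly different half-lives are all described by the same exponential decay law (a standard result, restated here for reference rather than anything specific to HotBits):

N(t) = N_0\,2^{-t/T_{1/2}} = N_0\,e^{-\lambda t}
\qquad\qquad
\lambda = \frac{\ln 2}{T_{1/2}}

For Cæsium-137, λ = 0.693/30.17 ≈ 0.023 per year, which is to say each surviving nucleus has about a 2.3% chance of decaying in any given year.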

What's interesting, and ultimately useful in our quest for random numbers, is that even though we're absolutely certain that if we start out with, say, 100 million atoms of Cæsium-137, 30.17 years later we'll have about 50 million, 30.17 years after that 25 million, and so on, there is no way even in principle to predict when a given atom of Cæsium-137 will decay into Barium. We can say that it has a fifty/fifty chance of doing so in the next 30.17 years, but that's all we can say. Ever since physicists realised how weird some of the implications of quantum mechanics were, appeals have been made to “hidden variables” to restore some of the sense of order on which classical physics was based. For example, suppose there's a little alarm clock inside the Cæsium-137 nucleus which, when it rings, causes the electron to shoot out. Even if we had no way to look at the dial of the clock, it's reassuring to believe it's there—it would mean that even though our measurements show the universe to be, at the most fundamental level, random, that's merely because we can't probe the ultimate innards of the clockwork to expose its hidden deterministic destiny.

But hidden variables aren't the way our universe works—it really is random, right down to its gnarly, subatomic roots. In 1964, the physicist John Bell proved a theorem which showed local hidden variable (little clock in the nucleus) theories inconsistent with the predictions of quantum mechanics. In 1982, Alain Aspect and his colleagues performed an experiment to test Bell's theoretical result and discovered, to nobody's surprise, that the predictions of quantum theory were correct: the randomness is inherent—not due to limitations in our ability to make measurements. So, given a Cæsium-137 nucleus, there is no way whatsoever to predict when it will decay. If we have a large number of them, we can be confident half will decay in 30.17 years; but if we have a single atom, pinned in a laser ion trap, all we can say is that there's even odds it will decay sometime in the next 30.17 years, but as to precisely when, we're fundamentally quantum clueless. The only way to know when a given Cæsium-137 nucleus decays is after the fact—by detecting the ejecta. A Cæsium-137 nucleus which has “beat the reaper” by surviving a century, during which time roughly nine out of ten of its litter-mates have taken the plunge and turned into Barium, has precisely the same chance of surviving another hundred years as a newly-minted Cæsium-137, fresh from the reactor core.
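This “no memory” property is easy to check numerically. Here is a small Python simulation sketch (it models idealised exponentially distributed decay times with a 30.17-year half-life and has nothing to do with the HotBits hardware itself): whether a nucleus is fresh from the reactor or has already survived a century, the odds of it decaying within the next half-life come out the same.

import math
import random

HALF_LIFE = 30.17                          # Cs-137 half-life, in years
LAMBDA = math.log(2) / HALF_LIFE           # decay constant, per year

# Draw a million idealised decay times from the exponential distribution.
times = [random.expovariate(LAMBDA) for _ in range(1_000_000)]

# Fraction of freshly minted nuclei decaying within one half-life.
fresh = sum(t < HALF_LIFE for t in times) / len(times)

# The same fraction, but only among nuclei that have already survived 100 years.
survivors = [t - 100.0 for t in times if t > 100.0]
aged = sum(t < HALF_LIFE for t in survivors) / len(survivors)

print(f"fresh nuclei decaying in the next half-life:       {fresh:.3f}")
print(f"century-old nuclei decaying in the next half-life: {aged:.3f}")
# Both fractions come out near 0.5: the exponential distribution has no memory.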

Bit from It

This inherent randomness in decay time has profound implications, which we will now exploit to generate random numbers—HotBits. For if there's no way to know when a given Cæsium-137 nucleus will decay then, given a collection of them, there's no way to know when the next one of them will shoot its electron bolt and settle down to a serene eternity as Barium. That's uncertainty, with its origins in the deepest and darkest corners of creation—precisely what we're looking for to make genuinely random numbers.

If we knew the precise half-life of the radioactive source driving our detector (and other details such as the solid angle to which our detector is sensitive, the energy range of decay products and the sensitivity of the detector to them, and so on), we could generate random bits by measuring whether the time between a pair of beta decays was more or less than the time expected based on the half-life. But that would require our knowing the average beta decay detection time, which depends on a large number of parameters which can only be determined experimentally. Instead, we can exploit the inherent uncertainty of decay time in a parameter-free fashion which requires less arm waving and fancy footwork.

The trick I use was dreamed up in a conversation in 1985 with John Nagle, who is doing some fascinating things these days with artificial animals. Since the time of any given decay is random, the interval between two consecutive decays is also random. What we do, then, is measure a pair of these intervals, and emit a zero or one bit based on the relative length of the two intervals. If we measure the same interval for the two decays, we discard the measurement and try again, to avoid the risk of inducing bias due to the resolution of our clock.

To create each random bit, we wait until the first count occurs, then measure the time, T1, until the next. We then wait for a second pair of pulses and measure the interval T2 between them, yielding a pair of durations. If they're the same, we throw away the measurement and try again. Otherwise if T1 is less than T2 we emit a zero bit; if T1 is greater than T2, a one bit. In practice, to avoid any residual bias resulting from non-random systematic errors in the apparatus or measuring process consistently favouring one state, the sense of the comparison between T1 and T2 is reversed for consecutive bits.
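Here is a minimal sketch of that comparison scheme in Python (a sketch of the logic only, not the actual HotBits driver). The function passed as next_interval is a hypothetical stand-in for the real apparatus: it is assumed to return the measured time between two consecutive detector pulses, in integer clock ticks. Everything else follows the description above: discard ties, compare T1 with T2, and reverse the sense of the comparison on alternate bits.

import random

def hotbits_bits(next_interval, n_bits):
    """Generate n_bits random bits by comparing successive decay-interval pairs."""
    bits = []
    flip = False                     # reverse the sense of the comparison each bit
    while len(bits) < n_bits:
        t1 = next_interval()         # interval between the first pair of decays
        t2 = next_interval()         # interval between the second pair of decays
        if t1 == t2:
            continue                 # identical intervals: discard and try again
        bit = 0 if t1 < t2 else 1
        if flip:
            bit ^= 1                 # alternate comparison sense to cancel bias
        flip = not flip
        bits.append(bit)
    return bits

# Purely for testing without a radiation source: simulate inter-decay intervals
# as exponentially distributed, quantised to integer clock ticks.
def simulated_interval():
    return int(random.expovariate(1.0) * 1_000_000)

print(hotbits_bits(simulated_interval, 16))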

For example, you might worry about the fact that the intensity of the radiation source is slowly decreasing over time. Cæsium-137's 30.17 year half-life isn't all that long. One half-life in the future, we'll measure T1 and T2 intervals, on the average, twice as long as today. This means, then, that even on consecutive measurements there is a small bias in favour of T2 being longer than T1. How serious is this? Well, expressed in seconds, the half-life is about 9.5×10⁸ and we receive count pulses at a rate of 1000 per second or so. So the time needed to perform the measurements to produce one random bit is on the order of 10⁻¹² half-lives, and T2 will then tend to be longer by a factor of the same magnitude. Since the inter-count interval is around a millisecond, this means T2 will be, on average, 10⁻¹⁵ seconds longer than T1. This is comparable to the long-term accuracy of the best atomic time standards and is entirely negligible for our purposes. The crystal oscillator which provides the time base for the computer making the measurement is only accurate to 100 parts per million, or one part in ten thousand, and thus can induce errors ten million times as large as those due to the slow decay of the source. (This is, again, unlikely to be a real problem because most computer clocks, while prone to drifting as temperature and supply voltage vary, do not change significantly on the millisecond scale. Even so, jitter due to where the clock generator happens to trigger on the oscillator waveform will still dwarf the effects of decay of the source during one measurement.)
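Spelling that arithmetic out (merely restating the figures in the paragraph above):

T_{1/2} \approx 30.17\,\mathrm{yr}\times 3.16\times10^{7}\,\mathrm{s/yr} \approx 9.5\times10^{8}\,\mathrm{s}
\qquad\qquad
\frac{\approx 4\times10^{-3}\,\mathrm{s\ per\ bit}}{9.5\times10^{8}\,\mathrm{s}} \approx 4\times10^{-12}

so during one measurement the mean interval drifts by a fractional amount of order 10⁻¹², and a millisecond-scale interval is therefore lengthened by roughly 10⁻³ s × 10⁻¹² = 10⁻¹⁵ s.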

The eminent physicist John Archibald Wheeler has speculated that, at the deepest level, the universe is made of information, and that all the complexity we see from the subatomic to the cosmic scale is an emergent property of this underlying simplicity, just as the simplest computer can, given enough time, faithfully simulate physical processes far more complicated than itself. Wheeler calls this “it from bit”—matter, energy, and the universe as a whole may be the consequences of the exchange and processing of information. This may or may not be true, but in any case HotBits brings the converse to your virtual desktop: information generated by a fundamental, inherently unpredictable, subatomic process delivered directly to you over the Web. Bit from it!

Serving 'em Hot and Fresh…

Finally, how does your request for HotBits get processed? You request HotBits by filling out and transmitting a request form, which is sent by your World-Wide Web browser in HTTP to our Web server, www.fourmilab.ch. Your request form is processed by a CGI program written in Perl which, after validating the request, forwards it in HTTP format to a dedicated HotBits server machine which is connected to the HotBits generation hardware via the COM1 port

Why the indirection? Timing the intervals between decay events without the kind of special-purpose hardware I used in my original 1986 design requires locking out interrupts and dedicating the CPU to measuring the time between counts, since otherwise other processes which use the CPU, even those as innocent as a screen saver, could introduce nonrandom periodicities in the bitstream. Dedicating a machine permits us to prevent interrupts and obtain maximum-resolution measurements of the inter-count delays without compromising response time for requests from the outside world.

To provide better response, each dedicated HotBits server machine maintains an inventory of sixty-seven million (8192 kilobytes) random bits, and services requests from this inventory whenever possible. The server rebuilds inventory in the background, between user requests for HotBits.
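As a toy illustration of this inventory scheme (this is not the actual HotBits server code; the generate callable and the class are hypothetical), a server can keep a pool of pre-generated bytes, serve requests from it, and rebuild it in the background between requests:

import collections
import threading

POOL_TARGET_BYTES = 8192 * 1024          # 8192 kilobytes, as described above

class BitInventory:
    def __init__(self, generate):
        self._generate = generate        # callable(n) -> n freshly generated bytes
        self._pool = collections.deque()
        self._lock = threading.Lock()

    def request(self, n_bytes):
        """Serve n_bytes, drawing from inventory and generating only the shortfall."""
        out = bytearray()
        with self._lock:
            while len(out) < n_bytes and self._pool:
                out.append(self._pool.popleft())
        if len(out) < n_bytes:
            out.extend(self._generate(n_bytes - len(out)))
        return bytes(out)

    def refill(self):
        """Run in the background, between requests, to rebuild the inventory."""
        with self._lock:
            deficit = POOL_TARGET_BYTES - len(self._pool)
        if deficit > 0:
            fresh = self._generate(deficit)     # slow hardware work, outside the lock
            with self._lock:
                self._pool.extend(fresh)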

Interposing the main www.fourmilab.ch servers also makes it possible to maintain a separate inventory of HotBits on that machine, refreshed by periodically drawing down the inventory on the HotBits server when it is full. A separate inventory on the Web server permits faster response to requests since there is no need to contact a dedicated HotBits machine as long as the request can be filled from local inventory. Further, it allows uninterrupted service even when all primary HotBits generator machines are down for maintenance. The main server inventory is maintained by a HotBits Proxy Server running on that machine which communicates in the same HTTP protocol as the dedicated HotBits machines. Source code for the HotBits Generator and Proxy Server is in the public domain and is available for downloading; it is intended to run on a Linux system and has been developed without concern for portability to other Unix environments.

HotBits Main Page

HotBits Hardware Description

HotBits Software Driver

Statistical Tests of HotBits Data


by John Walker