Quantum Noise and Information

H. J. Bremermann

University of California, Berkeley

1. Introduction

Existing computers are too slow, have too little storage, and lack the processing capacity to cope with certain tasks. The following are typical of such tasks: inspection of all branches of the "tree" of all possible move sequences of a game such as chess; optimization of (nonlinear) functions of many variables; and certain decision and cognition problems.

It is the contention of this paper that speed, memory, and processing capacity of any possible future computer equipment are limited by certain physical barriers: the light barrier, the quantum barrier, and the thermodynamical barrier. These limitations imply, for example, that no computer, however constructed, will ever be able to examine the entire tree of possible move sequences of the game of chess.

Some mathematicians (for example, the intuitionist school) object to certain kinds of "existence proofs" and favor "constructive proofs." Finite problems, including problems such as examination of the chess tree, are considered as trivial in this context. In view of the physical barriers to computation, however, many finite problems are transcomputational.

In order to have a computer play a perfect or nearly perfect game (chess, go, and so forth) it will be necessary either to analyze the game completely (as, for example, "Nim" has been analyzed; cf. Wang [23]) or to analyze the game in an approximate way and combine this with a limited amount of tree searching. Such an approach has been pioneered, for example, by Samuel [18] for checkers, Gelernter [8] for theorem proving, Slagle [21] for evaluating integrals, and Raphael [14] for question answering. A theoretical understanding of such heuristic programming, however, is still very much wanting.

Some further aspects of the physical limits of computation have been discussed in Bremermann [4]. A preliminary announcement of the results of this paper was made in Bremermann [3].

2. The light barrier

Signals travel no faster than the speed of light. In one nanosecond (10⁻⁹ sec) light travels a distance of about one foot. A random access memory that is to deliver information to a given point at nanosecond speeds must thus have a diameter no larger than about a foot. For a combined consequence of the light barrier and the (following) quantum barrier see Bledsoe [1] and Bremermann [3].
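As a quick numerical check of the one-foot figure (an illustrative sketch with standard constants, not part of the original text):

```python
# Distance light travels in one nanosecond (CGS units).
c = 2.998e10            # speed of light, cm/sec
distance_cm = c * 1e-9  # distance covered in 10^-9 sec
print(distance_cm)      # roughly 30 cm, about one foot
```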

3. The quantum barrier

In the following we assume Shannon's definition [19] of the capacity of a continuous channel. We define the capacity of a data processing system as the sum of the capacities of all its input, output, and internal channels (channels between processing, control, memory units, and so forth).

PROPOSITION. The capacity of any closed information transmission or processing system does not exceed mc²/h bits per second, where m is the mass of the system, c the light velocity, and h Planck's constant.

Note that the system is assumed to be closed. Its mass consists of the mass of the particles making up its structure plus the mass equivalent of the energy employed in signals, and so forth. No matter how the total mass is distributed, the mass equivalent of the signals is bounded by m and the signaling energy by mc².

According to quantum mechanics, electromagnetic oscillations are quantized, and so are all other signals. With every moving particle a wave is associated which is quantized such that the energy E of one quantum is E = hν, where ν is the frequency of the wave. In order that a signal with associated frequency ν can be observed, at least one quantum of the signal is required. This condition limits the frequency band that can be used for signaling to ν_max = mc²/h, since above ν_max the energy of one quantum would exceed mc².
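The bound of the proposition can be evaluated numerically. The following sketch uses standard CGS values for c and h; the function name is ours and is illustrative only:

```python
# Upper bound m c^2 / h on the bit rate of a closed system of mass m (CGS units).
h = 6.626e-27   # Planck's constant, erg sec
c = 2.998e10    # speed of light, cm/sec

def max_bits_per_second(mass_g):
    """Bit-rate bound of the proposition for a system of mass mass_g grams."""
    return mass_g * c**2 / h

print(max_bits_per_second(1.0))   # about 1.36e47 bits/sec for one gram
```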

The capacity C of a band limited channel is given by a formula due to Shannon [19]

(3.1)  C = ν_max log₂ (1 + S/N),

where S and N are the signal and noise power.

To compute the noise energy we assume that our signal is represented by a time varying complex valued function f(t). (If the physical signal is a vector, spinor, or other nonscalar quantity, then we choose a suitable representation and consider each scalar component as a separate channel.) Suppose f(t) is observed for a time interval ΔT. In this interval we represent f(t) by means of a Fourier series

(3.2)  f(t) = Σ_{n=-∞}^{+∞} a_n e^{inωt},

where ω = 2π/ΔT and

(3.3)  a_n = (1/ΔT) ∫₀^{ΔT} f(t) e^{-inωt} dt.

If the spectrum of f(t) is band limited, then a_n = 0 for |nω| > ω_max, which is equivalent to |n| > ν_max ΔT. According to the rules of quantum mechanics, phase cannot be measured, only amplitudes. Moreover, energy measurement is governed by the uncertainty principle: ΔE ≥ h/ΔT. The partial waves are independent. Measurement of f(t) amounts to a measurement of the energies of the partial waves, each of which contributes an uncertainty of ΔE. We interpret this as quantum noise with energy ΔE which perturbs the signal. There are ν_max ΔT partial waves of different frequency (not counting phase). Thus the total noise energy is ΔE · ν_max ΔT = mc². It is equal to the signal energy; hence the signal to noise power ratio is one, and by Shannon's formula the capacity is C = ν_max log₂ 2 = ν_max = mc²/h. The capacity is thus proportional to the energy allotted to a channel: if the total energy is split up between several channels, the total capacity is the same as when the total energy is allotted to a single channel. Hence our proposition follows. It depends, of course, upon the validity of quantum mechanics, which, like physical theories in general, is subject to modification if empirical evidence contradicting the theory should be found.
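The final step, that a signal-to-noise ratio of one reduces (3.1) to C = ν_max, can be checked directly (an illustrative sketch; the names are ours):

```python
import math

def capacity(nu_max, snr):
    """Shannon capacity nu_max * log2(1 + S/N) of a band limited channel."""
    return nu_max * math.log2(1 + snr)

nu_max = 1.35e47              # c^2/h per gram, cycles per second
print(capacity(nu_max, 1.0))  # equals nu_max, since log2(2) = 1
```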

4. Interpretations of the quantum barrier

The quantity h/c² is the mass equivalent of a quantum of an oscillation of one cycle per second. Our proposition can be stated as follows: information transmission is limited to frequencies such that the mass equivalent of a quantum of the employed frequency does not exceed the mass of the entire transmission or computing system. Put in a different way: each bit transmitted in one second requires a mass of at least the mass equivalent of a quantum of oscillation of one cycle per second.

The mass of a hydrogen atom is about 1.67 × 10⁻²⁴ gm, while c²/h = 1.35 × 10⁴⁷ gm⁻¹ sec⁻¹. Thus our proposition implies that per mass of a hydrogen atom no more than 2.3 × 10²³ bits can be transmitted per second. It appears that our limit is quite a generous one.

On the other hand, the number of protons in the universe is estimated as about 10⁷³. Thus, if the whole universe were dedicated to data processing, and not counting other factors that tend to restrict data processing, no more than 2.3 × 10⁹⁶ bits per second, or 7 × 10¹⁰³ bits per year, could be processed. Note that this figure falls short of the number of 10¹²⁰ possible move sequences of the game of chess (compare Bremermann [4]).
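The arithmetic behind this comparison can be reproduced directly (a sketch using the figures quoted in the text; the variable names are ours):

```python
# Reproducing the universe-versus-chess estimate from the figures in the text.
c2_over_h = 1.35e47     # bits per second per gram
m_hydrogen = 1.67e-24   # grams per hydrogen atom
n_protons = 1e73        # estimated protons in the universe
sec_per_year = 3.15e7   # seconds in a year

bits_per_atom_sec = c2_over_h * m_hydrogen             # ~2.3e23
bits_per_year = bits_per_atom_sec * n_protons * sec_per_year
print(bits_per_year)    # ~7e103, far below the ~1e120 chess move sequences
```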

Another way of looking at the quantum barrier is the following. The size of the nucleus of the hydrogen atom is about 10⁻¹² cm. Light travels one centimeter in about 3 × 10⁻¹¹ sec; thus it takes about 3 × 10⁻²³ sec to travel the distance of the size of a proton. Thus the quantum barrier is equivalent to processing about 7 bits per proton mass in the time it takes light to traverse the diameter of a proton.
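This restatement can be sketched as follows, using the per-atom figure given earlier in this section (variable names are ours):

```python
# About 7 bits per proton mass in one light-traversal of a proton diameter.
bits_per_proton_sec = 2.3e23   # bits/sec per hydrogen-atom mass, from the text
proton_diameter_cm = 1e-12     # nuclear size as taken in the text
c = 3e10                       # speed of light, cm/sec

traversal_time = proton_diameter_cm / c            # ~3.3e-23 sec
bits_per_traversal = bits_per_proton_sec * traversal_time
print(bits_per_traversal)      # roughly 7 to 8 bits
```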

The quantum barrier was announced in the form of a conjecture in Bremermann [3]. W. W. Bledsoe [1] derived from it an absolute bound for the speed of serial machines. The notion of quantum noise apparently was coined by Gabor [6], [7]. Quantum effects in communication channels have also been studied by Bolgiano and Gottschalk [2], Lasher [13], and Gordon [9], [10]. The latter gives a formula for quantum noise in a transmission line that operates at a fundamental frequency ν. For small noise power (from other sources) the quantum effect amounts to an equivalent noise power of hνB, where B is the number of modes (harmonics) that are excited. Kompfner [12] has pointed out that Gordon's results imply that quantum noise constitutes a problem in optical communications, for example, if a laser beam is used as carrier. Gordon's results imply that for a frequency of 6 × 10¹⁴ cycles per second (that is, one half micron wavelength, in the visible spectrum) quantum noise is about 100 times as large as thermal noise at room temperature (~300°K). Thermal noise, in general, will be discussed in the following sections. For a further bibliography on quantum noise effects, see Gordon [10]. Note that the quantum effects discussed in the literature are mostly concerned with special cases. In contrast, our quantum barrier is an upper bound on data transmission that could be substantially improved for almost any specific case, but which has the advantage of being universal.

5. The thermodynamical barrier

The quantum barrier is comparable to the first law of thermodynamics; it establishes a mass energy equivalent for the bit rate of a signal. It does not take into account entropy changes.

The second law of thermodynamics asserts that the state of an isolated system changes in such a way that the entropy increases. According to Boltzmann-Planck, the entropy change of a system is equal to k ln (P₁/P₂), where P₁ and P₂ are the probabilities of the initial and final states.

Brillouin [5] distinguishes between free and bound information. Information theory is an abstract theory; the symbols and their probabilities are abstract quantities. When they are represented by physical states or events the information becomes bound. We are concerned with bound information.

If a quantity I (bits) of information is encoded in terms of physical markers, the probability of the state of the system is decreased by a factor of 2^(-I) and thus the entropy is decreased by k ln 2^I = Ik ln 2. If the system is isolated, there must be a compensating increase in entropy to offset the decrease.

If the total system is composed of an information system in contact with a thermostat, and if there are no other entropy changes (for example, chemical), then the thermostat absorbs ΔQ = TΔS units of heat, that is, kT ln 2 units per bit of information (compare Brillouin [5], J. Rothstein [15], [16], [17], and Setlow-Pollard [20]). This calculation applies only to processes where quantum effects are negligible. Brillouin ([5], p. 185) has shown that in the special case where the physical marker is a harmonic oscillator of frequency ν the amount of heat generated is kT ln 2 in the thermal range, that is, for hν < kT, but hν when hν > kT. Thus when the rate of information processing is rapid, we may expect quantum effects that increase the amount of heat generated above kT ln 2 per bit. Here, obviously, are open problems. There also remains the problem whether, and under what conditions, the receiver of bound information can utilize the negentropy conveyed.
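The kT ln 2 rule, together with Brillouin's high-frequency correction for a harmonic-oscillator marker, can be sketched as follows (the function and its piecewise form are our paraphrase of the statement above, with standard CGS constants):

```python
import math

k = 1.38e-16    # Boltzmann's constant, erg per degree K
h = 6.626e-27   # Planck's constant, erg sec

def heat_per_bit(T, nu=None):
    """Heat per bit: kT ln 2 in the thermal range (h*nu < kT), h*nu above it."""
    if nu is not None and h * nu > k * T:
        return h * nu
    return k * T * math.log(2)

print(heat_per_bit(300.0))        # ~2.9e-14 erg per bit at room temperature
print(heat_per_bit(300.0, 6e14))  # optical frequencies: h*nu dominates
```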

In Bremermann [4] it was shown that microorganisms (E. coli) produce bound information as they grow, and do so about as efficiently as the kT ln 2 per bit rule will permit.

6. Efficiency of the brain

According to von Neumann [22] the human brain dissipates about 10 watts of heat. If we assume that there are 10¹⁰ neurons and that each neuron processes 100 bits per second (which would seem a generous estimate) at 310°K, we have 10¹² sec⁻¹ × kT ln 2 ≈ 3 × 10⁻² ergs/sec = 3 × 10⁻⁹ watts. Thus, if kT ln 2 ergs actually had to be dissipated for each bit processed, the brain would still be inefficient by a factor of about 3 × 10⁹. Thus data processing in the brain is thermally inefficient unless processing occurs also at the molecular level in neurons and glial cells, as has been suggested by Hydén [11].
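The estimate can be reproduced step by step (a sketch; all input values are those assumed in the text, and the variable names are ours):

```python
import math

k = 1.38e-16               # Boltzmann's constant, erg per degree K
neurons = 1e10
bits_per_neuron = 100.0    # bits per second per neuron
T = 310.0                  # brain temperature, degrees K
brain_power_watts = 10.0   # von Neumann's dissipation figure

min_power_erg = neurons * bits_per_neuron * k * T * math.log(2)  # erg/sec
min_power_watts = min_power_erg * 1e-7      # 1 watt = 1e7 erg/sec
inefficiency = brain_power_watts / min_power_watts
print(min_power_watts, inefficiency)        # ~3e-9 W; factor ~3e9
```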

NOTE ADDED IN PROOF. The author has become only recently aware of the work of D. S. Lebedev and L. B. Levitin which is concerned with closely related questions [24], [25], [26], [27].

This work was supported in part by the Office of Naval Research under contracts NONR 222(85) and NONR 3656(08).

REFERENCES

  1. W. W. Bledsoe, "A basic limitation of the speed of digital computers," IRE Trans. Electr. Comp., Vol. EC-10 (1961), p. 530.
  2. L. P. Bolgiano and W. M. Gottschalk, "Minimum power set by quantum effects," Proceed. IRE, Vol. 49 (1961), pp. 813-814.
  3. H. J. Bremermann, "Optimization through evolution and recombination," Self-Organizing Systems--1962 (edited by M. C. Yovits, G. T. Jacobi, and G. D. Goldstein), Washington, Spartan Books, 1962.
  4. --------, "Quantitative aspects of goal-seeking, self-organizing systems," to appear in Progress in Theoretical Biology, Vol. 1, New York, Academic Press, 1967.
  5. L. Brillouin, Science and Information Theory, New York, Academic Press, 1962 (2nd ed.).
  6. D. Gabor, "Communication theory and physics," Phil. Mag., Vol. 41 (1950), pp. 1161-1187.
  7. ---------, "Lectures on communication theory," Technical Report No. 238, Research Laboratory of Electronics, Massachusetts Institute of Technology, 1952.
  8. H. Gelernter, J. R. Hansen, and D. W. Loveland, "Empirical exploration of the geometry theorem machine," Proceedings of the 1960 Western Joint Computer Conference, New York, Association for Computing Machinery; IRE, 1960, pp. 143-147.
  9. J. P. Gordon, "Information capacity of a communications channel in the presence of quantum effects," Advances in Quantum Electronics (edited by J. R. Singer), New York, Columbia University Press, 1961, pp. 509-519.
  10. ------------, "Quantum effects in communication systems," Proceed. IRE, Vol. 50 (1962), pp. 1898-1908.
  11. H. Hydén, "Satellite cells in the nervous system," Sci. Amer., Vol. 213 (1965), pp. 98-106.
  12. R. Kompfner, "Optical communications," Science, Vol. 150 (1965), pp. 149-155.
  13. G. J. Lasher, "A quantum statistical treatment of the channel capacity problem of information theory," Advances in Quantum Electronics (edited by J. R. Singer), New York, Columbia University Press, 1961, pp. 520-536.
  14. B. Raphael, "SIR: a computer program for semantic information retrieval," Ph.D. thesis, Mathematics Department, Massachusetts Institute of Technology, 1964.
  15. J. Rothstein, "Information, measurement and quantum mechanics," Science, Vol. 114 (1951), pp. 171-175.
  16. ----------, "Information and thermodynamics," Phys. Rev., Vol. 85 (1952), p. 135. (Correspondence.)
  17. ----------, "On fundamental limitations of chemical and bionic information storage systems," IEEE Trans. Mil. Electron., Vol. MIL-7 (1963), pp. 205-208.
  18. A. L. Samuel, "Some studies in machine learning using the game of checkers," Computers and Thought (edited by E. A. Feigenbaum and Julian Feldman), New York, McGraw Hill, 1963, pp. 71-106.
  19. C. E. Shannon, "A mathematical theory of communication," Bell Syst. Tech. J., Vol. 27 (1948), pp. 379-423; pp. 623-656.
  20. R. B. Setlow and E. C. Pollard, Molecular Biophysics, Reading, Addison-Wesley, 1962.
  21. J. Slagle, "A heuristic program that solves symbolic integration problems in freshman calculus," J. Assoc. Comput. Mach., Vol. 10 (1963), pp. 507-520.
  22. J. von Neumann, The Computer and the Brain, New Haven, Yale University Press, 1958.
  23. H. Wang, "Games, logic and computers," Sci. Amer., Vol. 213 (1965), pp. 98-106.
  24. D. S. Lebedev and L. B. Levitin, "Information transmission by electromagnetic field," Information and Control, Vol. 9 (1966), pp. 1-22.
  25. L. B. Levitin, "Ideal physical transmission channel," Information Transmission Problems, Vol. 1 (1965), pp. 122-124.
  26. ---------, "Transmission of information by an ideal photon channel," Information Transmission Problems, Vol. 1 (1965), pp. 71-80.
  27. ---------, "Photon channels with small capacities," Information Transmission Problems, Vol. 2 (1966), pp. 60-68.
