Hendrik Schön of Bell Labs made a transistor from a single molecule, Ike Chuang of MIT relies on quantum logic, and John Reif of Duke is into DNA.

Courtesy MIT & IBM research; Lucent Technologies/Bell Labs; Duke University photography


The Goal: Computers millions of times faster. The research into single-molecule transistors, DNA strands, and quantum effects provides tantalizing clues.

by Daniel Tynan


What uses might nanoscale computing be put to? We probably won't need massively parallel processing for our PDAs or cellphones anytime soon, but the limits of classical computing are already being felt in, for example, the field of data encryption. Encryption is not only a national security prerequisite but also a critical foundation of Internet commerce and data exchange. The ability of future computers to perform simultaneous calculations on a massive scale could help break, or protect, seemingly unbreakable codes.
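The strain that code-breaking puts on classical machines comes down to simple arithmetic: a brute-force search of an n-bit key space doubles in cost with every added bit. Here is a hypothetical back-of-the-envelope sketch, assuming a classical machine that tests one billion keys per second (the rate and function names are illustrative, not from the article):

```python
TRIALS_PER_SECOND = 1e9  # assumed rate for a fast classical search machine

def years_to_search(key_bits: int) -> float:
    """Worst-case years to exhaust a key space of the given size."""
    trials = 2 ** key_bits          # every extra bit doubles the work
    seconds = trials / TRIALS_PER_SECOND
    return seconds / (60 * 60 * 24 * 365)

for bits in (40, 56, 128):
    print(f"{bits}-bit key: about {years_to_search(bits):.3g} years")
```

A 40-bit key falls in minutes at this rate, while a 128-bit key takes longer than the age of the universe; that exponential wall is why massively parallel nanoscale machines are so tantalizing to code-breakers and code-makers alike.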

In biochemical research, non-chip-based computers could potentially process massive amounts of data simultaneously, seeking critical genetic patterns that might lead to new drugs. Nanocomputers may also hold promise for managing vast databases, solving complex problems such as long-range weather forecasting, and, because they can theoretically be integrated into nanomachines, monitoring or even repairing our bodies at the cellular level. All this remains highly speculative, of course, because nanocomputing research is at the pressure cooker stage.

The Size Barrier
Computers have shrunk so much, and become so madly fast, that one might wonder why they couldn't just keep shrinking. The first general-purpose computer, the Electronic Numerical Integrator and Computer (ENIAC), took up an entire room at the University of Pennsylvania (see "More Brawn Than Brains"). It weighed 30 tons and employed more than 17,000 vacuum tubes. Legend has it that when scientists turned it on, lights dimmed across Philadelphia. The ENIAC ran at a now-paltry 20,000 cycles per second, about the computing power found in an electronic greeting card that plays a silly song when opened.

The ENIAC and its descendants are basically collections of on/off switches that precisely modify information. A bit, the most basic unit of information, is expressed within a circuit by a voltage: high voltage means the bit has a value of 1; low voltage means it equals 0. These bits flow through simple logic gates constructed from switches, which together perform the tasks the computer is directed to do. The ENIAC's vacuum tube switches began their slide into obsolescence in 1947 with the invention of the transistor, the solid-state device that remains the fundamental component of integrated circuits and microprocessors. With each new generation, switches have grown smaller, enabling engineers to fit more of them into the same space, but their essential function hasn't varied.
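The idea that plain on/off switches add up to computation can be sketched in a few lines. The Python below is an illustrative toy, not any real chip design: it wires a one-bit adder entirely out of NAND gates, the classic "universal" switch circuit from which any logic gate can be built.

```python
def nand(a: int, b: int) -> int:
    """One switch-level gate: output is 0 only when both inputs are 1."""
    return 0 if (a and b) else 1

def xor(a: int, b: int) -> int:
    """Exclusive-or composed purely from NAND gates."""
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

def and_(a: int, b: int) -> int:
    """AND is just NAND followed by NAND-as-inverter."""
    n = nand(a, b)
    return nand(n, n)

def half_adder(a: int, b: int) -> tuple:
    """Add two bits: returns (sum bit, carry bit)."""
    return xor(a, b), and_(a, b)

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> carry {c}, sum {s}")
```

Whether the switch is a vacuum tube, a silicon transistor, or a single molecule, the logic layered on top of it is the same; only the size and speed of the switch change.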

As circuits shrink, electrons have shorter distances to travel, so chips can move more binary code and handle more tasks. Today's Pentium 4 processor is the size of a dime, and switches its 55 million transistors at 2 gigahertz, or 2 billion cycles per second. In 10 years the average silicon chip will likely contain a billion or more transistors and run at speeds exceeding 25 billion cycles per second. Already, exotic high-performance chips, such as a silicon germanium design recently announced by IBM, can exceed speeds of 100 gigahertz.
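The 10-year, billion-transistor projection is consistent with a Moore's-law-style doubling. A rough sketch, under the hypothetical assumption that transistor counts double about every two years (the function and figures are illustrative):

```python
def project(count: float, years: float, doubling_years: float = 2.0) -> float:
    """Project a transistor count forward, doubling every doubling_years."""
    return count * 2 ** (years / doubling_years)

# Starting from the Pentium 4's 55 million transistors, 10 years out:
print(f"{project(55e6, 10) / 1e9:.1f} billion transistors")  # prints "1.8 billion transistors"
```

Five doublings of 55 million transistors lands just shy of 2 billion, which is where the "billion or more" figure comes from.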

