What's Been Happening

Let's start by looking at the key trend that's driven the entire silicon revolution. It's really very simple: making things smaller, or in engineer-speak, device scaling.

In the early days of transistors, it was observed that when you made a device smaller, you got the best of everything. It ran faster, used less power, and ran cooler. And since you could pack many more of them on a single wafer, while the cost of processing a wafer stayed the same regardless of what was on it, the price per device went down.
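A first-order way to see why shrinking "got the best of everything" is the classic constant-field scaling argument: shrink every linear dimension by a factor s and you fit roughly s² more devices on the same wafer, so cost per device falls as 1/s², while each device switches about s times faster and dissipates about 1/s² the power. The sketch below is only an illustration of that arithmetic; the baseline figures (wafer cost, device count, delay, power) are made-up assumptions, not numbers from the text.

    # Idealized constant-field (Dennard-style) scaling: shrink every linear
    # dimension of a device by a factor s > 1 and look at the consequences.
    # Baseline numbers are illustrative assumptions, not measurements.

    def scale_device(s, wafer_cost=1000.0, base_devices=1_000_000,
                     base_delay_ns=10.0, base_power_uw=100.0):
        """Return first-order consequences of shrinking a device by s."""
        devices = base_devices * s**2           # area per device falls as 1/s^2
        cost_per_device = wafer_cost / devices  # wafer cost is fixed, so cost/device falls as 1/s^2
        delay = base_delay_ns / s               # shorter distances -> faster switching
        power = base_power_uw / s**2            # lower voltage and current -> less power per device
        return devices, cost_per_device, delay, power

    for s in (1, 2, 4):
        d, c, t, p = scale_device(s)
        print(f"s={s}: {d:,.0f} devices/wafer, ${c:.6f}/device, "
              f"{t:.2f} ns delay, {p:.1f} uW/device")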

After the first primitive integrated circuits were made in the late 1950s, the incentive to miniaturise became even more compelling. By reducing the size of devices further, more complex circuitry could be packed onto each chip. And remember, all chips cost roughly the same to make, regardless of what they do or how complicated they are.

Integrated circuits started out as devices with just two transistors on a die, then progressed to building blocks for larger systems. Finally, in the mid-1970s, practical microprocessors (entire computers on a single chip) appeared. These initial devices were crude, which led many observers to dismiss them as toys. Indeed, none of the major computer manufacturers of the time played a significant role in the development of what is now the universal way of making computers.

It was the chip makers who pioneered microprocessors and developed them to their current state. Why? Because they were familiar with the inexorable consequences of device scaling, and they knew how far, in time, it would carry the microprocessor.
