
After Moore's Law expires
It's getting impossible to create silicon chips that can hold more circuits, so what's next?
Jun. 5, 2005
Toronto Star

We all marvel at the ever-increasing capacity of the silicon computer chip and the corresponding decrease in the size of the devices dependent upon it. The new Mac Mini, for instance, is smaller than a hardcover book yet faster and more powerful than most of the beige boxes sitting under desks across Canada.

Now imagine a computer chip so small you can't even see it with the naked eye. Imagine how powerful and quick computers would be if their chips worked at the molecular level.

Researchers in nanotechnology — the science of the very small — are already building parts for such a system. Earlier this week, Canadians with the National Institute for Nanotechnology and the University of Alberta announced they'd made a tiny switch using a styrene molecule. Such a switch could hypothetically work at least 100 times faster than a traditional transistor.

Other scientists hope to make computers faster and smaller through quantum computing, a branch of science that harnesses the perplexing power of the atom in entirely new ways.

"It could change the way we live our lives," says Raymond LaFlamme, director of the University of Waterloo's Institute for Quantum Computing.

Researchers are working on the radical new systems for one simple reason: they are finally running out of ways to increase the capacity of traditional silicon computer chips.

Forty years ago, Gordon Moore made some startling predictions about the future of electronics. Integrated circuits were still in their infancy, but he believed they'd be able to pack a lot more onto a chip in no time.

"With unit cost falling as the number of components per circuit rises, by 1975 economics may dictate squeezing as many as 65,000 components on a single silicon chip," wrote Moore, then director of Fairchild Semiconductor's research and development labs, in his famous 1965 paper.

That figure was a big jump from 50 components per chip, which was considered pretty darn good at the time. But Moore, who went on to co-found Intel Corp., understood that he was witnessing the beginning of a boom.

As costs came down and manufacturing processes improved, he reckoned the number of components that could economically fit on a single chip would grow "at a rate of roughly a factor of two per year."
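Moore's projection is easy to check with back-of-the-envelope arithmetic: starting from the article's figure of roughly 50 components in 1965 and doubling every year gives about 51,000 by 1975, in the same ballpark as his 65,000 prediction. A minimal sketch:

```python
# Moore's 1965 projection: component count doubles roughly every year.
START_YEAR, START_COMPONENTS = 1965, 50  # figures from the article

def components(year):
    """Components per chip, assuming one doubling per year from 1965."""
    return START_COMPONENTS * 2 ** (year - START_YEAR)

for year in (1965, 1970, 1975):
    print(year, components(year))
# 1975 comes out to 51,200 -- close to Moore's 65,000 estimate.
```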

For decades, the semiconductor business built itself around the notion of what came to be known as Moore's Law. It has been a guiding light and industry driver. Unfortunately, it can't go on forever.

"Moore's Law is soon going to collide with the laws of physics," says Duncan Stewart, an engineering physicist for Hewlett-Packard Development Corp.

Heat, in particular, is a big problem with today's technology. On any computer circuit board, components are going to get warm. Engineers have to design computers with special parts that draw the heat away from the board. But as more and more tiny components get crammed onto the board, driving away the heat becomes increasingly difficult.

Dan Henes, Celestica's general manager of global engineering services, likens the situation to a kitchen with a toaster.

"Try to picture one toaster on your countertop going," says Henes.

No problem there. One toaster isn't going to generate enough heat to significantly increase the temperature in the kitchen.

"Now add 10,000 toasters. The heat just becomes impossible to dissipate."

Like a kitchen full of toasters, each computer chip is made up of millions of transistors that give off heat as they rapidly switch on and off. As the transistors get smaller, the speed of the switch (and hence the computer) increases. But the temperature also rises.

Henes says engineers can compensate by lowering the voltage. But that can lead to other problems: At a certain point, it's difficult to tell the difference between the on and off states of the transistors, and the processor can get a little unpredictable.
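Henes's voltage trick follows from the standard dynamic-power relation for switching logic, P = C × V² × f, which the article does not spell out: because power grows with the square of the supply voltage, even a modest voltage cut buys a large power saving. A sketch with illustrative (not real-chip) numbers:

```python
# Dynamic switching power of digital logic: P = C * V^2 * f
# (switched capacitance, supply voltage squared, clock frequency).
def dynamic_power(capacitance_f, voltage_v, frequency_hz):
    return capacitance_f * voltage_v ** 2 * frequency_hz

# Halving the supply voltage at the same clock cuts switching power 4x.
p_full = dynamic_power(1e-9, 1.2, 3e9)  # illustrative values
p_low = dynamic_power(1e-9, 0.6, 3e9)
print(p_low / p_full)  # 0.25
```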

"As we make silicon transistors smaller and smaller, they are going to fail," Stewart says. "They just won't work properly anymore."

The distance between transistors also presents a challenge.

Researchers at Intel realize that, to keep up with Moore's Law, they'll have to find some way to decrease the length of those tiny copper wires that connect transistors. The longer the wire, the longer it takes for the signal to get from one place to another and the slower the data is processed.
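The article's point that delay grows with wire length can be illustrated with a rough back-of-the-envelope model, treating the signal as propagating at some fixed fraction of the speed of light (real on-chip delays also depend on the wire's resistance and capacitance, which this sketch ignores):

```python
# Rough illustration: signal delay grows linearly with wire length.
# Assumes propagation at half the speed of light (an illustrative figure).
SPEED_M_PER_S = 3e8 / 2

def delay_ps(length_m):
    """Propagation delay in picoseconds for a wire of the given length."""
    return length_m / SPEED_M_PER_S * 1e12

# Halving the wire length halves the delay.
print(delay_ps(0.01), delay_ps(0.005))
```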

One solution involves something called "3D packing" — in other words, building machines that make better use of space.

"In order to reduce the signal delay, you could actually put two chips face-to-face on top of each other so that the distance that a signal has to cover can be reduced," says Rob Willoner, a technical analyst at Intel Corp. in Santa Clara, Calif.

But building such a board presents manufacturing problems. When two chips face each other, it's hard to get the necessary debugging equipment in contact with all the electrical points on the chip.

Manufacturing very small components is another limiting factor to Moore's Law. We can only make things so small with existing processes.

Photolithography, for example, is used to print the tiny features of integrated circuits. The process basically involves writing with light. The shorter the wavelength of light, the finer the detail. Ultraviolet light and X-rays can be used to print smaller features than visible light. But even at those short wavelengths, there are limits.
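The wavelength limit can be made concrete with the standard Rayleigh resolution criterion, minimum feature size ~ k1 × wavelength / NA, where k1 and the numerical aperture NA are properties of the process and optics. The formula and the wavelengths below are standard lithography figures, not from the article, and the k1 and NA values are illustrative:

```python
# Rayleigh resolution criterion for photolithography:
#   minimum printable feature ~ k1 * wavelength / NA.
# Shorter wavelengths print finer features.
def min_feature_nm(wavelength_nm, k1=0.4, numerical_aperture=0.9):
    return k1 * wavelength_nm / numerical_aperture

for name, wavelength in [("visible, 436 nm", 436),
                         ("deep UV, 193 nm", 193),
                         ("extreme UV, 13.5 nm", 13.5)]:
    print(name, "->", round(min_feature_nm(wavelength), 1), "nm features")
```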

Henes likens the limits of existing manufacturing processes to those of a digital camera. "If you take a photo with your camera and you blow it up enough, it starts to get fuzzy," he says.

Given the limitations of the traditional chip, scientists such as Stewart, at HP's Quantum Science Research lab, are researching new ways to make computers. Stewart is trying to push the limits of classical computing by building the machines one molecule at a time.

One way to extend Moore's Law would be to use a beam made of electrons or ions to print the features onto the chips, Stewart says. Such particle beams can draw patterns so small that some of the features are just 10 atoms across. This nano-lithographic process is still in the early stages of research, however.

Stewart is also investigating whether tiny electrically conductive wires could be made reliably using "chemical self-assembly."

It's a bit like mixing together a chemical soup made of incredibly small particles that can be measured on the nano scale. (One nanometre is equal to one billionth of a metre.)

As the chemicals react to each other, they form a structure.

"You start with chemically-assembled nanoscale things, and you hope that they assemble themselves into an ordered structure that you can use," Stewart says.

Unfortunately, chemical self-assembly is prone to errors.

"To a chemist, perfect is anything over 95 per cent," he says. "But inside that little Pentium 4, there's only one error in a billion."

Researchers are still investigating ways to make the chemical system more fault-tolerant.
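Stewart's yield comparison can be made concrete. If each component fails independently with probability p, the chance that all n components on a chip work is (1 − p)^n. The component counts below are illustrative, not from the article:

```python
# Probability that every component on a chip works, given a per-component
# error rate p and n components (assuming independent failures).
def chip_yield(error_rate, n_components):
    return (1.0 - error_rate) ** n_components

# A "chemist's perfect" 5% error rate dooms even a 1,000-component chip;
# a one-in-a-billion error rate keeps tens of millions of parts working.
print(chip_yield(0.05, 1_000))        # vanishingly small
print(chip_yield(1e-9, 50_000_000))   # about 0.95
```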

While Stewart's facility is known as the Quantum Science Research lab, he notes that they aren't studying what most researchers would call quantum computing.

Stewart does have to deal with quantum mechanical theory, a branch of science that helps to explain the behaviour of atoms. But a true quantum computer would take things further by using the perplexing quantum properties of particles to store information and process data.

Today's computers store information in bits. Each bit can be either a one or a zero — either on or off, so to speak. Quantum computers store information in qubits, which can also exist as either a one or a zero but have the added capacity of being both at the same time, a state known as superposition.

"It is mind-boggling for even the people who work in the field," says LaFlamme of the Institute for Quantum Computing. Think of the surface of the Earth, he suggests. When you point toward the North Pole, that's the zero (or off) state. When you point toward the South Pole, that's the one (or on) state.

Those are the two options for the traditional bit, but a qubit can sit at any point on the sphere in between the two poles.
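LaFlamme's globe analogy is the standard Bloch-sphere picture of a qubit: two angles pick a point on the sphere, which fixes the amplitudes of the zero and one states, and the squared magnitudes of those amplitudes give the measurement probabilities. A minimal sketch using the standard amplitude formulas (the particular angles shown are illustrative):

```python
import cmath
import math

# Bloch-sphere picture of a qubit: angles theta (latitude from the north
# pole) and phi (longitude) determine the amplitudes of |0> and |1>.
def qubit(theta, phi):
    alpha = math.cos(theta / 2)                       # amplitude of |0>
    beta = cmath.exp(1j * phi) * math.sin(theta / 2)  # amplitude of |1>
    return alpha, beta

def probabilities(alpha, beta):
    """Chance of measuring 0 or 1: squared magnitudes of the amplitudes."""
    return abs(alpha) ** 2, abs(beta) ** 2

# The north pole (theta = 0) is a definite 0; a point on the equator
# (theta = pi/2) is an equal mix of 0 and 1.
print(probabilities(*qubit(0.0, 0.0)))
print(probabilities(*qubit(math.pi / 2, 0.0)))
```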

Researchers believe that all those extra states for storing data could result in a system that is far more efficient at manipulating data. A quantum computer would be especially good at factoring large numbers or creating encryption schemes for securing data, LaFlamme says. But it's still early days.

According to LaFlamme, today's quantum computers are a lot like computers in 1933 — the kind with vacuum tubes.

"They weren't very reliable," he says. "Similarly, today with quantum computers we are just learning how to control the system."

Fundamentally, quantum computing involves controlling a force of nature. It's a big leap from today's equipment. But while such a leap presents a significant challenge, it could also bring massive change.

LaFlamme notes that, when humans mastered a force of nature in the past, a revolution usually followed.

"When humans were able to control fire, they could start to build tools by hand," he says.

"Then when they learned how to control steam in the mid-1700s, that led to the industrial revolution, and people could build machines to make products instead of making them by hand. Steam-engine trains meant people could get resources from far away, too."

Similarly, control over electromagnetic forces led to the communications revolution.

A quantum-computing revolution would likely bring far more than better and faster factoring.

"This could really change the structure of society," says LaFlamme.

"People have focused on cryptography and factoring numbers because this is a place where we know today there is a drastic difference between a classical computer and a quantum computer."

But, he says, focusing on those two things alone demonstrates a lack of imagination.

"I believe we will have a completely new world of materials and chemistry; we'll have new ways of putting chemicals together," says LaFlamme.

We would be able to process data so much more efficiently that it would likely have a profound effect on communications of all kinds.

Among other things, speech recognition and intelligent machine vision would finally become possible.

More efficient data-processing would also mean size reductions so great that the underlying nature of computing could change, much as it did when we went from computers the size of large rooms to desktop devices.

Imagine sensors the size of dust mites covering everything. Each surface could become part of a vast computing system, collecting data about changes in its environment. If someone touched a table top, the sensors would detect the warmth of their hand. If a poisonous gas were released into the room, the sensors might pick that up too. The idea of being connected everywhere, anytime would take on a whole new meaning as our ability to gather and process data grew.

LaFlamme says it will be much easier to see the shape of things to come in 10 years, when he predicts the computer industry will start to invest heavily in the concept of quantum computing.

Until then, we'll just have to make do with the technology from the last great computing revolution.
