It’s been happening for so long now that we have come to expect it: each year, computing and other electronic devices shrink in size and drop in price, while growing ever more powerful.
But if you are expecting this embodiment of technology's most famous dictum – Moore's Law – to continue forever, get ready to think again, says Robert Colwell, director of the Microsystems Technology Office at the Defense Advanced Research Projects Agency (Darpa) and Intel's former chief architect. In a recent speech at the Hot Chips conference at Stanford University in California, Colwell argued that Moore's Law, the engine that has driven the digital age, will reach the end of its life in less than a decade.
First formulated by Intel co-founder Gordon Moore in 1965, and revised by him a decade later, the law states that the number of transistors that will fit on an integrated circuit (a computer chip, or processor) will double about every two years.
The prediction was so accurate that it has been used as a roadmap for chip innovation for decades.
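The arithmetic behind that roadmap is simple but relentless. Here is a minimal Python sketch of the doubling rule; the starting point (Intel's 4004 chip of 1971, with roughly 2,300 transistors) is an illustrative assumption, not a figure from this article.

```python
# A back-of-envelope sketch of Moore's Law: transistor counts doubling
# roughly every two years. The base figures (Intel 4004, 1971, ~2,300
# transistors) are illustrative assumptions.
def projected_transistors(year, base_year=1971, base_count=2300):
    """Project a transistor count, assuming one doubling every two years."""
    doublings = (year - base_year) / 2
    return base_count * 2 ** doublings

for year in (1971, 1991, 2011):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")
# 1971: ~2,300; 1991: ~2.4 million; 2011: ~2.4 billion
```

Forty years of doublings turn thousands of transistors into billions, which is why even a modest slowdown in the cadence matters so much.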
"He foresaw what is really a cost-based argument. If you can reduce the size of the transistor, you can get more functionality. That's a function per cost argument. And that gives you incredible growth," says Prof Jim Greer, head of graduate studies at Cork's Tyndall Institute.
According to Colwell, twin problems will bring about the demise of Moore’s Law: a range of technological impasses that will limit how much more computing power can be packed on to a circuit and how much smaller it can become, and the exorbitant costs of creating chips using some possible alternative technologies.
Chipmaking is already an expensive business. Fabrication plants, or fabs, where the complex manufacturing takes place – such as Intel's in Leixlip – cost billions to build. A new Intel fab under construction in New York state is predicted to cost about $6 billion. But Moore's Law has made chip production – even with falling costs to the end-user – a lucrative business.
The end of Moore’s Law has been predicted before, notes Greer, but new technological processes have always come to the rescue. Processes that were used in the 1980s were becoming obsolete by the mid-1990s, he says, but new materials and slightly different structures for transistors injected new life into the law.
Clock speed
Some limits with current chips have already been reached. Clock speed – the frequency at which a processor performs calculations – "has plateaued", Greer says, having remained largely unchanged for several years.
To squeeze out greater performance, chipmakers instead place several processors, or "cores", on a single chip – first two, then four. But getting more cores on to a chip is running into physical barriers, making it more difficult to maintain Moore's Law and shrink the size of processors while increasing performance at lower cost. "The reason why this trend is inevitably going to end is rooted in the atomic structure of materials like silicon," says Dr Jiri Vala, SFI research fellow at NUI Maynooth and an expert in quantum computation.
Colwell predicts that transistors, typically built in silicon crystal, cannot shrink below about five to seven nanometres, a barrier that he says will be reached by the end of this decade. A nanometre is a billionth of a metre; a human hair measures about 100,000nm, while current Intel chips are built with 22nm transistors.
“Since the separation of silicon atoms in the crystal is about 0.5nm, this length-scale corresponds to about 10-14 atoms of silicon,” says Vala.
At this level, the strange world of quantum physics takes over, he notes (see panel), making this size “the fundamental limitation to conventional electronics”.
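Vala's arithmetic is easy to verify. A short Python sketch, using the figures quoted above – roughly 0.5nm between silicon atoms, and feature sizes running from today's 22nm down to the predicted 5-7nm floor:

```python
# How many silicon atoms fit across a chip feature? Figures are those
# quoted in the article: ~0.5nm atomic separation in the silicon crystal.
ATOM_SPACING_NM = 0.5

for feature_nm in (22, 7, 5):
    atoms_across = feature_nm / ATOM_SPACING_NM
    print(f"A {feature_nm}nm feature is only ~{atoms_across:.0f} atoms wide")
```

At 22nm a feature is some 44 atoms across; at the 5-7nm floor, just 10 to 14 – too few for the device to behave classically.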
So what happens next? One possibility is quantum computing – computers powered by processors constructed at the atomic and subatomic level, harnessing the odd properties of quantum physics – which holds out the promise of vast computing power at low cost.
But Greer feels that's not a fix-all, especially not in the time frame for the end of Moore's Law. "Quantum computing is very good at some things, but not necessarily the best for all computing. And it's also more than 10 years away."
Cores could perhaps be stacked into layers, but then problems arise in wiring the layers together without generating currently unworkable levels of heat, he says. Overcoming such issues “is a daunting challenge”.
Intel Ireland's head of new business development, Leonard Hobbs, notes: "The demise of Moore's Law has been predicted by many commentators over several years. Our own visibility into the future of Moore's Law extends about 10 years and, while challenging, we don't see any roadblocks to the continuation of scaling in this timeframe.

"What happens after that will depend on the results from our own and others' research, which today is offering more choices than ever for future devices, connections and architectures. Manufacturing in our fabs will continue to be a complex and demanding discipline, but no major shifts are anticipated."
And perhaps a slowdown in Moore’s Law won’t matter that much to most people, as long as their hardware and software gets the job done.
Reducing cost
"Moore's Law is and was fundamentally about reducing the cost of computing hardware. As we hit physical limits, the exponential decrease in cost will end. But the value of digital computing hardware is in what it can do," says Dr Michael Peter Kennedy, professor of microelectronic engineering at UCC. "Raw computing power, as in traditional supercomputers, is a specialist business. It's about computing faster with less power, where the result is not necessarily time-sensitive. By contrast, most consumers want solutions to real problems in real time. Each solution involves a digital computing engine, 'mixed-signal' interfaces to the analogue real world, and software running on the hardware, which ultimately solve the customer's problem."
Kennedy predicts that the next wave of development will be across a number of areas that will continue to improve performance: “High-bandwidth communication circuits, mixed-signal interfaces [that] interact with real world physical and chemical signals, packaging, algorithms and software.”
Greer feels the end of Moore’s Law will probably arrive in the next 10 to 15 years, while Vala notes, “When exactly this happens may depend on future advances we have not made yet.”
But Greer remains fundamentally optimistic. "You can be sure the large manufacturers are confident there are solutions. You do not invest $6 billion in a factory to provide performance nobody wants."
Silicon solution: quantum computing to the rescue?
One possible solution to the demise of Moore's Law – the promise that computing power will continue to increase as processors shrink – may eventually be found in quantum computing.
This is computing at the atomic and subatomic level, where the laws of conventional physics give way to bizarre quantum properties – the kind Albert Einstein famously dismissed as "spooky action at a distance". These effects would enable extraordinarily powerful and economical computers. Traditional computers process information represented as strings of 1s and 0s (bits), transmitted to the machine as the on (1) or off (0) state of an electrical charge or current.
Quantum computers, first envisioned by physicist Richard Feynman, would process information using the quantum states of atoms or subatomic particles.
Such quantum objects, acting as quantum bits (qubits), can store both 0 and 1 at the same time, and “entangle” with each other, amplifying their abilities. This ramps up computing power: the number of possible 0 and 1 combinations that can be considered simultaneously in a calculation doubles with every qubit added.
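That doubling is what gives quantum machines their potential reach, as a few lines of Python make plain; the qubit counts chosen here are arbitrary illustrations:

```python
# n qubits span a superposition over 2**n basis states, so the number of
# 0/1 combinations available to a single calculation doubles with each
# added qubit. A classical register of n bits holds only one at a time.
for n_qubits in (1, 2, 10, 50):
    combinations = 2 ** n_qubits
    print(f"{n_qubits:>2} qubits -> {combinations:,} simultaneous 0/1 combinations")
```

Fifty qubits already correspond to more than a quadrillion combinations, far beyond what a conventional 50-bit memory can represent at once.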
“Quantum computation represents an extremely powerful computation paradigm. Quantum computation is able to provide immense computing power that reaches beyond the capabilities of any conventional computer,” notes Dr Jiri Vala, SFI research fellow at NUI Maynooth, who studies quantum computation.
Moore’s Law may not even be adequate to encompass the technological improvements available through quantum computing, he says.
"This power derives from exotic properties of the quantum world that allows the processing of information in a massively parallel way. In other words, the main reason for pursuing quantum computation is that it can
solve in a short time, let us
say a few minutes, certain computational problems which on a conventional computer (even parallel systems using numerous processors) would take time as long as the age of the universe."
Quantum computing was once the stuff of science fiction, but scientists have begun to prove many of its underlying assumptions, Vala says. “Quantum computation has been demonstrated in many different proof-of-principle experiments in the past 15 years. Quantum information and communication technologies are even becoming commercially available. To provide a couple of examples, quantum communication and cryptography devices are produced by the company MagiQ Technologies, and also the first commercial quantum computing system has recently been introduced by D-Wave Systems.”
Safely reading and storing data generated by a quantum computer, however, poses other problems. Says Vala: “The main challenge on the way to realisation of [large scale] quantum computing is the protection of fragile quantum information from errors.”