Moore’s Law is 50 years old, but will it continue?

It’s been 50 years since Gordon Moore, one of the founders of the microprocessor company Intel, gave us Moore’s Law. This says that the complexity of computer chips ought to double roughly every two years.


Now the current CEO of Intel, Brian Krzanich, is saying that the days of Moore’s Law may be coming to an end, as the gap between new generations of chip technology appears to be widening:

The last two technology transitions have signalled that our cadence today is closer to two and a half years than two.

 

So is this the end of Moore’s Law?

 

Moore’s Law has its roots in an article by Moore written in 1965, in which he observed that the complexity of chips, measured by the number of components they contain, was doubling each year. This was later modified to become:

The number of transistors incorporated in a chip will approximately double every 24 months.

 

This rate was later revised again, to a doubling roughly every 18 months.

 

[Image: Gordon Moore. Credit: Intel]

In its 24 month guise, Moore’s Law has continued unabated for 50 years, with an overall advance of a factor of roughly 2³¹, or 2 billion. That means memory chips today store around 2 billion times as much data as in 1965. Or, in more general terms, computer hardware today is around 2 billion times as powerful for the same cost.
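As a rough check on this compounding arithmetic, here is a minimal Python sketch (our illustration, not part of the original analysis) that computes the overall growth factor for the 18 month and 24 month doubling periods quoted above:

```python
# Minimal sketch of the compounding behind Moore's Law: a fixed doubling
# period applied over the 50 years from Moore's 1965 article to 2015.
years = 50
for months_per_doubling in (18, 24):    # the two periods quoted in the text
    doublings = years * 12 / months_per_doubling
    factor = 2 ** doublings
    print(f"every {months_per_doubling} months: "
          f"{doublings:.1f} doublings, factor ~ {factor:.2e}")
# The oft-quoted figure of roughly 2**31 (about 2 billion) sits between
# these two endpoints.
```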

 

It is hard to comprehend the scale of Moore’s Law. Imagine airline technology advancing between 1965 and 2015 to the point where a plane could travel at nearly the speed of light (1,080 million kph, or 670 million mph), yet be capacious enough to carry the entire world’s population. Or imagine the cost of a jet airliner dropping from US$100 million to one dollar. Even these analogies fall far short of a factor of 2 billion.

 

Moore was originally embarrassed by his eponymous “law”. This is in part because it is not a law in the sense of a law of physics, but merely an observation. Yet on the 40th anniversary, Intel was happy to celebrate it, and Moore was pleased to note that it still seemed to be accurate.

 

The end is nigh?

A few months ago, though, Moore observed:

The original prediction was to look at 10 years, which I thought was a stretch […] The fact that something similar is going on for 50 years is truly amazing. […] But someday it has to stop. No exponential like this goes on forever.

 

There have been numerous other predictions that Moore’s Law was soon to end.

 

In 1999, physicist and best-selling author Michio Kaku declared that the “Point One barrier” (meaning chip features 0.1 micron or 100 nanometers in size) would soon halt progress.

 

Yet the semiconductor industry sailed through the 0.1 micron level like a jetliner passing through a wispy cloud. Devices currently in production have feature sizes as small as 10 or 14 nanometers, and IBM has just announced a chip with 7 nanometer features.

 

By comparison, a helical strand of DNA is about 2.5 nanometers in diameter, so commercial semiconductor technology is now entering the molecular and atomic realm.

 

A speed barrier

Not all is roses, though. By one measure – a processor’s clock speed – Moore’s Law has already stalled.

 

Today’s state-of-the-art production microprocessors typically have 3 GHz clock rates, compared with 2 GHz rates five or ten years ago – not a big improvement.

 

But the industry has simply increased the number of processor “cores” and the amount of on-chip cache memory, so that aggregate performance continues to track or exceed Moore’s Law projections. There remain many software challenges, however, in writing programs that can actually exploit all of that parallel hardware, as the sketch below suggests.
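As a hedged illustration of what those software challenges look like, here is a minimal Python sketch (not production code from any chipmaker; the core count is chosen arbitrarily) in which a computation benefits from extra cores only because it has been explicitly split into independent chunks that can run in parallel:

```python
# Summing squares by splitting the range across worker processes: the extra
# cores help only because the software explicitly divides up the work.
from multiprocessing import Pool

def partial_sum(bounds):
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    n, cores = 2_000_000, 4            # core count chosen for illustration
    step = n // cores
    chunks = [(i * step, (i + 1) * step) for i in range(cores)]
    chunks[-1] = (chunks[-1][0], n)    # last chunk absorbs any remainder
    with Pool(cores) as pool:
        total = sum(pool.map(partial_sum, chunks))
    print(total == sum(i * i for i in range(n)))   # True: same answer, in parallel
```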

 

Hewlett Packard Laboratories is hard at work developing new approaches to microelectronics. Its nanotechnology research group has developed a “crossbar architecture”: a design in which a set of parallel “wires” a few nanometers wide is crossed by a second set of “wires” at right angles. Each point where the “wires” intersect forms an electronic switch, which can be configured either for logic or for memory storage.
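To make the geometry concrete, here is a toy Python model of the crossbar idea: two perpendicular sets of wires, with a configurable switch at every crossing that holds one bit of state. This is our simplified illustration, not HP’s actual device design:

```python
# Toy crossbar: an R x C grid of switches, one at each wire intersection.
class Crossbar:
    def __init__(self, rows, cols):
        # 0 = switch open, 1 = switch closed (one bit of state per crossing)
        self.state = [[0] * cols for _ in range(rows)]

    def write(self, row, col, bit):
        self.state[row][col] = bit      # configure the switch at this crossing

    def read(self, row, col):
        return self.state[row][col]

xb = Crossbar(8, 8)     # an 8 x 8 junction array; the size is arbitrary here
xb.write(3, 5, 1)
print(xb.read(3, 5))    # -> 1
```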

 

It is also investigating nanoscale photonics (light-based devices), which could be deployed either alongside conventional electronics or in emerging quantum computing devices.

 

Moore’s Law is a gift to science

Moore’s Law has been a great blessing to science and mathematics research. Modern laboratories are loaded with high-tech measurement and analysis devices, which become more powerful and cheaper every year.

 

In addition, a broad range of modern science, mathematics and engineering has benefited from Moore’s Law in the form of scientific supercomputers, which are used for applications ranging from supernova simulation and protein folding to product design and the processing of microwave background radiation from the cosmos.

 

Software running on these computers has advanced in step with Moore’s Law.

 

For example, the fast Fourier transform algorithm, which is used extensively in scientific computation, and magnetic resonance imaging (MRI) both involve substantial computation that would not be possible without Moore’s Law advances.
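For readers who want to see the fast Fourier transform in action, a small NumPy example follows; the test signal (two tones, at 50 Hz and 120 Hz) and the sample rate are made up purely for illustration:

```python
# Recovering the frequencies present in a signal with NumPy's FFT.
import numpy as np

rate = 1000                                   # samples per second (illustrative)
t = np.arange(0, 1, 1 / rate)                 # one second of samples
signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

spectrum = np.abs(np.fft.rfft(signal))        # O(n log n) thanks to the FFT
freqs = np.fft.rfftfreq(len(signal), 1 / rate)
print(freqs[spectrum > spectrum.max() / 3])   # -> [ 50. 120.], the two tones
```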

 

It is not entirely coincidental that both of these advances arose roughly 50 years ago, around the same time Moore’s Law was first observed.

 

How much more for Moore’s Law?

Intel’s CEO, Brian Krzanich, said the company would “strive to get back to two years” in its innovation cycle to keep Moore’s Law on track.

 

If Moore’s Law does continue for just two or three more decades, typical handheld devices may well exceed the human brain in intelligence. Some, such as author James Barrat, declare that artificially intelligent computers will be the “final invention” of mankind, after which humans may become irrelevant.

 

We do not subscribe to such pessimism. Rather, we see a promising future in which scientific knowledge, among other things, continues to increase at an exponential rate.

 

Time will tell. As physicist Richard Feynman wrote in 1959, referring to the potential for ever finer control of nature at the microscopic level, there still appears to be plenty of room at the bottom.

 


Jonathan Borwein (Jon) is Laureate Professor of Mathematics at the University of Newcastle.
David H. Bailey, PhD, is retired from Lawrence Berkeley Laboratory and is a Research Fellow at the University of California, Davis.

This article was originally published on The Conversation. Read the original article.
