It’s been a rough year for Moore’s Law.
Wired, Ars Technica, The Economist, and Forbes have all written eulogies to this prophetic observation, made over 50 years ago.
In case you aren’t familiar: in 1965, Gordon Moore (who went on to co-found Intel) observed that the number of transistors on integrated circuits was doubling roughly every 12 months. Though he later revised the interval to 18-24 months on average, his prediction remained more or less on point for decades. This largely explains the rapid growth in computing power, and the corresponding drop in cost, over that time.
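To get a feel for what that doubling rule implies, here's a minimal sketch that models it as simple exponential growth. The function name and the two-year doubling period are assumptions for illustration, not historical chip data; the 2,300-transistor starting point is roughly the Intel 4004 from 1971.

```python
def projected_transistors(start_count, years, doubling_period_years=2.0):
    """Project a transistor count forward, doubling every period.

    This is an illustrative model of Moore's observation, not a fit
    to real semiconductor data.
    """
    return start_count * 2 ** (years / doubling_period_years)


# Starting from ~2,300 transistors (roughly the Intel 4004, 1971),
# a two-year doubling over 40 years gives 2,300 * 2**20:
count = projected_transistors(2_300, 40)
print(f"{count:,.0f}")  # prints 2,411,724,800 -- about 2.4 billion
```

Twenty doublings turn a few thousand transistors into a few billion, which is why even a modest slowdown in the doubling period compounds into a huge difference over decades.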
Given Moore’s Law’s track record, why all the current hate? Well, much of it comes down to physics: circuit features will soon become so dang tiny (just a few nanometers) that unruly electrons will no longer stay in their respective silicon pathways, leaking across barriers in an orgy-like phenomenon known as quantum tunneling.
The challenge of working with such small circuitry is already apparent: Intel has been forced to push back its next-generation 10 nm Cannon Lake processors to late 2018, a year later than expected.
But does this really mean you can finally stop upgrading your computer — is Moore’s Law toast?
Not necessarily, buckaroo. While transistors do seem to be nearly as small as they can get, the latest (and probably last) major annual report by the International Technology Roadmap for Semiconductors (ITRS) suggests that Moore’s party ain’t over just yet. There’s hope on the horizon, mainly in the form of 3D circuitry, where transistors are stacked vertically as well as horizontally. While such chips technically take up more space, and most certainly give off more heat, this approach would allow CPUs to keep getting more powerful. At least for a time.
Newly engineered materials might also keep Moore’s prophecy on track for years, if not decades, to come. Graphene and carbon nanotubes are leading contenders. These carbon-based pathways could reduce energy use substantially, making 3D chips much less likely to melt a hole through your motherboard.
There are actually many possibilities and pathways computing can take at this point. And it’s quite likely an entirely new method of crunching numbers will someday usurp our current paradigm… perhaps quantum computing, or something else just beyond the horizon.
But just in case these are Moore’s Law’s final years, rather than shed little silicon tears, let’s share a toast to the idea it represents and appreciate just how far, and how fast, we’ve come.