Moore’s Law is the best-known precept of the Information Age. It is the basis of a thousand business plans and tens of thousands of news articles, and the core of the greatest, farthest-reaching, and most transformational wealth creation ever.
In 1965, Gordon Moore (then at Fairchild Semiconductor, and soon to cofound Intel) observed that the density of transistors on a semiconductor roughly doubles every 18 to 24 months without a corresponding increase in cost. That has held for most of the ensuing decades, though some would argue the pace has slowed of late (it turns out this stuff gets hard to engineer when you’re moving individual atoms around).
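The compounding power of that doubling is easy to underestimate. A minimal sketch of the arithmetic, assuming an illustrative 18-month doubling period and a hypothetical baseline count (both parameters here are assumptions, not figures from Moore's paper):

```python
def transistors(years_elapsed, baseline=64, doubling_period_years=1.5):
    """Project a transistor count after some years, given a baseline
    count and a fixed doubling period (both hypothetical here)."""
    return baseline * 2 ** (years_elapsed / doubling_period_years)

# Fifteen years at an 18-month doubling period is ten doublings,
# a 1,024x increase over the starting point.
print(transistors(15))  # 64 * 2**10 = 65536.0
```

The point of the sketch is the exponent: small, steady doubling periods produce thousand-fold gains within a career, which is why the observation reshaped entire industries.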
It is the reason why computers and phones become cheaper, yet do more; why the Internet holds more information and becomes more pervasive; why software and hardware are said to be the #1 and #2 top-paying industries in the U.S. In other words, Moore’s Law is one of those “cornerstone of the modern world” kind of things.
So why do we say we’re missing something here? It’s because Moore’s Law is not really a “law,” as in “law of nature.” If you don’t believe me, leave a semiconductor on a table for two years and I guarantee it will have the same number of transistors when you come back. Moore’s Law is, rather, an economic and sociological observation about how information technology improves.
Even as Moore’s Law may have slowed, the curve of ever-cheaper information processing continues in other ways: networking, which shares messages between computers (and their human owners); open source software, which spreads more and better computation to a greater number of places; virtualization, which wrings more computing output from each computer at minimal cost; and cloud computing, a kind of massive virtualization across a global network, often built on open source.
If Moore’s Law is really about information technology moving from expensive mainframes, to smaller minicomputers, to personal computers, to phones — moving to more places, getting cheaper, trying to be everywhere — cloud is the fullest expression of that trend to date.