The post-exponential era of AI and Moore's Law – TechCrunch

Posted: November 17, 2019 at 2:33 pm

My MacBook Pro is three years old, and for the first time in my life, a three-year-old primary computer doesn't feel like a crisis that must be resolved immediately. True, this is partly because I'm waiting for Apple to fix their keyboard debacle, and partly because I still cannot stomach the Touch Bar. But it is also because three years of performance growth ain't what it used to be.

It is no exaggeration to say that Moore's Law, the mind-bogglingly relentless exponential growth in our world's computing power, has been the most significant force in the world for the last fifty years. So its slow deceleration and/or demise is a big deal, and not just because the repercussions are now making their way into every home and every pocket.

We've all lived in hope that some other field would go exponential, giving us another, similar era, of course. AI/machine learning was the great hope, especially the distant dream of a machine-learning feedback loop: AI improving AI at an exponential pace for decades. That now seems awfully unlikely.

In truth it always did. A couple of years ago I was talking to the CEO of an AI company who argued that AI progress was basically an S-curve: we had already reached its top for sound processing, were nearing it for image and video, but were only halfway up the curve for text. No prize for guessing which one his company specialized in, but he seems to have been entirely correct.

Earlier this week OpenAI released an update to their analysis from last year of how the computing power used by AI¹ is increasing. The outcome? It has been increasing exponentially, with a 3.4-month doubling time (by comparison, Moore's Law had a 2-year doubling period). Since 2012, this metric has grown by more than 300,000x (a 2-year doubling period would have yielded only a 7x increase).
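To get a feel for the gap between those two doubling rates, here is a quick back-of-the-envelope check in Python. The ~62-month span (roughly AlexNet in 2012 to the largest training runs OpenAI measured) is my own assumption for illustration, not a figure quoted in the analysis:

```python
def growth_factor(months_elapsed: float, doubling_time_months: float) -> float:
    """Total multiplicative growth for exponential doubling over a span."""
    return 2 ** (months_elapsed / doubling_time_months)

# Assumed span: roughly 2012 to the largest measured runs (~62 months).
months = 62

ai_compute = growth_factor(months, 3.4)  # OpenAI's observed doubling time
moores_law = growth_factor(months, 24)   # classic 2-year doubling period

print(f"3.4-month doubling over {months} months: ~{ai_compute:,.0f}x")
print(f"2-year doubling over {months} months: ~{moores_law:.1f}x")
```

With that assumed span, the fast doubling rate compounds to roughly the 300,000x the post cites, while Moore's-Law-rate doubling over the same period yields only a single-digit multiple.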

That's a lot of computing power to improve the state of the AI art, and it's clear that this growth in compute cannot continue. Not will not; can not. Sadly, the exponential growth in the need for computing power to train AI has coincided almost exactly with the diminishment of the exponential growth of Moore's Law. Throwing more money at the problem won't help. Again, we're talking about exponential rates of growth here; linear expense adjustments won't move the needle.

The takeaway is that, even if we assume great efficiency and performance improvements to reduce the rate of doubling, AI progress seems to be increasingly compute-limited at a time when our collective growth in computing power is beginning to falter. Perhaps there'll be some sort of breakthrough, but in the absence of one, it sounds a whole lot like we're looking at AI/machine-learning progress leveling off, not long from now, and for the foreseeable future.

¹ It measures the largest AI training runs, technically, but this seems trend-instructive.

