Yesterday's column by John Naughton in the Observer revisited Nathan Myhrvold's 1997 prediction that when Moore's Law runs out -- that is, when processors stop doubling in speed every 18 months through an unbroken string of fundamental breakthroughs -- programmers would have to return to the old discipline of writing incredibly efficient code whose main consideration was the limits of the computer it ran on.
I'd encountered this idea several times over the years, whenever it seemed that Moore's Law was petering out, and it reminded me of a prediction I'd made 15 years ago: that as computers ceased to get faster, they would continue to get wider -- that is, the price of existing processors would keep dropping even if the speed gains petered out. This, I predicted, would give programmers an instinctual preference for the kinds of problems that can be solved in parallel (where the computation can be spread across several processors at once, because each piece of the solution is independent of the others) and an instinctual aversion to problems that have to be solved in serial (where each phase of the solution takes the output of the previous phase as its input, meaning all the steps have to be solved in order).
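To make the distinction concrete, here's a minimal C++ sketch of my own (an illustration, not anything from Naughton's column): summing some arbitrary per-item function over independent inputs spreads neatly across threads, while iterating the same function as a chain can't be split up, because each step needs the previous step's output.

```cpp
// Parallel vs. serial: a toy illustration. The mixing function f() is
// arbitrary -- any per-item work would do.
#include <cstdint>
#include <future>
#include <vector>

static uint64_t f(uint64_t x) {
    x ^= x >> 33; x *= 0xff51afd7ed558ccdULL; x ^= x >> 33;
    return x;
}

// Parallel problem: every f(i) is independent, so we can carve the range
// into chunks and hand one chunk to each of however many cores we own.
uint64_t parallel_sum(uint64_t n, unsigned workers) {
    std::vector<std::future<uint64_t>> tasks;
    const uint64_t chunk = n / workers;
    for (unsigned w = 0; w < workers; ++w) {
        uint64_t lo = w * chunk;
        uint64_t hi = (w + 1 == workers) ? n : lo + chunk;
        tasks.push_back(std::async(std::launch::async, [lo, hi] {
            uint64_t s = 0;
            for (uint64_t i = lo; i < hi; ++i) s += f(i);
            return s;
        }));
    }
    uint64_t total = 0;
    for (auto& t : tasks) total += t.get();
    return total;
}

// Serial problem: step n consumes step n-1's output, so extra processors
// are useless -- only a *faster* processor shortens the wall-clock time.
uint64_t serial_chain(uint64_t n) {
    uint64_t x = 1;
    for (uint64_t i = 0; i < n; ++i) x = f(x);
    return x;
}

int main() {
    parallel_sum(10000000, 4);  // scales with core count
    serial_chain(10000000);     // scales only with clock speed
    return 0;
}
```

Buy twice as many processors and parallel_sum() roughly halves its runtime; serial_chain() doesn't care how many you buy.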
That's because manufacturing existing processors more cheaply requires only minor, incremental improvements in production techniques, while designing significantly faster processors requires major breakthroughs in materials science, chip design, etc. These breakthroughs aren't just unpredictable in terms of when they'll arrive; they're also unpredictable in terms of how they'll play out. One widespread technique for speeding up processors is "branch prediction," wherein a processor guesses which way an upcoming branch will go and starts executing instructions speculatively, without waiting for the program to tell it to do so. This gave rise to a seemingly unstoppable cascade of ghastly security defects -- Spectre, Meltdown and their successors -- that the major chip vendors are still struggling with.
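You can actually watch branch prediction at work from ordinary code. Here's a little C++ sketch of my own (not anything from the chip vendors): it counts the values above a threshold in a big array twice, once with the data in random order (the branch is unguessable) and once sorted (the predictor locks onto the pattern and the same loop runs much faster). The security bugs alluded to above abuse the speculative work a processor does on mispredicted branches; this demo only shows the speed side.

```cpp
// Branch prediction demo: same loop, same data, different order.
// Compile with something like `g++ -O1 demo.cpp`; at -O2/-O3 many
// compilers emit branchless code here, which hides the effect.
#include <algorithm>
#include <chrono>
#include <cstdio>
#include <random>
#include <vector>

static long count_big(const std::vector<int>& v) {
    long n = 0;
    for (int x : v)
        if (x >= 128)   // this branch is what the predictor guesses at
            ++n;
    return n;
}

static double time_ms(const std::vector<int>& v) {
    auto t0 = std::chrono::steady_clock::now();
    volatile long sink = count_big(v);  // volatile: keep the work honest
    (void)sink;
    auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::milli>(t1 - t0).count();
}

int main() {
    std::vector<int> data(1 << 24);
    std::mt19937 rng(42);
    for (int& x : data) x = rng() % 256;      // random order: mispredicts
    double unsorted = time_ms(data);
    std::sort(data.begin(), data.end());      // sorted: predictable branch
    double sorted = time_ms(data);
    std::printf("unsorted: %.1f ms, sorted: %.1f ms\n", unsorted, sorted);
    return 0;
}
```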
So if you write a program that's just a little too slow for practical use, you can't just count on waiting a couple of months for a faster processor to come along.
But cheap processors continue to get cheaper. If you have a parallel problem that needs a cluster that's a little outside your budget, you don't need to rewrite your code -- you can just stick it on the shelf for a little while and the industry will catch up with you.
Reading Naughton's column made me realize that we've been living through a parallel computation bubble. The period in which Moore's Law declined has also been the period in which computing came to be dominated by a handful of famously parallel applications -- applications that have seemed overhyped even by the standards of the tech industry: VR, cryptocurrency mining, and machine learning.
Now, all of these have other reasons to be frothy: machine learning is the ideal tool for empiricism-washing, through which unfair policies are presented as "evidence-based"; cryptocurrencies are just the thing if you're a grifty oligarch looking to launder your money; and VR is a new frontier for the moribund, hyper-concentrated entertainment industry to conquer.
It's possible that this is all a coincidence, but it really does feel like we're living in a world spawned by some Sand Hill Road VC who, in 2005, wrote "What should we invest in to take advantage of improvements in parallel computing?" at the top of a whiteboard.
That's as far as I got. Now what I'm interested in is what the counterfactual would look like. Say (for the purposes of the thought experiment) that processors had continued to gain in speed, but not in parallelization -- that, say, a $1000 CPU doubled in power every 18 months, but there weren't production lines running off $100 processors in bulk that were 10% as fast.
What computing applications might we have today?
(Image: Xiangfu, CC BY-SA)