Since 1965, the computer industry has relied on Moore's Law to accelerate innovation, pushing more transistors into integrated circuits to improve computation performance. Making transistors smaller lifted all boats for the entire industry and enabled new applications. At some point we will reach a physical limit, that is, a limit stemming from physics itself. Even as that limit approaches, improvements have kept pace thanks to increased parallelism of computation and the consolidation of specialized functions into single chip packages, such as systems on chip (SoCs).
In recent years, we have been nearing another plateau. This article proposes to improve computation performance not only by building better hardware, but by changing how we use existing hardware, focusing specifically on how we use existing processor types. I call this approach Compute Orchestration: automatic optimization of machine code to best use modern datacenter hardware (again, with special emphasis on different processor types).
So what is compute orchestration? It is embracing hardware diversity to better serve software.
There are many types of processors: microprocessors in small devices, general-purpose CPUs in computers and servers, GPUs for graphics and compute, and programmable hardware like FPGAs. In recent years, specialized processors for machine learning, such as TPUs and neuromorphic processors, have been rapidly entering the datacenter.
There is potential in this variety: instead of statically assigning each processor to pre-defined functions, we can use existing processors as a swarm, with each processor working on the workloads it suits best. By doing so, we can potentially deliver more computation bandwidth with less power, lower latency, and lower total cost of ownership.
Non-standard utilization of existing processors is already happening: GPUs, for example, have been adapted from processors dedicated to graphics into a core enterprise component, and are now used for machine learning and cryptocurrency mining, among other workloads.
I call the technology that utilizes the processors as a swarm Compute Orchestration. Its tenets are simple:
Compute orchestration is, in short, automatic adaptation of binary code and automatic allocation to the most suitable processor types available. I split the evolution of compute orchestration into four generations:
Compute Orchestration Gen 1: Static Allocation To Specialized Co-Processors
This type of compute orchestration is everywhere. Most devices today include co-processors to offload some specialized work from the CPU. Usually, the toolchain or runtime environment takes care of assigning workloads to the co-processor. This is seamless to the developer, but also limited in functionality.
The best-known example is the use of cryptographic co-processors for relevant functions. If we are liberal in our definition of a co-processor, Memory Management Units (MMUs), which manage virtual memory address translation, can also be considered an example.
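The seamless, static offload described above can be sketched in a few lines. This is an illustrative toy (the "co-processor" is simulated by a flag, which a real toolchain or driver would set), not any particular platform's API: the caller always invokes the same function, and the runtime decides whether the work goes to the accelerator or to a software fallback.

```python
# Illustrative sketch of Gen 1 offload: the runtime transparently
# dispatches to a co-processor when one is present and silently
# falls back to software otherwise. The co-processor is simulated.
import hashlib

HAVE_CRYPTO_COPROC = False  # assumption: set by the toolchain/driver


def sha256_software(data: bytes) -> str:
    """Pure-software fallback path."""
    return hashlib.sha256(data).hexdigest()


def sha256(data: bytes) -> str:
    """The only entry point the developer sees."""
    if HAVE_CRYPTO_COPROC:
        # A real runtime would hand the buffer to the crypto engine here.
        raise NotImplementedError("no hardware in this sketch")
    return sha256_software(data)  # seamless fallback


print(sha256(b"hello")[:8])  # 2cf24dba
```

The key property, and the limitation, of this generation is that the dispatch decision is baked in ahead of time: the developer gains nothing beyond what the toolchain already knows how to offload.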
Compute Orchestration Gen 2: Static Allocation, Heterogeneous Hardware
This is where we are now. In the second generation, software relies on libraries, dedicated runtime environments, and VMs to best use the available hardware. Let's call the collection of components that help us better use the hardware "frameworks". Current frameworks implement specific code to better use specific processors. Most prevalent are frameworks that know how to utilize GPUs in the cloud. Usually, better allocation to bare-metal hosts remains the responsibility of the developer. For example, the developer or DevOps engineer needs to make sure a machine with a GPU is available for the relevant microservice. This phenomenon is what brought me to think of Compute Orchestration in the first place, as it proves there is more slack in our current hardware.
Common frameworks like OpenCL allow programming compute kernels to run on different processors. TensorFlow allows assigning nodes in a computation graph to different processors (devices).
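The per-node device assignment that TensorFlow exposes can be mimicked in a small self-contained sketch. The `Node` class and device names below are illustrative assumptions, not TensorFlow's actual API; the point is that each operation in the graph carries a static device tag chosen by the developer, which is exactly the manual effort the next generation would remove.

```python
# Toy computation graph in which each node is pinned to a device,
# mimicking the static per-node placement frameworks like
# TensorFlow expose (illustrative only, not a real framework API).

class Node:
    def __init__(self, name, fn, inputs=(), device="cpu:0"):
        self.name, self.fn = name, fn
        self.inputs, self.device = inputs, device

    def run(self, env):
        args = [env[i] for i in self.inputs]
        # A real runtime would launch the kernel on self.device;
        # here we just compute and record where the work was placed.
        env[self.name] = self.fn(*args)
        return self.device


# Build a graph: the multiply is pinned to the "GPU", the add to the CPU.
graph = [
    Node("a", lambda: 3.0),
    Node("b", lambda: 4.0),
    Node("mul", lambda x, y: x * y, ("a", "b"), device="gpu:0"),
    Node("out", lambda x: x + 1.0, ("mul",), device="cpu:0"),
]

env, placement = {}, {}
for node in graph:
    placement[node.name] = node.run(env)

print(env["out"], placement["mul"])  # 13.0 gpu:0
```

Note that the placement is fixed at graph-construction time: if this code later runs on a host without a GPU, nothing in the graph adapts, which is the limitation the following sections address.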
This better use of hardware through existing frameworks is great. However, I believe there is a bigger opportunity. Existing frameworks still require effort from the developer to be optimal: they rely on the developer to do the tuning. Also, no legacy code from 2016 (for example) is ever going to utilize a modern datacenter GPU cluster on its own. My view is that by developing automated and dynamic frameworks that adapt to the hardware and workload, we can achieve another leap.
Compute Orchestration Gen 3: Dynamic Allocation To Heterogeneous Hardware
Computation can take a cue from the storage industry: products for better utilization and reliability of storage hardware have been innovating for years. Storage startups develop abstraction layers and special filesystems that improve the efficiency and reliability of existing storage hardware. Computation, on the other hand, remains a naive allocation of hardware resources. Smart allocation of computation workloads to hardware could result in better performance and efficiency for big data centers (for example, hyperscalers like cloud providers). The infrastructure for such allocation is here, with current data center designs pushing toward more resource disaggregation, the introduction of diverse accelerators, and increased work on automatic acceleration (for example: Workload-aware Automatic Parallelization for Multi-GPU DNN Training).
For high-level resource management, we already have automatic allocation: for example, project Mesos (paper), which focuses on fine-grained resource sharing; Slurm for cluster management; and several extensions using Kubernetes operators.
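The kind of coarse-grained allocation these systems perform can be sketched as a simple matching problem. The node names, resource vectors, and first-fit policy below are illustrative assumptions in the spirit of Mesos/Slurm scheduling, not the behavior of any specific scheduler:

```python
# Hypothetical sketch of cluster-level resource allocation: match
# each job's resource request to the first node with enough free
# capacity (first-fit), in the spirit of Mesos/Slurm scheduling.

nodes = {
    "node-a": {"cpu": 8, "gpu": 0},
    "node-b": {"cpu": 8, "gpu": 2},
}

jobs = [
    {"name": "etl",   "cpu": 4, "gpu": 0},
    {"name": "train", "cpu": 2, "gpu": 1},
]


def first_fit(job, nodes):
    """Place the job on the first node that can hold it."""
    for name, free in nodes.items():
        if free["cpu"] >= job["cpu"] and free["gpu"] >= job["gpu"]:
            free["cpu"] -= job["cpu"]
            free["gpu"] -= job["gpu"]
            return name
    return None  # no capacity: the job must wait in queue


placement = {job["name"]: first_fit(job, nodes) for job in jobs}
print(placement)  # {'etl': 'node-a', 'train': 'node-b'}
```

Real schedulers add fairness, preemption, and bin-packing heuristics on top of this core loop, but the decision they make is still "which machine", not "which processor type for which instruction stream", which is where Gen 3 goes further.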
To further advance from here would require two steps: automatic mapping of the available processors (which we call the compute environment) and workload adaptation. Imagine a situation where the developer doesn't have to optimize her code for the hardware. Rather, the runtime environment identifies the available processing hardware and automatically optimizes the code. Cloud environments are heterogeneous and changing, and the code should change accordingly (in fact, it's not the source code that changes, but the execution model in the runtime environment of the machine code).
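The two steps above can be sketched together: probe the compute environment, then place each workload class on the best processor actually present. Everything here, the device names, the `probe_environment` helper, and the preference table, is a hypothetical illustration of the idea, not a real API:

```python
# Hedged sketch of Gen 3: (1) map the compute environment,
# (2) adapt the workload to it. Devices and preferences are
# illustrative assumptions, not a real runtime's API.

def probe_environment():
    # A real implementation would query the host or cluster
    # (PCIe scan, cloud metadata, device drivers, ...);
    # here the result is hard-coded for the sketch.
    return {"cpu", "gpu"}  # e.g. no FPGA on this host


# For each workload class, candidate processors ordered best-first.
PREFERENCE = {
    "dense-linear-algebra": ["gpu", "cpu"],
    "packet-processing":    ["fpga", "cpu"],
}


def place(workload, available):
    """Pick the best available processor for this workload class."""
    for device in PREFERENCE[workload]:
        if device in available:
            return device
    raise RuntimeError(f"no suitable device for {workload}")


env = probe_environment()
print(place("dense-linear-algebra", env))  # gpu
print(place("packet-processing", env))     # cpu (no FPGA present)
```

The essential difference from Gen 2 is that the decision is made at run time against the environment that was actually discovered, so the same binary behaves optimally on a GPU-rich host and degrades gracefully on a plain CPU box.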
Compute Orchestration Gen 4: Automatic Allocation To Dynamic Hardware
"A thought, even a possibility, can shatter and transform us." (Friedrich Wilhelm Nietzsche)
The quote above is to say that we are far from a practical implementation of the concept described here (as far as I know). We can, however, imagine a technology that dynamically re-designs a data center to serve the needs of its running applications. This change in the way whole data centers meet computation needs has already started. FPGAs are used more often and appear in new places (FPGAs in hosts, FPGA machines in AWS, SmartNICs), providing the framework for constant reconfiguration of hardware.
To illustrate the idea, consider Microsoft's project Catapult, which augments CPUs with an interconnected and configurable compute layer composed of programmable silicon. The timeline on the project's website is fascinating. The project started in 2010, aiming to improve search queries by using FPGAs. Quickly, it proposed the use of FPGAs as "bumps in the wire", adding computation in new areas of the data path. Project Catapult also designed an architecture for using FPGAs as a distributed resource pool serving the entire data center. Then, the project spun off Project Brainwave, utilizing FPGAs to accelerate AI/ML workloads.
This was just one example of innovation in how we compute. A quick online search will bring up several academic works on the topic. All we need to reach the 4th generation is some idea synthesis, combining a few concepts together:
Low-effort HDL generation (for example, the Merlin compiler or BORPH)
In essence, what I am proposing is to optimize computation by adding an abstraction layer that automatically allocates workloads to the most suitable processors, on hardware that can itself be reconfigured. Automatic allocation on agile hardware is the recipe for best utilizing existing resources: faster, greener, cheaper.
The trends and ideas mentioned in this article can lead to many places. It is unlikely that we are already working with existing hardware in the optimal way; it is my belief that we are in the midst of the improvement curve. In recent years, we have had increased innovation in basic hardware building blocks (new processors, for example), but we still have room to improve in overall allocation and utilization. The more new processors we deploy in the field, the more slack we have in our hardware stack. New concepts, like edge computing and resource disaggregation, bring new opportunities for optimizing legacy code through smarter execution. To achieve that, legacy code can't be expected to be refactored, and developers and DevOps engineers can't be expected to optimize for every cloud configuration. We just need to execute code in a smarter way, and that is the essence of compute orchestration.
The conceptual framework described in this article should be further explored. We first need to find the killer app (what type of software to optimize for which type of hardware). From there, we can generalize. I was recently asked in a round table: what is the next generation of computation? Quantum computing? Tensor Processing Units? I responded that all of the above, but what we really need is better usage of the existing generation.
Guy Harpak is the head of technology at Mercedes-Benz Research & Development in its Tel Aviv, Israel facility. Please feel free to contact him with any thoughts on the topics above at harpakguy@gmail.com. Harpak notes that this contributed article reflects his personal opinion and is in no way related to people or companies that he works with or for.
Related Reading: If you find this article interesting, I would recommend researching the following topics:
Some interesting articles on similar topics:
Return Of The Runtimes: Rethinking The Language Runtime System For The Cloud 3.0 Era
The Deep Learning Revolution And Its Implications For Computer Architecture And Chip Design (by Jeffrey Dean from Google Research)
Beyond SmartNICs: Towards A Fully Programmable Cloud
Hyperscale Cloud: Reimagining Datacenters From Hardware To Applications
Read more:
Disrupt The Datacenter With Orchestration - The Next Platform