HPE's next frontier: Space travel & memory-driven computing

56 years ago, President Kennedy issued his famous moonshot address to Congress. Just over 8 years later, Neil Armstrong and Buzz Aldrin touched down on the surface of the Moon with the help of technology no more powerful than a calculator.

We've improved a lot since then. The smartphone in your hand would have been considered a supercomputer beyond any rocket scientist's dreams back then. But when we think about exploring our next frontiers, our excitement must also be tempered with reality.

While computing technology has improved exponentially since the Moon landing, the fundamental architecture underlying it all hasn't actually changed much in the last 60 years.

And that is quickly becoming a problem. As a computer engineer and researcher, this is the thing that keeps me up at night: the idea that our current technology won't be able to deliver on our expectations for the future.

Blame it on the data. More data has been created in the past two years than in the entire history of the human race. And yet, less than 1% of that data is ever analyzed.

By the year 2020, our digital universe will contain nearly as many bits of data as there are stars in the universe, with at least 20 billion mobile devices and 1 trillion applications creating and transmitting information.

We'll have smart cars, smart homes, smart factories, even smart bodies. As a species, we'll create staggering amounts of data every day.

The question is: what are we going to do with it all?

Before we can answer that question, it's important to understand our current limitations, and why we're pushing up against them now, after 60-plus years of progress.

Starting around the 1950s, in business and in science, we began automating the dreary job of number crunching. Think of a business doing payroll at the end of the month or closing the books at the end of the quarter.

Computing made this hand-to-pencil-to-ledger process faster, more efficient and automatic. It was accurate and reliable, but it sometimes took a few days or weeks to complete.

But then the 1990s gave us the web. And the 2000s gave us mobile. The amount of data we created grew exponentially, and our appetite for real-time, always-on information grew to match.

That 24x7 access stretched networks and infrastructure to new limits, so we pulled out all the stops to scale. We consolidated, moved to the cloud and eked out the last nanometers of transistor efficiency.

Now we are on the cusp of an entirely new era, driven by the Internet of Things and what we call the Intelligent Edge.

In this new era of smart everything, we will demand much more from our computing systems. We will expect them to process and learn from zettabytes of sensor data and take action immediately. Speed, accuracy, reliability and security will all be mission critical. A millisecond delay or a minor miscalculation could genuinely mean the difference between life and death.

But the fact is, right now, the incremental increases we are seeing in our computing power will not meet the exponential demands of our future challenges. We need Memory-Driven Computing.

The mission to Mars is a perfect way to illustrate the magnitude of this problem.

At up to 20 light-minutes away, Mars is too far to rely on communication from Earth for real-time support. Where ground control once helped guide Armstrong and Aldrin to the Moon, Mars astronauts will be guided by an onboard computer capable of performing extraordinary tasks.

In short, the Mars spacecraft will be a smart city, an intelligent power grid, and a fleet of autonomous vehicles all in one. And it will be controlled by the most powerful computing system the world has ever seen.

But here's the rub. Right now, with existing technology, we'd need a massive data center attached to a nuclear power plant to achieve the computing power a Mars mission would demand, and that's never going to fit in the cargo hold! What we've got today is just too big, too heavy, too slow, too inflexible and too power hungry.

We need a 21st-century computer to solve 21st-century problems. At Hewlett Packard Enterprise, we've spent the past three years developing exactly that.

Memory-Driven Computing is the answer

In 2014, we introduced the largest and most complex research project in our company's history, with the goal of creating an entirely new kind of computer:

One that wasn't constrained by traditional trade-offs. One that eliminated performance bottlenecks. One that threw off 60 years of convention and compromise.

We call it The Machine research project, and its mission is to deliver the world's first Memory-Driven Computing architecture. It's more than an idea; it is the way the world will work in the future.

Without getting into too many of the technical details, let me quickly explain.

As much as 90 percent of the work a computer does is simply moving information between memory and storage. That busy work wastes time and energy. And the more information we try to process, the slower the system gets and the more energy it consumes.
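To make that concrete, here is a minimal, illustrative sketch in Python (my own example, not an HPE benchmark) of how a simple workload can spend far more time shuttling data between storage and memory than it does computing on it; the file name and sizes are arbitrary:

```python
# Illustrative only: compare repeatedly pulling a block from storage with
# keeping the same block resident in memory. Sizes and file name are arbitrary.
import os
import time

import numpy as np

PATH = "sensor_block.npy"            # hypothetical scratch file
data = np.random.rand(10_000_000)    # ~80 MB of synthetic "sensor" readings
np.save(PATH, data)

# Conventional pattern: fetch the block from storage on every pass, then compute.
t0 = time.perf_counter()
for _ in range(10):
    block = np.load(PATH)            # data movement: storage -> memory
    block.mean()                     # the actual computation
moved = time.perf_counter() - t0

# Memory-resident pattern: load once, keep the working set in memory.
resident = np.load(PATH)
t0 = time.perf_counter()
for _ in range(10):
    resident.mean()                  # computation only, no repeated movement
in_memory = time.perf_counter() - t0

print(f"with repeated movement: {moved:.2f}s, memory-resident: {in_memory:.2f}s")
os.remove(PATH)
```

On ordinary hardware the first loop typically spends most of its time on the loads alone, and that is the kind of overhead that grows as the data grows.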

A huge amount of science and engineering effort has gone into working around this problem. It has to change. If you're familiar with Moore's Law, you know that up until now we could count on chips to get better year after year, but that era is over.

For 60 years we focused on running a tiny bit of data through an ever-faster calculator. With Memory-Driven Computing we end the workarounds by inverting the model: breaking down the memory wall, accessing all the data, and bringing just the right compute to it.
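As a loose analogy, the sketch below uses Python's shared-memory primitives purely as a stand-in for a memory fabric (it is not how The Machine is built): a dataset is loaded into one shared pool once, and separate worker processes attach to that pool and compute on their own slices instead of each receiving a copy.

```python
# A minimal sketch, assuming Python shared memory as a stand-in for the
# "one big memory pool, many small compute elements" idea.
from multiprocessing import Process, shared_memory

import numpy as np

N = 10_000_000  # number of float64 values in the shared pool

def worker(name: str, start: int, stop: int) -> None:
    # Attach to the existing pool; no data is copied into the worker.
    shm = shared_memory.SharedMemory(name=name)
    view = np.ndarray((N,), dtype=np.float64, buffer=shm.buf)
    print(f"partial sum [{start}:{stop}] = {view[start:stop].sum():.1f}")
    shm.close()

if __name__ == "__main__":
    data = np.random.rand(N)
    pool = shared_memory.SharedMemory(create=True, size=data.nbytes)
    np.ndarray(data.shape, dtype=data.dtype, buffer=pool.buf)[:] = data  # load once

    # Bring "just the right compute" to different regions of the same memory.
    procs = [Process(target=worker, args=(pool.name, i * (N // 2), (i + 1) * (N // 2)))
             for i in range(2)]
    for p in procs: p.start()
    for p in procs: p.join()

    pool.close()
    pool.unlink()
```

The point of the analogy is the inversion: the data stays put in one large pool, and small units of compute come to it.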

Last November, we delivered the world's first Memory-Driven Computing prototype. In just six months, we scaled the prototype 20-fold.

Today, I'm thrilled to tell you that HPE has created a computer with the largest single-memory system the world has ever seen, capable of holding 160 terabytes of data in memory.

To put that in context, it's enough memory to simultaneously work with the data held in approximately 160 million books, five times the number of books in the Library of Congress. And it's powerful enough to reduce the time needed to process complex problems from days to hours. No other computer on Earth can manipulate that much data in a single place at once.

But that's only the beginning of Memory-Driven Computing's potential. We're engineering Memory-Driven Computers capable of holding up to 4,096 yottabytes of data. That's more than 250,000 times the size of our entire digital universe today.
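As a rough sanity check on those two capacity figures, assuming about 1 MB of text per digitized book and a present-day digital universe of roughly 16 zettabytes (both ballpark assumptions of mine, not measured values):

```python
# Back-of-the-envelope check of the figures above; the per-book size and the
# size of today's digital universe are rough assumptions, not measured values.
single_memory = 160e12               # 160 terabytes, in bytes
book = 1e6                           # ~1 MB per digitized book (assumption)
print(single_memory / book)          # -> 160,000,000, i.e. ~160 million books

planned = 4096 * 1e24                # 4,096 yottabytes, in bytes
digital_universe = 16e21             # ~16 zettabytes today (assumption)
print(planned / digital_universe)    # -> 256,000, i.e. more than 250,000x
```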

When we can analyze that much data at once, we can begin to discover correlations we could never have conceived of before. And that ability will open up entirely new frontiers of intellectual discovery.

The implications for an endeavor like the mission to Mars are huge.

Now think about the mission to Mars as a metaphor for life here on Earth.

In a world where everything is connected and everything computes (our cars, our homes, our factories, our bodies), we're going to need to take that computing power with us everywhere we go. And we're going to want to discover those correlations that were never before possible.

To do that, we need Memory-Driven Computing.

That is our mission at HPE: to enable a world where everything computes.

To bring real-time intelligence to every edge of the Earth and beyond. To help the world harness that intelligence to answer some of our biggest questions. To solve some of our toughest challenges and help us better understand the world around us.

Memory-Driven Computing will benefit us, our children and their children.

It's a new world. It's here now. Welcome!

Article by Kirk Bresniker, chief architect, Hewlett Packard Labs
