HPE’s next frontier: Space travel & memory-driven computing
Fri, 9th Jun 2017

Fifty-six years ago, President Kennedy delivered his famous “moonshot” address to Congress. Just over eight years later, Neil Armstrong and Buzz Aldrin touched down on the surface of the Moon with the help of technology no more powerful than a calculator.

We've improved a lot since then—the smartphone in your hand would have been considered a supercomputer beyond any rocket scientist's dreams back then. But when we think about exploring our next frontiers, our excitement must also be tempered with reality.

While computing technology has improved exponentially since the Moon landing, the fundamental architecture underlying it all hasn't actually changed much in the last 60 years.

And that is quickly becoming a problem. As a computer engineer and researcher, this is the thing that keeps me up at night: the idea that our current technology won't be able to deliver on our expectations for the future.

What's the problem?

Blame it on the data. More data has been created in the past two years than in the entire history of the human race. And yet, less than 1% of that data is ever analyzed.

By the year 2020, our digital universe will contain nearly as many bits of data as there are stars in the universe, with at least 20 billion mobile devices and 1 trillion applications creating and transmitting information.

We'll have smart cars, smart homes, smart factories, even smart bodies. As a species, we'll create staggering amounts of data every day.

The question is… what are we going to do with it all?

Pushing up against our current limitations

Before we can answer that question, it's important to understand our current limitations—and why we're pushing up against them now, after more than 60 years of progress.

Starting around the 1950s—in business and in science—we began automating the dreary job of number crunching. Think of a business doing payroll at the end of the month or closing the books at the end of the quarter.

Computing made this hand-to-pencil-to-ledger process faster, more efficient and automatic. It was accurate and reliable, but it sometimes took a few days or weeks to complete.

But then the 1990s gave us the web. And the 2000s gave us mobile. The amount of data we created grew exponentially, and our appetite for real-time, always-on information grew to match.

That 24x7 access stretched networks and infrastructure to new limits, so we pulled out all the stops to scale. We consolidated, moved to the cloud and eked out the last nanometers of transistor efficiency.

Now we are on the cusp of an entirely new era, driven by the Internet of Things and what we call “the Intelligent Edge.”

In this new era of smart everything, we will demand much more from our computing systems. We will expect them to process and learn from zettabytes of sensor data and take action immediately. Speed, accuracy, reliability and security will all be mission critical. A millisecond delay or a minor miscalculation could genuinely mean the difference between life and death.

But the fact is, right now, the incremental increases we are seeing in our computing power will not meet the exponential demands of our future challenges. We need Memory-Driven Computing.

The mission to Mars illustrates the problem

The mission to Mars is a perfect way to illustrate the magnitude of this problem.

At up to 20 light-minutes away, Mars is too far to rely on communication from Earth for real-time support (a quick calculation after the list below shows why). Where ground control once helped guide Armstrong and Aldrin to the Moon, Mars astronauts will be guided by a computer capable of performing extraordinary tasks:

  • Monitoring onboard systems the way a smart city would monitor itself—anticipating and addressing problems before they threaten the mission.
  • Tracking minute-by-minute changes in astronaut health—monitoring vitals and personalizing treatments to fit the exact need in the exact moment.
  • Coordinating every terrestrial, deep space, Martian orbital and rover sensor available, so crew and craft can react to changing conditions in real time.
  • And, perhaps most importantly… combining these data sets to find the hidden correlations that can keep a mission and crew alive.
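
To see why real-time help from Earth is off the table, here is a rough back-of-the-envelope check in Python. The distance figures are approximate published astronomical values, not numbers from this article; the one-way delay is simply distance divided by the speed of light.

```python
# Rough one-way light delay between Earth and Mars.
# Distances are approximate public values (assumptions, not article data).
speed_of_light_km_s = 299_792

distances_km = {
    "closest approach": 54_600_000,
    "farthest separation": 401_000_000,
}

for label, distance_km in distances_km.items():
    minutes = distance_km / speed_of_light_km_s / 60
    print(f"{label}: {minutes:.1f} light-minutes one way")
# -> about 3 minutes at best and roughly 22 at worst, so a round-trip
#    exchange with ground control can take over 40 minutes.
```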

In short, the Mars spacecraft will be a smart city, an intelligent power grid, and a fleet of autonomous vehicles all in one. And it will be controlled by the most powerful computing system the world has ever seen.

But here's the rub. Right now, with existing technology, we'd need a massive data center attached to a nuclear power plant to achieve the computing power a Mars mission would demand, and that's never going to fit in the cargo hold! What we've got today is just too big, too heavy, too slow, too inflexible and too power hungry.

We need a 21st century computer to solve 21st century problems. At Hewlett Packard Enterprise, we've spent the past three years developing exactly that.

Memory-Driven Computing is the answer

In 2014, we introduced the largest and most complex research project in our company's history—with the goal of creating an entirely new kind of computer:

One that wasn't constrained by traditional trade-offs. One that eliminated performance bottlenecks. One that threw off 60 years of convention and compromise.

We call it The Machine research project, and its mission is to deliver the world's first Memory-Driven Computing architecture. It's more than an idea; it is the way the world will work in the future.

Without getting into too many of the technical details, let me quickly explain.

As much as 90 percent of the work a computer does is simply moving information between memory and storage. That busy work wastes time and energy. And the more information we try to process, the slower the system gets and the more energy it consumes.

A huge amount of science and engineering effort has gone into working around this problem, and that has to change. If you're familiar with Moore's Law, you know that up until now we could count on chips to get better year after year. That era is over.

For 60 years we focused on running a tiny bit of data through a faster and faster calculator. With Memory-Driven Computing we end the workarounds by inverting the model: we break down the memory wall, give processing access to all of the data at once, and bring just the right compute to that data.
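
To make the cost of data movement concrete, here is a minimal, illustrative Python sketch (my own example, not HPE code). It times the same computation on data that already sits in memory versus data that must first be copied in from storage; the second case is dominated by movement, not arithmetic.

```python
import os
import tempfile
import time

import numpy as np  # assumed available; any array library would do

N = 20_000_000                  # ~160 MB of float64 values
data = np.random.rand(N)

# Case 1: the data is already in memory; compute is the only cost.
t0 = time.perf_counter()
in_memory_sum = data.sum()
t_memory = time.perf_counter() - t0

# Case 2: the data lives in "storage" and must be moved before computing.
with tempfile.NamedTemporaryFile(delete=False) as f:
    data.tofile(f)              # write the array out in binary form
    path = f.name
t0 = time.perf_counter()
from_storage = np.fromfile(path, dtype=np.float64)
storage_sum = from_storage.sum()
t_storage = time.perf_counter() - t0
os.remove(path)

print(f"compute on in-memory data:    {t_memory:.3f} s")
print(f"move from storage + compute:  {t_storage:.3f} s")
```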

Memory-Driven Computing… We need it!

Last November, we delivered the world's first Memory-Driven Computing prototype. In just six months, we scaled the prototype 20-fold.

Today, I'm thrilled to tell you that HPE has created a computer with the largest single-memory system the world has ever seen, capable of holding 160 terabytes of data in memory.

To put that in context, that's enough memory to simultaneously work with the data held in approximately 160 million books—five times the number of books in the Library of Congress. And it's powerful enough to reduce the time needed to process complex problems from days to hours. No computer on Earth can manipulate that much data in a single place at once.
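
A quick sanity check on those numbers, assuming roughly 1 MB of text per digitized book (a common rule of thumb, not a figure from HPE):

```python
# Back-of-the-envelope check of the 160 TB ~ 160 million books claim.
bytes_per_book = 1_000_000        # ~1 MB per digitized book (assumption)
memory_bytes = 160 * 10**12       # 160 terabytes

books = memory_bytes // bytes_per_book
print(f"{books:,} books")         # -> 160,000,000
# The Library of Congress holds roughly 32 million books,
# and 160 million is about five times that collection.
```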

But that's only the beginning of Memory-Driven Computing's potential. We're engineering Memory-Driven Computing systems capable of holding up to 4,096 yottabytes of data. That's more than 250,000 times the size of our entire digital universe today.
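
The 250,000-times comparison checks out if we take today's digital universe to be on the order of 16 zettabytes, a widely cited industry estimate for 2017 (an assumption on my part, not a number given in the article):

```python
# Sanity check on the scale comparison.
zettabyte = 10**21
yottabyte = 10**24

planned_pool = 4096 * yottabyte       # the 4,096-yottabyte target
digital_universe = 16 * zettabyte     # rough 2017 estimate (assumption)

print(f"{planned_pool // digital_universe:,}x")   # -> 256,000x
```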

What does it mean for the mission to Mars and life on Earth?

When we can analyze that much data at once, we can begin to discover correlations we could have never conceived before. And that ability will open up entirely new frontiers of intellectual discovery.

The implications for an endeavor like the mission to Mars are huge.

  • It means we can shrink the time needed to process complex problems from days to seconds, delivering real-time intelligence at the very edge of exploration.
  • It means we can shrink a massive data center down to something that can go anywhere computing is needed.

Now think about the mission to Mars as a metaphor for life here on Earth.

In a world where everything is connected and everything computes—our cars, our homes, our factories, our bodies—we're going to need to take that computing power with us everywhere we go. And we're going to want to discover those correlations that were never before possible.

To do that, we need Memory-Driven Computing.

That is our mission at HPE: to enable a world where everything computes.

To bring real-time intelligence to every edge of the Earth and beyond. To help the world harness that intelligence to answer some of our biggest questions. To solve some of our toughest challenges and help us better understand the world around us.

Memory-Driven Computing will benefit us, our children and their children.

It's a new world. It's here now. Welcome!

Article by Kirk Bresniker, chief architect, Hewlett Packard Labs