Hewlett Packard Enterprise unveiled on May 16 a new type of computer, called The Machine, that relies on memory to crunch data 10,000 times faster than traditional systems.
The Machine massively outperforms existing technology by relying more heavily on memory, and less on computer processors, to perform calculations, said HP Labs chief architect Kirk Bresniker.
While The Machine’s development has been announced previously, this is the first time HPE has assembled the world’s largest single-memory computer. It boasts 160TB of memory, enough to work simultaneously with the data held in every book in the US Library of Congress five times over, or approximately 160 million books.
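As a rough sanity check of that figure, the arithmetic below assumes about 1MB of plain text per digitised book and roughly 32 million books in the Library of Congress catalogue; neither number comes from HPE’s announcement.

```python
# Back-of-the-envelope check of the 160TB figure.
# Assumptions (not from HPE): ~1MB of plain text per digitised book and
# ~32 million books in the Library of Congress catalogue.
MEMORY_BYTES = 160 * 10**12      # 160TB single pool of memory
BYTES_PER_BOOK = 1 * 10**6       # ~1MB of text per book (assumed)
LOC_BOOKS = 32_000_000           # rough Library of Congress book count (assumed)

books_in_memory = MEMORY_BYTES // BYTES_PER_BOOK
print(f"Books held in memory at once: {books_in_memory:,}")                  # 160,000,000
print(f"Library of Congress multiples: {books_in_memory / LOC_BOOKS:.1f}x")  # 5.0x
```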
At its core is photonics, the transmission of information via light rather than via the electrons used in conventional computers. Bresniker said this helps processors speed up access to data held in a massive memory pool.
By eliminating the inefficiencies of how memory, storage and processors interact in conventional systems, memory-driven computing reduces the time needed to process complex problems from days to hours, hours to minutes and minutes to seconds, delivering real-time insights.
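The sketch below (not HPE code) illustrates the general idea in ordinary Python: the same analysis run two ways, once re-reading its working set from storage on every pass and once loading the data into memory and keeping it resident. The file name, data size and the “analysis” itself are arbitrary stand-ins.

```python
import os
import time

# Illustrative sketch only (not HPE code): contrast re-reading a working set
# from storage on every analysis pass with keeping it resident in memory.
DATA_FILE = "working_set.bin"     # hypothetical scratch file
PASSES = 20

# Create a ~100MB scratch data set so the sketch is self-contained.
if not os.path.exists(DATA_FILE):
    with open(DATA_FILE, "wb") as f:
        f.write(os.urandom(100 * 1024 * 1024))

def analyse(chunk: bytes) -> int:
    """Stand-in for real analysis: count occurrences of one byte value."""
    return chunk.count(b"\x2a")

# Conventional pattern: go back to storage on every pass.
start = time.perf_counter()
for _ in range(PASSES):
    with open(DATA_FILE, "rb") as f:
        analyse(f.read())
print(f"storage-bound passes:   {time.perf_counter() - start:.2f}s")

# Memory-driven pattern: load once, then work against the in-memory copy.
with open(DATA_FILE, "rb") as f:
    resident = f.read()
start = time.perf_counter()
for _ in range(PASSES):
    analyse(resident)
print(f"memory-resident passes: {time.perf_counter() - start:.2f}s")
```

On an ordinary laptop the operating system’s file cache narrows the gap considerably; the point is the difference in access pattern, not the absolute timings.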
But do not expect The Machine to be available in the near future. Software and applications still need to be written for it, although Bresniker said commercial versions will be available “sooner than 2030”.
In a press release, HPE chief executive Meg Whitman said that the secrets to the next great scientific breakthrough, industry-changing innovation, or life-altering technology hide in plain sight behind the mountains of data created every day.
“To realise this promise, we can’t rely on the technologies of the past, we need a computer built for the Big Data era,” she said.
Bresniker, who spoke in a phone interview from California where he is based, said: “We’ve shown the advantages and efficacy of the approach. It is now in the hands of HPE’s application research team to learn what it’s like to process trillions of terabytes of data.”
In the lab, HPE has emulated and simulated applications that require vast numbers of computing transactions. The Machine can, for example, sift through large volumes of network traffic for signs of abnormal activity that indicate a cyberattack, such as an advanced persistent threat.
News reports last year said that current systems can analyse 50,000 events a second and compare them with five minutes of prior events. With The Machine, 10 million events a second could be compared with 14 days of prior activity, detecting many more problems.
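Taking those reported figures at face value, a back-of-the-envelope comparison of the two analysis windows looks like this; the 100-byte record size used at the end is an assumption, not an HPE figure.

```python
# Rough arithmetic behind the comparison above. Event rates and window sizes
# are the reported figures; the per-event record size is an assumption.
SECONDS_PER_DAY = 86_400

current_window_events = 50_000 * 5 * 60                       # 15,000,000
machine_window_events = 10_000_000 * 14 * SECONDS_PER_DAY     # ~12.1 trillion

print(f"current window:     {current_window_events:,} events")
print(f"The Machine window: {machine_window_events:,} events")
print(f"history in play:    {machine_window_events / current_window_events:,.0f}x more")

# At an assumed ~100 bytes per event record, the 14-day window alone would need
# on the order of a petabyte, which is why very large single memory pools matter.
print(f"approx. memory needed: {machine_window_events * 100 / 10**15:.1f} PB")
```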
It is also expected to bring more than hundred-fold speed-ups to chores such as helping airlines respond to service disruptions, preventing credit card fraud and assessing the risk of financial portfolios.
Further development could see memory uncoupled from the computing device, added Bresniker, allowing the technology to be used in Internet of Things devices such as consumer products, internet-linked cameras and lighting systems.
Based on the current prototype, HPE expects the architecture could easily scale to an exabyte-scale single-memory system and, beyond that, to a nearly limitless pool of memory: 4,096 yottabytes, or about 250,000 times the size of today’s entire digital universe.
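That multiple roughly checks out if today’s digital universe is taken to be about 16 zettabytes, a widely cited analyst estimate rather than a figure from HPE:

```python
# Rough check of the scaling claim. The size of today's digital universe is an
# assumption here (roughly 16 zettabytes), not a figure from HPE.
YOTTABYTE = 10**24
ZETTABYTE = 10**21

pool_bytes = 4_096 * YOTTABYTE            # the 4,096-yottabyte pool in the claim
digital_universe_bytes = 16 * ZETTABYTE   # assumed size of the digital universe

print(f"pool vs digital universe: {pool_bytes / digital_universe_bytes:,.0f}x")  # ~256,000x
```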
With that amount of memory, it would be possible to work simultaneously with every digital health record of every person on earth, every piece of data from Facebook, every trip of Google’s autonomous vehicles and every data set from space exploration, getting to answers and uncovering new opportunities at unprecedented speed.
Development on The Machine began in 2014 and has involved hundreds of HPE engineers as well as members of the open source community, since the new computer runs a variant of the Linux operating system. HPE has shown working models of The Machine, most recently at the CeBIT IT show in Germany.
Bresniker explained that much of the computer technology people and businesses rely on today was created in the 1950s and 1960s. That technology is built around microprocessors, but because of the limits of physics, the tiny chips can no longer be shrunk significantly to deliver dramatic jumps in speed and energy efficiency.
“We’re at the crossroads of computing. By 2020, people will be carrying 20 billion devices and there will be 100 billion intelligent things like sensors, all of them running trillions of applications,” said Bresniker.
He added: “Between now and then, we will be creating more new information than mankind has ever recorded. We believe memory-driven computing will be able to crunch all this data.”
Grace Chng is a veteran tech writer.