Is HP Labs’ supercomputer the new hope for supersized data? – SiliconANGLE (blog)

With practically limitless data and applications demanding microseconds-fast insight, it's poor timing that Moore's law of perpetually increasing processor power is now AWOL.

"How do we get back exponential scaling on supply to meet this unending, exponential demand?" asked Kirk Bresniker (pictured, right), fellow, vice president and chief architect at HP Labs at Hewlett Packard Enterprise Co.

"We will not regain it through the familiar technologies of the past three decades, nor a single point solution," Bresniker stated in an interview during HPE Discover in Las Vegas, Nevada.

This is borne out each day in HP Labs generally and in the company's ongoing work on The Machine, its memory-driven computing program, according to Andrew Wheeler (pictured, left), fellow, vice president and deputy director of HP Labs.

Bresniker and Wheeler spoke with John Furrier (@furrier) and Dave Vellante (@dvellante), co-hosts of theCUBE, SiliconANGLE Media's mobile livestreaming studio, during HPE Discover. (* Disclosure below.)

After some mixed press for The Machine last December, HPE has been doggedly pushing it closer to prime-time production, Wheeler explained.

"There are a lot of moving parts around it, whether it's around the open-source community and kind of getting their head wrapped around, what does this new architecture look like?" Wheeler said.

The Machine will require a chain of partners and ancillary parts to yield real use cases, Wheeler added.

"We had the announcement around DZNE as kind of an early example," he said, referring to the German Center for Neurodegenerative Diseases' use of The Machine in analyzing massive medical data.

The Machine program has also produced what HPE calls the "Computer Built for the Era of Big Data," a massive system running on a single pool of memory.

Internet of Things data and, specifically, the intelligent edge are calling out for data-training abilities like those in this supercomputer, according to Bresniker. Presently, almost all data ingested at the edge is thrown away before it's analyzed, let alone monetized, he added.

"The first person who understands, 'OK, I'm going to get one percent more of that data and turn it into real-time intelligence, real-time action,' that will unmake industries, and it will remake new industries," Bresniker concluded.

Watch the complete video interview below, and be sure to check out more of SiliconANGLE's and theCUBE's independent editorial coverage of HPE Discover US 2017. (* Disclosure: TheCUBE is a paid media partner for HPE Discover US 2017. Neither Hewlett Packard Enterprise Co. nor other sponsors have editorial control over theCUBE or SiliconANGLE.)
