IBM launches advanced artificial intelligence server
Built specifically for compute-intensive AI workloads, the new Power9 systems are said to improve the training times of deep learning frameworks by nearly 4x, allowing enterprises to build more accurate AI applications faster.
The new Power9-based AC922 Power Systems are the first to embed PCIe 4.0, next-generation NVIDIA NVLink and OpenCAPI, which together accelerate data movement, calculated at 9.5x faster than PCIe 3.0-based x86 systems.
As a result, data scientists can build applications faster, ranging from deep learning insights in scientific research to real-time fraud detection and credit risk analysis.
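For context, the 9.5x figure is consistent with a rough peak-bandwidth comparison, assuming it sets NVLink 2.0's roughly 150 GB/s per direction per GPU against the roughly 16 GB/s of a PCIe 3.0 x16 slot; the numbers below are illustrative approximations, not IBM's published methodology.

```python
# Rough back-of-the-envelope check of the "9.5x" data-movement claim.
# These peak-bandwidth figures are approximations for illustration only.
nvlink2_gbps = 150.0    # NVIDIA NVLink 2.0, approx. GB/s per direction per GPU
pcie3_x16_gbps = 15.75  # PCIe 3.0 x16, approx. GB/s per direction

speedup = nvlink2_gbps / pcie3_x16_gbps
print(f"Approximate CPU-GPU bandwidth ratio: {speedup:.1f}x")  # ~9.5x
```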
Power9 is at the heart of what will soon be the most powerful data-intensive supercomputers in the world, the U.S. Department of Energy’s “Summit” and “Sierra”, and has also been tapped by Google.
“Google is excited about IBM's progress in the development of the latest Power technology,” said Bart Sano, VP of Google Platforms. “The Power9 OpenCAPI Bus and large memory capabilities allow for further opportunities for innovation in Google data centres.”
"We’ve built a game-changing powerhouse for AI and cognitive workloads,” said Bob Picciano, SVP of IBM Cognitive Systems. “In addition to arming the world’s most powerful supercomputers, IBM Power9 Systems is designed to enable enterprises around the world to scale unprecedented insights, driving scientific discovery enabling transformational business outcomes across every industry.”
Accelerating the future with Power9
Deep learning is a fast-growing machine learning method that extracts information by crunching through millions of pieces of data to detect and rank the most important features of that data.
To meet these growing industry demands, IBM began designing the Power9 chip four years ago from a blank sheet, building a new architecture to manage free-flowing data, streaming sensors and algorithms for data-intensive AI and deep learning workloads on Linux.
With PowerAI, the company has optimised and simplified the deployment of accelerated deep learning frameworks and libraries on the Power architecture, allowing data scientists to be up and running in minutes.
IBM Research is developing a wide array of technologies for the Power architecture. Researchers have already cut deep learning training times from days to hours with the PowerAI Distributed Deep Learning toolkit.
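As an illustration of the data-parallel approach such toolkits use, a minimal sketch with PyTorch's built-in DistributedDataParallel might look like the following; this is a generic stand-in, not IBM's DDL API, and the model, data and launch settings are placeholders.

```python
# Minimal sketch of data-parallel deep learning training, the general technique
# behind distributed toolkits such as PowerAI DDL. Uses PyTorch's
# DistributedDataParallel as an illustrative stand-in, not IBM's API.
# Launch with, e.g.: torchrun --nproc_per_node=4 train.py
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # One process per GPU; rank and world size come from the launcher's environment.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # Placeholder model and synthetic data stand in for a real workload.
    model = torch.nn.Linear(1024, 10).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = torch.nn.CrossEntropyLoss()

    for step in range(100):
        x = torch.randn(64, 1024, device=local_rank)
        y = torch.randint(0, 10, (64,), device=local_rank)
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()   # gradients are all-reduced across GPUs during backward
        optimizer.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```

Each process trains on its own slice of data and synchronises gradients after every backward pass, which is how training time scales down as more GPUs are added.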
Building an open ecosystem to fuel innovation
The era of AI demands more than tremendous processing power and unprecedented speed; it also demands an open ecosystem of innovative companies delivering technologies and tools.
IBM describes itself as a catalyst for innovation, fuelling an open, fast-growing community of more than 300 OpenPower Foundation and OpenCAPI Consortium members.