Ever since developing its Watson supercomputer five years ago, IBM has worked hard to stay ahead in the artificial-intelligence race. The Watson brand has since expanded into a cognitive-computing package of hardware and software that can do anything from diagnosing diseases and exploring for oil and gas to running scientific computing models and enabling cars to drive autonomously.
CIO UK, the leading brand for chief information officers, has just published an article revealing that IBM will release new hardware and software packages. Initially, Watson applied advanced algorithms and natural-language interfaces to find and narrate answers, and at that time it was a ‘supercomputer’. Now AI systems are deployed on a much greater scale, and the tech giants, such as Facebook, Google, and Amazon, apply AI image recognition and speech analysis to masses of data.
IBM’s goal is to bring AI to other companies. It is now releasing more powerful hardware to speed up deep-learning systems while analysing data to answer complex questions, and it is bundling these superfast systems with a new set of software tools called PowerAI.
PowerAI is used to train AI software on tasks such as image and speech recognition, since the accuracy of the results improves with learning, and the training hardware is now available. This could be the key to making Watson technologies more easily accessible to companies, either via the cloud or on premises, CIO reports.
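To make the idea of "accuracy improves by learning" concrete, here is a minimal training loop in plain Python: a tiny logistic-regression classifier fitted by gradient descent on synthetic data. This is a generic sketch of what deep-learning training does at small scale, not IBM's PowerAI API; all names and data in it are illustrative.

```python
import math
import random

random.seed(0)

# Synthetic task: the label is 1 when the single feature is positive.
features = [random.uniform(-1, 1) for _ in range(200)]
labels = [1 if x > 0 else 0 for x in features]

w, b = 0.0, 0.0  # model parameters, start untrained
lr = 0.5         # learning rate

def predict(x):
    """Sigmoid score in (0, 1) for one input."""
    return 1.0 / (1.0 + math.exp(-(w * x + b)))

def accuracy():
    correct = sum((predict(x) > 0.5) == (y == 1)
                  for x, y in zip(features, labels))
    return correct / len(features)

# Training: repeated passes over the data, nudging the
# parameters against the gradient of the log loss.
for epoch in range(50):
    for x, y in zip(features, labels):
        grad = predict(x) - y
        w -= lr * grad * x
        b -= lr * grad

print(f"accuracy after training: {accuracy():.2f}")
```

Production systems like the ones described here do the same thing with millions of parameters and GPU acceleration, which is why training throughput is the selling point of the hardware.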
IBM’s first set of hardware, the Power8, pairs its own CPUs with NVIDIA Tesla GPUs, Sumit Gupta, IBM’s Vice President of high-performance computing and analytics, told CIO. “The hardware is the fastest deep-learning system available”, Gupta said. “The Power8 CPUs and Tesla P100 GPUs are among the fastest chips available, and both are linked via the NVLink interconnect, which outperforms PCI-Express 3.0. Nvidia’s GPUs power many deep-learning systems in companies like Google, Facebook, and Baidu”.
The Power8 is available via the Nimbix cloud, which provides bare-metal access to the hardware and an InfiniBand backend. IBM also has other projects in the pipeline, planning hardware and software for inferencing, which requires lighter processing on the edge or end device, CIO writes.
“The inferencing engine takes results from a trained model, adds additional data or input, and provides improved results. Drones, robots, and autonomous cars use inferencing engines for navigation, image recognition, or data analysis”.
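The distinction being drawn is that inference runs only the forward pass of an already-trained model on new input, with no weight updates, which is why it needs far lighter hardware than training. A minimal sketch of that step, with made-up weights standing in for the output of an earlier training run (nothing here is IBM's actual stack):

```python
import math

# Pretend these parameters came out of an earlier training run;
# at inference time they are fixed.
TRAINED_W = [0.8, -0.4]
TRAINED_B = 0.1

def infer(sensor_features):
    """Forward pass only: score the input, return a decision.

    No gradients, no weight updates - just arithmetic on
    frozen parameters, cheap enough for an edge device.
    """
    z = sum(w * x for w, x in zip(TRAINED_W, sensor_features)) + TRAINED_B
    p = 1.0 / (1.0 + math.exp(-z))  # sigmoid score in (0, 1)
    return "obstacle" if p > 0.5 else "clear"

# A drone or car would call this once per sensor frame:
print(infer([1.0, 0.2]))   # z = 0.82 > 0, prints "obstacle"
print(infer([-0.5, 1.0]))  # z = -0.7 < 0, prints "clear"
```

Because the per-frame cost is a handful of multiply-adds (scaled up to a full network in practice), inferencing chips can be much smaller and lower-power than the training systems described above.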
Such chips are also deployed in data centers to improve deep-learning models; Google has developed its own chip, the TPU, with KnuEdge, Wave Computing, and GraphCore following close behind.
Gupta would not reveal further details beyond saying that IBM is currently working on a different model for its inferencing hardware and software, with the software deemed the “glue” that holds IBM’s AI hardware and software together.
This article was first published at: http://www.cio.com/article/3140929/hardware/as-watson-matures-ibm-plans-more-ai-hardware-and-software.html
Photo Credit: Wikipedia