Facebook to open-source AI hardware design using NVIDIA’s new Tesla GPU

AI Business

December 18, 2015

3 Min Read

NVIDIA announced last week that Facebook will power its next-generation computing system with the NVIDIA® Tesla® Accelerated Computing Platform, enabling it to drive a broad range of machine learning applications.

Facebook is the first company to adopt NVIDIA Tesla M40 GPU accelerators, introduced last month, to train deep neural networks. They will play a key role in the new “Big Sur” computing platform, Facebook AI Research’s (FAIR) purpose-built system designed specifically for neural network training.

“Deep learning has started a new era in computing,” said Ian Buck, vice president of accelerated computing at NVIDIA. “Enabled by big data and powerful GPUs, deep learning algorithms can solve problems never possible before. Huge industries from web services and retail to healthcare and cars will be revolutionized. We are thrilled that NVIDIA GPUs have been adopted as the engine of deep learning. Our goal is to provide researchers and companies with the most productive platform to advance this exciting work.”

We caught up with NVIDIA’s Will Ramey, Senior Product Manager for the new GPU, to find out more about what this development means for the future of AI and the new practical applications that are starting to open up.

Will, why do you think we’re seeing such rapid improvement in AI technologies and applications?

“We’re in the middle of the Big Bang moment of AI: we now have deep neural networks, the explosion of big data and, thanks to the leap in processing power from enhanced GPUs, the full package needed to drive a real shift in the development of commercial, real-world AI applications.”

We’re seeing more open-source projects appear from some big names in the industry, including Google with TensorFlow, Microsoft’s Computational Network Toolkit (CNTK), and of course IBM Watson’s expanding partner ecosystem. Do you feel open source is the key to unlocking full AI?

“We’re still a long way off from full human-level AI; the current technologies are in the very early years of what could be termed true AI, but progress is advancing rapidly. Open source is a real opportunity to encourage more participants, and thus continuous improvement of the deep neural networks that are necessary to reach that end goal. The ability we now have to improve each of the three key ingredients means we have the best chance of realising the full potential of AI.”

The new Tesla M40 GPU brings significant improvements, enabling much faster machine learning (10-20x). Can you share some use cases outside of Facebook where this can have a real impact?

“There are many examples where the faster processing power of the GPU is enabling a range of AI solutions within the enterprise. We work with large organisations that are end users of the technologies, such as JP Morgan, and we also partner with third-party providers such as MetaMind, who are revolutionising image recognition for medical applications.”

You’ve mentioned how open-sourcing AI is a real opportunity to speed the advancement of real-world applications. Where is NVIDIA focusing its efforts in the future?

“Collaboration is key: researchers must work together to develop new methods of machine learning, and we must also keep pace with the huge increases in readily available data each day. Improving the design of our GPUs and offering a Deep Learning SDK, so that engineers can make the most of parallel computing, will be a key focus in how we drive the future of AI.”
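As a rough illustration of the kind of parallelism Ramey refers to, here is a minimal CUDA sketch (not drawn from NVIDIA’s Deep Learning SDK): a single vector operation spread across thousands of GPU threads, the same pattern that GPU-accelerated deep learning libraries exploit at much larger scale.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// SAXPY: y = a*x + y, computed with one GPU thread per element.
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;  // one million elements
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    saxpy<<<blocks, threads>>>(n, 2.0f, x, y);  // thousands of threads run in parallel
    cudaDeviceSynchronize();

    printf("y[0] = %f\n", y[0]);  // expect 4.0
    cudaFree(x);
    cudaFree(y);
    return 0;
}
```

Neural network training consists largely of operations like this (and matrix multiplications) repeated billions of times, which is why spreading the work across a GPU’s many cores delivers the speed-ups discussed above.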

It’s clear that we’re at a technological tipping point, with more and more use cases for AI technology arising every day. It’s an extremely exciting time to watch how the practical applications unfold.
