Facebook joins AI chipset race
February 19, 2019
MENLO PARK, CA - The race to develop powerful dedicated AI processors is heating up, as Facebook announces it will join the likes of Huawei, Baidu, Apple, and Arm in building a new class of semiconductors designed specifically for deep learning workloads.
AI is a central focus for Facebook, with implications for everything from the way users message and interact with brands to the structure of the content News Feed and ad personalization. As demand for these AI tools has grown, the company is looking for new ways to continue building on its capabilities. This includes developing custom chips to support its AI programs, as Facebook's Chief AI Scientist and early AI pioneer Yann LeCun revealed this week.
Speaking with the FT ahead of the publication of a research paper exploring his vision for the future of AI hardware, LeCun suggested Facebook will begin working on its own custom ASIC (application-specific integrated circuit) chips to support its AI ambitions. It will collaborate with a number of chip companies on new designs, and will also look at building and open-sourcing its own architecture.
"Facebook has been known to build its hardware when required - build its own ASIC, for instance. If there's any stone unturned, we're going to work on it," LeCun told the FT.
The chips, explained LeCun, will play an integral role in Facebook's AI strategy by enabling the use of even more neural nets across the company's data centres. AI is crucial for Facebook in fighting fake news, delivering speech translation, and performing facial and video recognition-based data analysis. Facebook's chips will even have a role to play in on-device AI - running models directly on users' hardware rather than in the cloud - an approach LeCun claims is critical to the future of the technology, and one which requires a new generation of semiconductors.
Whereas the most common chips for training neural networks are GPUs (graphics processing units) from companies like NVIDIA, LeCun says these are ill-suited to running AI algorithms once they have been trained. He argued that future AI chipset designs will have to handle information more efficiently.
Source: The Financial Times