AI Business is part of the Informa Tech Division of Informa PLC

Facebook joins AI chipset race

by Ciarán Daly
MENLO PARK, CA - The race to develop powerful dedicated AI processors is heating up, as Facebook announces it will be joining the likes of Huawei, Baidu, Apple, and Arm in building a new class of semiconductors specifically for deep learning workloads.

AI is a central focus for Facebook, with implications for everything from the way users message and interact with brands to the structure of the content newsfeed and ad personalization. As demand for these AI tools has grown, the company is looking for new ways to continue building on its capabilities. This includes developing custom chips to support its AI programs, as Facebook's Chief AI Scientist and early AI pioneer Yann LeCun revealed this week.

Speaking with the FT ahead of the publication of a research paper exploring his vision for the future of AI hardware, LeCun suggested Facebook will begin working on its own custom ASIC (application-specific integrated circuit) chips to support its AI ambitions. It will collaborate with a number of chip companies on new designs, as well as look at building and open sourcing its own architecture.

"Facebook has been known to build its hardware when required - build its own ASIC, for instance. If there's any stone unturned, we're going to work on it," LeCun told the FT.

The chips, explained LeCun, will play an integral role in Facebook's AI strategy by enabling the use of even more neural nets across the company's data centres. AI is crucial for Facebook in fighting fake news, delivering speech translation, and powering facial and video recognition-based data analysis. Facebook's chips will even have a role to play in on-device AI, an approach in which models run locally on the device rather than in the cloud, which LeCun claims is critical to the future of the technology and which requires a new generation of semiconductors.

Whereas the most common chips for training neural networks are GPUs (graphics processing units) from companies like NVIDIA, LeCun says these are ill-suited to running AI algorithms once they have been trained. He argued that future AI chipset designs will have to handle information more efficiently.

Source: The Financial Times

Image: Facebook
