The pair intend to put ML ‘within milliseconds of the global online population’
Content delivery network (CDN) operator Cloudflare is teaming up with Nvidia to bring AI to the edge of the network, at scale.
The pair revealed intentions to create a “massive platform” on which developers can deploy applications that use pre-trained or custom machine learning models.
“By leveraging the TensorFlow platform developers can use familiar tools to build and test machine learning models, and then deploy them globally onto Cloudflare’s edge network,” the partners said in the announcement.
Together, Cloudflare and Nvidia aim to put machine learning within milliseconds of the global online population, enabling high-performance, low-latency AI applications to be deployed by anyone.
Because the ML models themselves will remain in Cloudflare’s data centers, developers can deploy custom models without the risk of putting them on end-user devices.
"Cloudflare Workers is one of the fastest and most widely adopted edge computing products with security built into its DNA," Matthew Prince, co-founder and CEO of Cloudflare, said.
"Now, working with Nvidia, we will be bringing developers powerful artificial intelligence tools to build the applications that will power the future.”
Cloudflare already applies ML for business intelligence, bot detection, and anomaly identification, making use of Nvidia’s hardware accelerators to “speed up training and inference tasks”, and will bring the same technology to any developer that uses Cloudflare Workers.
The Cloudflare announcement is the latest in a batch of news from Nvidia’s GPU Technology Conference (GTC) 2021, which also saw the unveiling of ‘Grace’, the company’s first ever server CPU, and the launch of an enterprise version of its Omniverse virtual environment platform.