Industrial AI Summit: AI at the edge is about data, not infrastructure

by Ben Wodecki
Deciding whether to deploy AI at the edge of the network or in a cloud data center should be judged on use cases, with data being a key consideration, speakers told the audience at the Industrial AI Summit earlier this week.

During a panel discussion on the risks and opportunities of deploying AI in production, Data Analysis Bureau CEO Eric Topham emphasized that every decision should revolve around the application, and what that company is trying to achieve with its data.

“It really depends on what you define as the ‘edge,’” he said, adding that the quality of the data in question “is the be-all and end-all.”

Matteo Dariol, innovation strategist at Bosch Rexroth, described AI deployment as "the art of asking the right question."

“If you’re asking the wrong question, you’re never going to get the right result. It’s crucial for any AI project to have a lot of different stakeholders sitting at the table. It’s a team effort.”

All you need is good data, a good cloud, and a good edge

Dariol said that while some end-users may see edge deployments as just another aspect of IT to manage, more are realizing the need to take care of the data that goes in and out of their AI models.

“People from around the company need to come together to decide which use cases the firm is trying to solve – rather than placing the onus on one person,” he added.

Omdia analyst Alex West, who chaired the discussion, said that the application and the data involved are the factors that determine whether a deployment belongs in the cloud or at the edge.

“For some, sensitive data, such as custom or ingredient data, will have to stay local. However, with other datasets, e.g. asset health, there's a greater comfort level with using the cloud.”

West also cited latency requirements, along with the cost and the level of analytics needed, as important considerations.

“The different edges can include anything from a sensor with [an] embedded processor (e.g. a Raspberry Pi), through controllers, and on to on-prem servers,” he added.
