Regulatory and consumer considerations among ideas proposed by Google, Bill.com and Shell.

Ben Wodecki, Jr. Editor

June 20, 2022



Computer vision is one of the core uses of AI. It allows systems to extract information from images or videos and turn it into insights for users.

It is used in everything from autonomous vehicles to Industry 4.0. For example, multinational energy giant Shell uses computer vision to monitor its structures.

Speaking on a panel at the AI Summit London, Amjad Chaudry, Shell’s capability center manager for data science and ML, said that his company uses computer vision to guide robots that navigate plants to monitor gauges.

Google and the Bank of England were among the other brands represented on the panel, which discussed future-proofing computer vision implementations and offering best practices to attendees.

Chaudry said the essential requirements for his business are embedding security across a project and keeping an eye on the ever-changing regulatory landscape.

Joining Chaudry on the panel, Arunita Roy, senior data and ML scientist at the Bank of England, said her team looks at other regulators to see what they’re doing and how they’re incorporating AI.

Impact: Safety and society

Also on the panel was Toju Duke, program manager for responsible AI at Google. Duke spoke about the need to ensure that products using AI and ML do not have a harmful impact on society.

She cited facial recognition as one example of Google acting to ensure safety: users can turn off Face Grouping, its facial recognition software, in apps like Google Photos.

Duke said it is important to try to get things right, but that businesses cannot get it right all the time. Being responsible, she said, requires transparency, explainability, maintaining data quality, and using tools and fairness metrics.

“We should be trying to be responsible not just because of regulation,” she said.

Eitan Anzenberg, chief data scientist at Bill.com, supported Duke’s point, saying businesses should not wait for governments to do something and that it's up to brands to "do the right thing."

Describing privacy considerations as a “complex issue,” Anzenberg said that he often tries to take consumers’ perspectives when building models, asking himself how they might feel about what he’s doing end to end.

“Personal responsibility is very important,” he concluded.

About the Author(s)

Ben Wodecki

Jr. Editor

Ben Wodecki is the Jr. Editor of AI Business, covering a wide range of AI content. Ben joined the team in March 2021 as assistant editor and was promoted to Jr. Editor. He has written for The New Statesman, Intellectual Property Magazine, and The Telegraph India, among others. He holds an MSc in Digital Journalism from Middlesex University.

