London’s top cop shows a firm grasp of the subject
Metropolitan Police commissioner Cressida
Dick has issued a stark warning about the use of artificial intelligence in policing.
She said reliance on technologies like facial
recognition could turn the UK into an “Orwellian, omniscient police state,” unless
the industry can resolve the issues around the ethics of AI.
She acknowledged that AI-based tools had
the potential to help fight crime but said the use of such tools had to be
accompanied by strict rules to prevent abuse.
Dick made the comments in a speech delivered
at the Lowy Institute, an independent Australian think tank that conducts policy-relevant
research about international political, strategic and economic issues.
“We’re now tiptoeing into a world of robotics, AI and machine learning ... the next step might be predictive policing,” she said, as reported by the Sydney Morning Herald.
“People are starting to get worried about
that ... particularly because of the potential for bias in the data or the
algorithm, [like] live facial recognition software.”
Facial recognition technology is not currently employed by the Met, but such systems have started popping up on large swathes of privately-owned land, especially in central London.
For example, Argent, the firm responsible for the redevelopment of the area around King’s Cross station, installed facial recognition cameras back in 2015, but halted the controversial project in March 2018. Argent is currently under investigation by the UK’s Information Commissioner’s Office (ICO) to establish whether the project complied with the European Union’s General Data Protection Regulation (GDPR).
Canary Wharf, one of the capital’s two major financial districts, is working on getting its own facial recognition camera network but says it would be used only when a specific threat has been identified.
Other countries are more accepting of facial recognition tech: China, for example, has about 200 million surveillance cameras deployed, and millions of them are being equipped with facial recognition software that draws on a state database containing profiles on nearly every one of China’s 1.4 billion citizens.
Ken Marsh, the chairman of the Metropolitan Police Federation, previously said that China’s use of facial recognition was “spot on” and should be replicated in London. It is important to note that Britain has more surveillance cameras per person than any other country except China.
“The real problem
with deep-learning AI solutions being used in policing is that they have the
potential to make discriminatory decisions without being detected,” explained Ben
Taylor, CTO at Rainbird, which develops AI-powered automation technologies.
Convolutional Neural Networks, which are networks of weighted nodes, power many image-recognition cameras, yet it is difficult to know which features of an image they extract in order to classify it. Because this kind of AI learns from patterns in data without the need for human intervention, it leaves behind no methodology for how it made a decision.
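The black-box concern can be sketched in a few lines. This is a hypothetical toy network of weighted nodes (random, untrained weights standing in for a real convolutional classifier), not any system the Met or Rainbird actually uses; the point is that the caller receives a label with no accompanying rationale.

```python
import numpy as np

# Toy stand-in for an image classifier: two layers of weighted nodes.
# The weights are random and illustrative, not trained on real data.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(8, 4))   # hidden-layer weights (learned, opaque)
W2 = rng.normal(size=(4, 2))   # output weights: "match" vs "no match"

def classify(pixels):
    """Return a class label, but no human-readable rationale."""
    hidden = np.maximum(pixels @ W1, 0)   # ReLU over learned features
    scores = hidden @ W2
    return int(np.argmax(scores))         # a decision with no methodology attached

face = rng.normal(size=8)                 # stand-in for image pixels
label = classify(face)
# All the caller ever sees is 0 or 1 — nothing in the output explains
# which features of the input drove the choice.
```

Inspecting `W1` and `W2` directly does not help: the numbers encode no human-interpretable rule, which is exactly the auditing problem Taylor describes.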
“The only solution is
to put humans back in the loop by transforming machine-learning from a black
box into a glass house that operates according to human logic. Human-centric,
rules-based AIs will enable humans to audit every decision they make without
the need for external scrutiny, because they explain their decisions in human terms.”
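A rules-based “glass box” of the kind Taylor describes can be sketched as explicit human-written rules whose firing is recorded with every decision. The rule names, thresholds, and profile fields below are all hypothetical, chosen only to illustrate the auditability point.

```python
# Hypothetical rules engine: each rule is explicit human logic, and every
# decision carries a record of exactly which rules fired and why.
RULES = [
    ("watchlist_match", lambda p: p["similarity"] >= 0.95,
     "similarity score meets the 0.95 threshold"),
    ("recent_sighting", lambda p: p["days_since_report"] <= 30,
     "subject was reported within the last 30 days"),
]

def decide(profile):
    """Return a flag plus the audit trail that justifies it."""
    fired = [(name, why) for name, test, why in RULES if test(profile)]
    flag = len(fired) == len(RULES)   # flag only if every rule agrees
    return flag, fired

flag, reasons = decide({"similarity": 0.97, "days_since_report": 12})
# 'reasons' lists each rule that fired, so a human can audit the decision
# without any external tooling — the logic is the explanation.
```

Unlike the neural-network case, a discriminatory rule here would be visible in plain text, which is the contrast Taylor draws between black-box learning and human-centric, rules-based AI.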