Quantum computers will start delivering business benefits for banks when they can process thousands of variables

Berenice Baker, Editor

June 15, 2023


At a Glance

  • Quantum computers will start delivering business benefits for banks when they can process thousands of variables.

The financial services use cases for quantum computing demonstrated to date have little utility for the industry, but more powerful future machines will deliver value, Barclays head of machine learning technologies Dimitrios Emmanoulopoulos said at the Quantum Computing Summit London.

“The most important thing that we need for financial institutions is the use cases,” he said. “Because we always say adoption, adoption, adoption, what we need to understand is that use cases are required to accelerate that adoption.”

Emmanoulopoulos said while quantum computing can demonstrate exciting results for a set of optimization problems for some types of AI and machine learning workloads, commercial adoption needs to provide measurable, actionable insights.

“Everybody says the most common use case for quantum computers is portfolio optimization. We can do portfolio optimization in 10 seconds, two seconds, three seconds,” he said.

“But a big business unit like Barclays doesn’t change its portfolio optimization strategy every hour, day or even month. These are big plans for big business areas. Delivering results within seconds or nanoseconds does not play any role.”

Emmanoulopoulos said that current optimization technology works perfectly well for most financial optimization problems. But future quantum computers could optimize a typical portfolio of 120,000 assets, each with constraints, leading to an optimization problem of a thousand variables with 20 constraints, which is the type of portfolio optimization Barclays wants to do.
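To make the shape of such a problem concrete, here is a minimal sketch of the kind of constrained mean-variance (Markowitz-style) portfolio optimization the article refers to, using classical numerical tooling. The asset count, returns, covariances and risk-aversion value are invented for illustration; a bank-scale problem would simply have many more variables and constraints of the same form.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative mean-variance portfolio optimization.
# All numbers below are synthetic; this is not Barclays' method.
rng = np.random.default_rng(0)
n_assets = 5                      # a bank-scale problem would have thousands
mu = rng.uniform(0.02, 0.10, n_assets)           # expected returns (made up)
A = rng.standard_normal((n_assets, n_assets))
cov = A @ A.T / n_assets                         # positive semi-definite covariance

risk_aversion = 3.0

def objective(w):
    # Maximize  mu·w - (lambda/2) * w' Cov w  ->  minimize the negative
    return -(mu @ w - 0.5 * risk_aversion * w @ cov @ w)

constraints = [{"type": "eq", "fun": lambda w: w.sum() - 1.0}]  # fully invested
bounds = [(0.0, 1.0)] * n_assets                                # long-only

result = minimize(objective, x0=np.full(n_assets, 1 / n_assets),
                  bounds=bounds, constraints=constraints)
weights = result.x  # optimal portfolio weights, summing to 1
```

Each additional asset adds a variable, and each business rule (sector caps, turnover limits, and so on) adds a constraint, which is why problem size, rather than raw solve speed, is the bottleneck the article describes.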

“At the point that we are able to host these types of optimization problems for these types of use cases on a bank computer, that is where the adoption is going to come in,” he said. “If I have 500 qubits or fewer, we are far away from proper adoption. As an R&D function, that's fine, and I'm experimenting continuously.

“I am experimenting with any framework that is out there, from genuine quantum computers that use superposition and entanglement up to the annealers that can return up to 3,000 qubits, though they are unable to join them in the simulator board. In my view, I haven't come across any annealer technology that is faster than the state-of-the-art numerical algorithms that can run in Fortran or C++.

“A point that is important not only for quantum computing but for any technology: the latest and greatest and the fastest is not really what we are looking for. We’re always looking for the good enough. The good enough is the 8%, the 7% that has been tested, is bulletproof and has been there for quite some time in a production environment. When quantum computers become commercially available, they’re not going to be used for tier-one customer-facing applications, but they can be used for internal consumption.”

Read more about:

AI Summit, Conference News

About the Author(s)

Berenice Baker

Editor, Enter Quantum

Berenice is the editor of Enter Quantum, the companion website and exclusive content outlet for The Quantum Computing Summit. Enter Quantum informs quantum computing decision-makers and solutions creators with timely information, business applications and best practice to enable them to adopt the most effective quantum computing solution for their businesses. Berenice has a background in IT and 16 years’ experience as a technology journalist.
