Training Multivariate Risk Models: Quantum Computing Can Outperform Classical Methods

Quantum computing company IonQ and its partner GE Research (General Electric's research arm) are exploring the application of quantum computing to multivariate distribution modeling in risk management, and on June 23 they published early results in a paper [1].

Using a Quantum Circuit Born Machine (QCBM) framework trained on normalized historical index data, IonQ and GE Research were able to efficiently train quantum circuits to learn the correlations among three and four indices. In some cases, predictions derived from the quantum framework outperformed those of classical modeling approaches, suggesting that quantum copulas may enable smarter data-driven analysis and decision making in business applications.

Early results have laid the groundwork for developing new solutions to manage risk exposures.

A copula is a mathematical tool for modeling joint probability distributions. Because it lets one handle the marginal distribution of each variable and the interdependence among the variables separately, the copula has become a fundamental analytical tool on classical computers over the past 60 years, in fields ranging from quantitative finance and civil engineering to signal processing and medicine.
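
Formally, this separation is Sklar's theorem: any joint distribution H of variables X1, …, Xn factors into a copula C applied to the marginal distributions Fi,

```latex
H(x_1, \ldots, x_n) = C\bigl(F_1(x_1), \ldots, F_n(x_n)\bigr)
```

Since each Fi(Xi) is uniformly distributed, C captures the dependence structure alone, which is the part the quantum copula model is asked to learn.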

Forming a copula through classical mathematical approximation is a sound way to model multivariate risks, but the classical technique becomes inefficient when many variables must be modeled jointly with high precision. IonQ and GE Research therefore devised a new training strategy that keeps quantum computation producing good results even as the system scales.

The researchers successfully trained quantum copula models of up to four variables on Aria, IonQ's trapped-ion quantum computer, using readily available data from four representative stock indices (including the Dow Jones, the Chicago Board Options Exchange Volatility Index, the Nikkei 225, and the Russell 2000) drawn from a volatile market environment.
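
As a rough sketch of how such index data could be turned into a training target (an assumption about the data pipeline, not the paper's own code; the bin count and synthetic data are illustrative), one can rank-transform each index's returns to uniform marginals and bin the result into a joint histogram for the QCBM to reproduce:

```python
import numpy as np

def empirical_copula_histogram(samples: np.ndarray, n_bins: int = 8) -> np.ndarray:
    """Rank-transform each column to (0, 1) and bin the result into a
    joint histogram -- an empirical estimate of the copula density.

    samples: (n_observations, n_variables) array, e.g. daily returns of
    the indices. n_bins is illustrative; 8 bins = 3 qubits per variable.
    """
    n_obs, n_vars = samples.shape
    # Pseudo-observations: rank / (n + 1) maps each marginal to uniform.
    ranks = samples.argsort(axis=0).argsort(axis=0)
    u = (ranks + 1) / (n_obs + 1)
    # Joint histogram over the unit hypercube, normalized to a pmf.
    hist, _ = np.histogramdd(u, bins=n_bins, range=[(0.0, 1.0)] * n_vars)
    return hist / hist.sum()

# Synthetic correlated "returns" standing in for real index data.
rng = np.random.default_rng(0)
cov = np.array([[1.0, 0.6], [0.6, 1.0]])
returns = rng.multivariate_normal(np.zeros(2), cov, size=2000)
target = empirical_copula_histogram(returns, n_bins=8)
print(target.shape)  # (8, 8): target pmf for a 6-qubit, 2-variable QCBM
```

In this illustrative encoding, each variable occupies 3 qubits at 8 bins, so a four-variable copula model would need 12 qubits.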

The hybrid training loop of the Quantum Circuit Born Machine (QCBM) framework: (a) the n-variable copula ansatz whose parameters are trained; (b) the structure of each layer of Ui, where each Ui built from such layers acts on its corresponding variable; (c) at each iteration of the optimization loop, the ansatz with a given set of parameters is executed repeatedly on the quantum computer to collect statistical estimates of the output states, represented here as histograms. These are compared with the target histogram; the difference, quantified by the KL divergence, is used by the classical optimizer to drive the optimization. The cycle repeats until convergence.
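
In code, the loop in panel (c) might look like the following hedged sketch. Here run_ansatz is a hypothetical hook standing in for circuit execution and measurement-statistics collection on the quantum computer, and COBYLA is a stand-in gradient-free optimizer; the paper's actual optimizer and hyperparameters may differ.

```python
import numpy as np
from scipy.optimize import minimize

def kl_divergence(p, q, eps=1e-9):
    """KL divergence D(p || q) between target pmf p and model pmf q."""
    p, q = p.ravel(), q.ravel() + eps  # eps avoids log(0) on empty bins
    return float(np.sum(p * np.log((p + eps) / q)))

def train_qcbm(target, run_ansatz, n_params, maxiter=200, theta0=None):
    """Hybrid quantum-classical loop: run_ansatz(theta) executes the
    parameterized circuit and returns the measured pmf (hypothetical
    hook); a classical optimizer shrinks the KL divergence to target."""
    if theta0 is None:
        theta0 = np.random.default_rng(1).uniform(0, 2 * np.pi, n_params)

    def loss(theta):
        return kl_divergence(target, run_ansatz(theta))

    result = minimize(loss, theta0, method="COBYLA",
                      options={"maxiter": maxiter})
    return result.x, result.fun
```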

To address a key challenge of this approach (as the model grows, parameter optimization becomes more complex and training efficiency drops), the team introduced an annealing-inspired strategy that greatly improved training results. In their end-to-end tests, various configurations of the quantum model made predictions comparable to or better than the standard classical model on the risk aggregation task.
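
The exact annealing procedure is detailed in the paper [1]. Purely as one plausible illustration of the general pattern, the sketch below (reusing train_qcbm from the previous sketch) solves a sequence of easier problems: it optimizes against progressively sharper Gaussian-smoothed versions of the target histogram and warm-starts each stage with the previous stage's parameters. The schedule values are arbitrary.

```python
from scipy.ndimage import gaussian_filter

def annealed_training(target, run_ansatz, n_params,
                      sigmas=(2.0, 1.0, 0.5, 0.0), iters_per_stage=50):
    """Illustrative annealing schedule (an assumption, not the paper's
    recipe): train against progressively sharper smoothed targets,
    seeding each stage with the previous stage's parameters."""
    theta, kl = None, None
    for sigma in sigmas:
        stage = gaussian_filter(target, sigma) if sigma > 0 else target
        stage = stage / stage.sum()  # renormalize after smoothing
        theta, kl = train_qcbm(stage, run_ansatz, n_params,
                               maxiter=iters_per_stage, theta0=theta)
    return theta, kl
```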

The following figure compares annealing training with standard training, each run for 800 and for 200 iterations. For a given training method, more iterations always produced better results. More strikingly, annealing training yielded better results than standard training even with a quarter of the iterations, and the standard deviation of its results across randomly chosen initial parameters was smaller.

Comparison of the annealed and non-annealed methods: even with only 200 iterations, the annealed method outperforms the non-annealed method run for 800 iterations.

Admittedly, repeating the annealing step introduces non-trivial overhead compared to standard training; however, the researchers expect this overhead to remain constant as the complexity of the model increases.

This improvement demonstrates a promising approach to performing multivariate analysis faster and more accurately, which GE researchers hope will lead to new and better ways of assessing risk in key manufacturing processes such as product design, plant operations, and supply chain management [2].


David Vernooy, Senior Executive Officer and Head of Digital Technology at GE Research, said, "As we've seen from recent global supply chain volatility, the world needs more effective methods and tools to manage risk in situations where conditions are highly variable and interconnected. Our early results in IonQ's financial use cases show the tremendous potential of quantum computing to better understand and mitigate the risks associated with these highly variable scenarios."

References:

[1] https://ionq.com/links/Quantum_Computing_for_Risk_Aggregation_20220622.pdf

[2] https://ionq.com/news/june-23-2022-ionq-ge-research-risk-aggregation


2022-06-27