When the Austrian physicist Erwin Schrödinger proposed his famous cat-in-a-box thought experiment, concluding that the cat could be simultaneously dead and alive until an observer opened the box, the seeming absurdity highlighted a much wider point: quantum systems can exist in multiple states until they are observed or measured.
Fast forward to the 21st century and the relevance of that insight is becoming increasingly clear, with quantum computing eventually set to supersede the classical (binary) computing we use today.
But what is it? And how far off is it?
The key difference between today’s classical computers and the quantum computational machines of the future is that the former store information only in binary format (1s and 0s), so each bit is either ‘on’ or ‘off’ at any given moment.
Quantum computation, on the other hand, takes advantage of the fact that, at the quantum level, sub-atomic particles can exist in more than one state at the same time.
On? Off? Or on and off?
While bits are used as the units of information in classical computing, quantum machines encode information as qubits (quantum bits). These units can be ‘on’ or ‘off’, but they can also be both ‘on’ and ‘off’ at once, in a superposition of the two states.
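To make superposition concrete, here is a minimal sketch using Qiskit, the open-source toolkit mentioned later in this article (exact import paths may vary between Qiskit versions): a single Hadamard gate puts one qubit into an equal mix of ‘on’ and ‘off’.

```python
# Minimal superposition sketch with Qiskit (assumes Qiskit is installed;
# import paths may differ slightly between versions).
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(1)        # one qubit, starting in the 'off' state |0>
qc.h(0)                       # Hadamard gate: equal superposition of |0> and |1>

state = Statevector.from_instruction(qc)
print(state.probabilities())  # [0.5, 0.5] -> equally 'on' and 'off' until measured
```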
What this means in practice is that, because of the way particles behave at the quantum level, certain operations can be performed significantly more quickly, whilst using less energy, than their classical computing counterparts.
While the jury is still out as to when quantum computing will enter the commercial sphere, extensive research is being undertaken by a range of stakeholders.
In FinTech, the most obvious application is portfolio optimisation and determining the financial risk associated with an investment portfolio.
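As a rough, purely classical illustration of what portfolio optimisation involves (the asset figures below are invented, and this naive random search is only a sketch of the return-versus-risk trade-off that quantum optimisation research targets):

```python
# Toy classical mean-variance portfolio search (illustrative only; all figures invented).
import numpy as np

rng = np.random.default_rng(0)

expected_returns = np.array([0.06, 0.04, 0.09])      # annual expected returns of three assets
covariance = np.array([[0.10, 0.02, 0.04],
                       [0.02, 0.05, 0.01],
                       [0.04, 0.01, 0.20]])          # covariance of returns (the risk model)
risk_aversion = 2.0                                  # how heavily risk is penalised

best_score, best_weights = -np.inf, None
for _ in range(10_000):                              # naive random search over candidate portfolios
    w = rng.dirichlet(np.ones(3))                    # random weights that sum to 1
    score = w @ expected_returns - risk_aversion * (w @ covariance @ w)
    if score > best_score:
        best_score, best_weights = score, w

print("Best weights found:", np.round(best_weights, 3))
print(f"Expected return: {best_weights @ expected_returns:.2%}")
```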
While quantum computing is far from maturity compared with other emerging technologies such as artificial intelligence (AI) and virtual reality, a recent Accenture report found that 32 per cent of banking executives believe quantum computing will have the second-biggest impact (behind AI) on their organisation over the next three years.
Faster means more profitable
As Andrew Poppleton, financial services lead for Accenture UK and Ireland, put it: “The faster a bank can compute an algorithm, execute a transaction or make a decision, the more profitable it is likely to be.”
Financial institutions have been quick to cotton on to this, with several actively involved in collaborations with IBM Research, for example.
Using the company’s quantum computing cloud service - part of the IBM Q Network - a number of clients, including Barclays, JP Morgan Chase and some banks in Japan, are already testing and publishing results.
The Q Network provides access to hardware, software and education materials to help enterprises, startups and academics become quantum ready. While the hardware is still nascent, it is at a sufficiently mature level to begin experimenting with, according to Stefan Woerner, global leader for quantum finance and optimisation at IBM Research.
“In addition to the hardware, we have also developed Qiskit Finance, consisting of quantum algorithms and tutorials on different relevant financial applications to help users start their quantum journey; and it's all open source,” he stated.
Future proofing cryptography
One consequence of quantum computing is that it may render useless much of the cryptography currently used to secure systems across industries. However, Frost & Sullivan business and financial services principal Clare Walker argued that quantum computing could also be used to create cryptography that future-proofs that security.
“The design and automation of trading algorithms will be possible due to quantum computational pattern recognition, and it is ideal for the analysis and storage of the millions of data points generated on a regular basis by the world’s financial markets.
“That data could be used (more quickly) to better assess risk and be used in conjunction with other technologies, like AI, to analyse much larger data sets,” she noted.
According to Frost & Sullivan’s annual survey, quantum computing is expected to have a broad-ranging impact and to be adopted within the next decade, with respondents identifying it as one of the three biggest game changers of the period, alongside natural language interfaces and human brain-computer interfaces.
Monte Carlo or bust
Key to performing risk analysis, though, is the so-called Monte Carlo simulation. It is used to assess the impact of risk and uncertainty in forecasting, modelling the probability of different outcomes in processes that are hard to predict because of the random variables involved.
“Using a Monte Carlo simulation is often an overnight task,” explained Woerner, and in a worst-case scenario it can consume days of computing time. “The quadratic speedup provided by quantum computing may reduce calculation time from overnight to near-real time or from days to hours, respectively.
“Although it might take a few years until the hardware required to realise this speed-up becomes available at the required scale, the potential impact would be enormous,” he added.
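For context, here is what a classical Monte Carlo risk estimate looks like in Python (a toy sketch; the return model and portfolio value are invented). Its statistical error shrinks only with the square root of the number of samples, which is the cost that the quadratic speedup Woerner describes, via quantum amplitude estimation, is expected to cut.

```python
# Classical Monte Carlo value-at-risk sketch (illustrative; the return model is invented).
import numpy as np

rng = np.random.default_rng(42)

n_samples = 100_000                                  # classical error falls roughly as 1/sqrt(n_samples)
portfolio_value = 1_000_000
daily_returns = rng.normal(0.0005, 0.02, n_samples)  # toy model of daily portfolio returns

losses = -portfolio_value * daily_returns
var_95 = np.percentile(losses, 95)                   # loss exceeded on only 5% of simulated days

print(f"Estimated 1-day 95% VaR: {var_95:,.0f}")
```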
In the meantime, interested companies will have to be content with playing a waiting game, even while ramping up their own research.