Quantum computing will change the world; the industry has rightfully accepted this as fact. Until it does, however, we must contend with the limitations of the noisy intermediate-scale quantum (NISQ) machines we have today. Many use cases let us show customers how to solve complex business problems on actual NISQ quantum computers. Still, we often must accept that it will be some time before we have enough qubits of high enough fidelity to demonstrate true, benchmarkable advantage. Fault-tolerant machines are coming, but organizations must be willing to invest in learning how to code for them and then, depending on the use case, wait up to a couple of years to roll a solution into production. ROI becomes a waiting game.
What if there were a way to get ROI today while still training the workforce for a quantum tomorrow? Enter quantum-inspired approaches. These algorithms, techniques, and even hardware designs are based on the principles of quantum physics or quantum computing (or both) but run on classical, scalable systems. If this sounds like a contradiction, bear with me a moment; it will all make sense.
Quantum-inspired solutions can train Large Language Models (LLMs) faster and cheaper, provide explainability when making credit decisions and help spot flaws in production lines. They can accomplish all types of optimization and can perform impressive forecasting. We believe they will shake up the industry quickly this year.
At the heart of most quantum computing use cases is solving a classical problem with a quantum algorithm on quantum hardware. For example, if a company handles fraud detection with binary classification in machine learning using a support vector machine (SVM), it would try to solve the same problem on a gate-based quantum machine running a quantum SVM (QSVM). But when it runs out of usable qubits due to hardware limits, it runs out of room to add parameters or otherwise improve the model. It then has to settle for extrapolating how many qubits will be needed in the future to achieve a potential advantage over the classical SVM. With a quantum-inspired algorithm, it's often possible to skip that last step. Sticking with the SVM example, there is, in fact, a quantum-inspired SVM (QISVM). It runs on classical hardware, which can be deployed in the cloud at whatever scale the problem demands. QISVMs have been around since 2019.
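To make the starting point concrete, here is a minimal sketch of the classical baseline in that example: a linear SVM for binary "fraud" classification, trained by sub-gradient descent on the hinge loss. The synthetic data, hyperparameters, and tiny scale are all illustrative assumptions, and this is the classical SVM a QSVM or QISVM would be measured against, not the quantum-inspired algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "fraud" dataset: two Gaussian clusters, labels in {-1, +1}.
legit = rng.normal(loc=[-2, -2], scale=1.0, size=(100, 2))
fraud = rng.normal(loc=[+2, +2], scale=1.0, size=(100, 2))
X = np.vstack([legit, fraud])
y = np.concatenate([-np.ones(100), np.ones(100)])

# Linear SVM via sub-gradient descent on the regularized hinge loss:
#   minimize (lam/2)*||w||^2 + mean(max(0, 1 - y*(X@w + b)))
w = np.zeros(2)
b = 0.0
lam, lr = 0.01, 0.1
for epoch in range(200):
    margins = y * (X @ w + b)
    mask = margins < 1                     # points violating the margin
    grad_w = lam * w - (y[mask, None] * X[mask]).sum(axis=0) / len(y)
    grad_b = -y[mask].sum() / len(y)
    w -= lr * grad_w
    b -= lr * grad_b

preds = np.sign(X @ w + b)
accuracy = (preds == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

On well-separated clusters like these, the classifier converges quickly; real fraud data is far messier, which is exactly why richer (and eventually quantum or quantum-inspired) kernels become interesting.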
Another promising machine learning approach uses quantum-inspired convolutional neural networks (QICNNs). Since 2021, there have been examples of how these can outperform classical CNNs in some instances. This work builds on earlier quantum-inspired neurons from simple feed-forward networks. CNNs, often used for image recognition or classification, are getting less attention these days. LLMs like GPT are grabbing headlines and are based on transformers instead. However, LLMs may use CNNs as tools. Yes, AI is now using tools!
Other algorithms and use cases venture into optimization. The most common type is quantum-inspired annealing. With an actual quantum annealer, it is possible to map a problem like the traveling salesperson or portfolio optimization onto real qubits and use quantum tunneling to find the lowest-energy state, which encodes the answer. Think of this approach as examining all the peaks and valleys in the U.S. One could drive over them to find the lowest point, but it would be much faster to go straight through those hills. Annealers like the ones built by D-Wave allow for that type of tunneling. With quantum-inspired annealing, one can't tunnel as with a real annealer, but one can use thermal fluctuations to hop around quickly, all on classical hardware. It works well for some problems and not others, so trial and error is involved.
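That "hop around with thermal fluctuations" idea is essentially simulated annealing. As a sketch under toy assumptions (a small random QUBO as the stand-in optimization problem, a hand-picked cooling schedule), here is what it looks like on the binary quadratic problems annealers typically accept:

```python
import numpy as np

rng = np.random.default_rng(1)

# Tiny QUBO: minimize x^T Q x over binary vectors x. This is the standard
# input format for both quantum and quantum-inspired annealers.
n = 12
Q = rng.normal(size=(n, n))
Q = (Q + Q.T) / 2                      # symmetrize

def energy(x):
    return x @ Q @ x

# Simulated annealing: thermal fluctuations (random uphill moves accepted
# with probability exp(-delta/T)) stand in for quantum tunneling.
x = rng.integers(0, 2, size=n)
best_x, best_e = x.copy(), energy(x)
T = 2.0
for step in range(5000):
    candidate = x.copy()
    candidate[rng.integers(n)] ^= 1    # flip one random bit
    delta = energy(candidate) - energy(x)
    if delta < 0 or rng.random() < np.exp(-delta / T):
        x = candidate
        if energy(x) < best_e:
            best_x, best_e = x.copy(), energy(x)
    T *= 0.999                         # cool down slowly

print("best energy found:", best_e)
```

High temperature early on lets the search hop over hills; as T falls, it settles into a low valley. Whether that valley is the global minimum depends on the problem and the schedule, hence the trial and error mentioned above.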
Tensor networks—inspired by quantum physics
A tensor is a mathematical object that can represent complex multidimensional data. To create a tensor network, factorize a large tensor into a network of smaller tensors, thereby reducing the number of parameters and computational complexity. The tensors are connected by links that represent relationships between the subsets of data. Tensor networks are inspired by quantum physics, not quantum computing. The networks can model quantum states, including representing entanglement as graphical diagrams.
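As a concrete, NumPy-only sketch of that factorization idea, the snippet below builds a low-rank 8×8×8 tensor and splits it into a "tensor train" of three small connected cores via successive SVDs; the sizes and ranks are illustrative assumptions chosen so the compression is exact.

```python
import numpy as np

rng = np.random.default_rng(2)

# A low-rank 8x8x8 tensor built from small factors (so it compresses well).
G1 = rng.normal(size=(8, 2))            # core 1: (i, r1)
G2 = rng.normal(size=(2, 8, 2))         # core 2: (r1, j, r2)
G3 = rng.normal(size=(2, 8))            # core 3: (r2, k)
T = np.einsum('ia,ajb,bk->ijk', G1, G2, G3)

def tensor_train(T, rank):
    """Factorize a 3-way tensor into a chain of cores by successive SVDs."""
    d1, d2, d3 = T.shape
    # First unfolding: rows indexed by i, columns by (j, k).
    U, s, Vt = np.linalg.svd(T.reshape(d1, d2 * d3), full_matrices=False)
    core1 = U[:, :rank]                                   # (d1, r)
    rest = (np.diag(s[:rank]) @ Vt[:rank]).reshape(rank * d2, d3)
    # Second unfolding: rows indexed by (r1, j), columns by k.
    U2, s2, Vt2 = np.linalg.svd(rest, full_matrices=False)
    core2 = U2[:, :rank].reshape(rank, d2, rank)          # (r, d2, r)
    core3 = np.diag(s2[:rank]) @ Vt2[:rank]               # (r, d3)
    return core1, core2, core3

c1, c2, c3 = tensor_train(T, rank=2)
T_approx = np.einsum('ia,ajb,bk->ijk', c1, c2, c3)

full_params = T.size
tt_params = c1.size + c2.size + c3.size
err = np.linalg.norm(T - T_approx) / np.linalg.norm(T)
print(f"{full_params} parameters -> {tt_params}, relative error {err:.2e}")
```

The "links" in the network are the shared rank indices between neighboring cores; here 512 raw parameters shrink to 64, and at realistic scales that gap is where the machine learning cost savings come from.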
Tensor networks are becoming popular because of their use in machine learning. They can work with complex data and perform dimensionality reduction and feature extraction—think faster and cheaper compute for ML or Monte Carlo simulations. Notably, they can bring cost and performance benefits to the currently expensive methods for training LLMs.
A digital annealer is a chip that solves the types of combinatorial optimization problems addressed above but does so by emulating quantum annealing with classical hardware and software techniques. These devices have advantages over conventional and quantum computers as they can handle large-scale problems with thousands of variables and constraints without requiring complex encoding or decomposition techniques. A digital annealer can also operate at room temperature and consume less power than quantum computers that require cryogenic cooling and superconducting circuits.
While we march towards provable quantum advantage, we expect quantum-inspired approaches to solve real business problems with an edge today.
Read the results of our new Global IT Executive Survey: The Innovation vs. Technical Debt Tug-of-War.
To learn more about our emerging technology solutions, contact us.