Quantum computing is at a stage of development familiar to many promising new technologies. The enormous potential of its use-cases is exciting industry professionals, sparking anticipatory software development in expectation of the hardware catching up. However, much like 5G, that expectation is frustrated by the difficulty of reaching the standard needed to make these use-cases a reality.

But that’s no reason to despair. Many companies are taking bold and interesting steps to bring the dream of general-purpose, error-corrected quantum machines to life – one step at a time.

Quantum, what is it good for?

Little stirs the imagination like the most exciting use-cases for quantum computing. They are the reasons why IDC predicts investment in quantum to jump 2,000 percent in coming years – from $412 million in 2020 to $8.6 billion in 2027.

The goal of quantum computing is to make complex computational tasks that are currently impossible, possible.

Take air travel and logistics, for example: there are millions of possible routes when flying a plane from A to B. Classical computing can prune a lot of them, but that pruning might also discard the optimal route.
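To see why pruning becomes unavoidable on classical machines, consider a simplified (hypothetical) logistics model in which one vehicle must visit every stop on a route: the number of possible orderings grows factorially, which quickly exceeds anything a classical computer can enumerate exhaustively.

```python
from math import factorial

# Toy illustration: with n stops to visit in some order, there are n!
# candidate routes. This is the combinatorial explosion that forces
# classical solvers to prune the search space.
def route_count(stops: int) -> int:
    return factorial(stops)

for n in (5, 10, 20):
    print(f"{n} stops -> {route_count(n):,} possible orderings")
```

At just 20 stops there are already more than 10^18 orderings, which is why classical approaches settle for heuristics rather than guaranteed-optimal answers.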

There is an inherent frustration in being unable to compute over large datasets with classical machines, settling instead for practical shortcuts. With quantum computing, you wouldn’t need to compromise.

Precision medicine is an area that quantum is set to disrupt entirely. It has the potential to overturn traditional expectations about the speed and effectiveness of clinical decision-making, raising the bar for patient care standards.

Quantum computing will improve the speed of disease diagnosis, treatment, and even prevention. Looking forward to the coming years and decades, this could improve the lives of billions – significantly reducing the distress and mortality rates of some of the world’s most deadly diseases. The value of this advancement will be measured in lives, not just in pounds and dollars.

What’s holding us back?

Realizing these applications will rely on qubits – quantum computing’s equivalent of the binary digit and the core component of the technology. These use-cases challenge us to produce enough qubits, of high enough quality, for practical applications.

Right now, qubits are generally unreliable and highly error-prone. As a result, many quantum machines are unstable and require intensive calibration and specialized software to reach an acceptable degree of accuracy. Some quantum companies are focusing on qubit quality to solve this problem, enabling quantum computing applications with far fewer qubits.
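A rough sketch of why per-qubit quality matters so much: if each gate operation fails independently with some small probability (a deliberately simplified model, not real quantum error dynamics), the chance of an error-free computation decays exponentially with circuit depth.

```python
# Simplified model: assume each of n_gates operations independently
# succeeds with probability (1 - p_error). The chance the whole
# computation stays error-free is then (1 - p_error) ** n_gates.
def survival_probability(p_error: float, n_gates: int) -> float:
    return (1.0 - p_error) ** n_gates

# Even a 0.1% per-gate error rate leaves only about 37% of runs
# error-free after 1,000 gates.
print(survival_probability(0.001, 1000))
```

This is why lowering per-gate error rates, rather than simply adding more qubits, can be the faster route to useful machines.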

We need to focus on delivering just enough qubits, with the right performance, to solve particular tasks. That is the idea behind application-specific quantum machines: designed and built for one job.

The dream of a universal, error-corrected quantum machine is of course appealing, but to achieve immediate progress, we must align our long-term goals with our current capabilities – taking this journey one step at a time and adopting a more patient attitude.

To connect hardware manufacturers and software developers, we need ‘middleware’ at the OS level that allows the two sides to talk to one another. This will tap into the culture of collaboration and openness that has brought quantum computing to where it is today.

Intellectual property represents a sizable bump in the road: manufacturers are – understandably – reluctant to fully open up their products to software companies. That reticence may, in time, result in hardware providers building their own OS, Windows-style.

The issue is not purely a software or a hardware problem, but a little of both. Solving it will require a skillset in high demand across the quantum field: serious interdisciplinary expertise.

The competing approaches to quantum hardware

Looking to the future of the quantum industry, we face diverging paths. Researchers and industry R&D teams are pursuing a variety of approaches to bring practical quantum machines closer to reality.

The two main approaches are superconducting circuits and trapped ions.

Compared to the trapped-ion approach, superconducting circuits appear to be a more practical solution, particularly when building a scalable quantum machine. They have a number of advantages: they’re far more achievable from an engineering perspective; they have gate times in the nanosecond range; and they’re faster at interacting, which may prove essential in compensating for real-time errors produced by the system.

All of this means that superconducting circuits represent a better choice for certain applications.

The other option, trapped ions, confines and suspends ions using electromagnetic fields. While promising, the approach currently struggles to initialize the ions’ motional states, and the phonon states involved have relatively short lifetimes.

As with every branching decision, you run risks either way, but the race is heating up daily and few businesses can afford to be out of it.

Looking to the future

Developing stable quantum computing is a challenge, but a worthy one. There are still barriers to overcome, especially in finding practical software solutions. To make all this happen, companies must invest with a clear strategy and recognize the value of interdisciplinary academic research.

Nevertheless, on balance, there is potential for the industry to work together and deliver on the nearly $700bn of value that McKinsey predicts will emerge from this new paradigm of computing by 2035. Whoever cracks the issue of stable quantum computing will have a whole array of opportunities open to them, and the rest of the world won’t be far behind.

Editor's Note: OpenOcean's investment portfolio includes IQM, a quantum computing startup.