IBM has announced that it is embarking on a project to build the world’s first large-scale, fault-tolerant quantum computer, setting the stage for practical and scalable quantum computing.
The plans, unveiled on 10th June, strengthen IBM’s position in quantum computing: the company already operates a fleet of quantum computers and last year opened its first quantum computing centre outside the US, in Ehningen, Germany.
IBM Quantum Starling, expected to be delivered by 2029, will be built in a new quantum data centre in Poughkeepsie, New York. Figures provided by the company suggest it will be capable of 20,000 times more operations than existing quantum computers, and the hope is that this capability will accelerate work in areas such as drug development, chemistry, and materials discovery.
In the announcement, Arvind Krishna, Chairman and CEO of IBM, said the company was “charting the next frontier in quantum computing”.
Tim Hollebeek, Vice President of Industry Standards at DigiCert, shared his thoughts on the news below.

How realistic is IBM’s prediction?
IBM is a leader in this area, and not a company that normally hypes its news. This is a fast-moving industry, and success is certainly possible.
Are there any factors that might help or hinder IBM’s timetable?
While there are no huge theoretical obstacles, IBM is attempting to do something that no one has ever done before, and will almost certainly run into challenges. But at this point it is largely an engineering scaling exercise, not a research project.
What are the major obstacles that exist today that must be overcome to bring a quantum computer to market?
Precisely the ones IBM is attempting to tackle. Current quantum computers are too small, and the qubits aren’t stable enough. Progress is needed on both fronts.
Are there alternative architectures for a quantum computer that could compete with IBM’s approach?
There are a number of competing architectures, but the approach that IBM is using is the least speculative and farthest along at this time.
What are some of the advantages of a large-scale, fault-tolerant quantum computer?
Simulating quantum mechanics is very hard on classical computers, so some of the most exciting opportunities use a quantum computer’s quantum nature to better simulate quantum effects in drug discovery, materials design, and so on. However, it’s worth noting that IBM’s goals are also dangerously close to what it would take to break modern cryptography, reinforcing the consensus that all systems need to move to quantum-safe encryption algorithms by the early 2030s.
Does IBM’s announcement add urgency to security experts’ calls for organisations to protect their data with post-quantum encryption?
Absolutely. The year 2029 was already a milestone the industry was marching toward, driven by Gartner’s guidance, and IBM’s planned delivery of Quantum Starling by then underscores that timeline. It adds weight to the call for organisations to accelerate post-quantum readiness and adopt cryptographic agility now, not later.
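To make the cryptographic agility point concrete, below is a minimal, hypothetical Python sketch (using the third-party cryptography package) of what an agile key-establishment layer can look like: application code calls a generic key-encapsulation interface, the concrete algorithm is selected by name from a registry, and a quantum-safe mechanism such as ML-KEM can later be registered alongside, or in place of, the classical one without rewriting callers. The interface, names, and registry here are illustrative assumptions, not anything specified by IBM or DigiCert.

```python
# A minimal sketch of "cryptographic agility": the application codes against a
# generic key-encapsulation interface, and the concrete algorithm is chosen by
# name from a registry. Swapping in a post-quantum KEM later means registering
# a new provider, not rewriting callers. Requires the 'cryptography' package.

from dataclasses import dataclass
from typing import Callable, Dict, Tuple

from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric.x25519 import (
    X25519PrivateKey,
    X25519PublicKey,
)
from cryptography.hazmat.primitives.kdf.hkdf import HKDF


@dataclass
class KEM:
    """Generic key-encapsulation interface the application uses."""
    generate_keypair: Callable[[], Tuple[object, bytes]]   # -> (private key, public bytes)
    encapsulate: Callable[[bytes], Tuple[bytes, bytes]]    # peer public -> (ciphertext, shared key)
    decapsulate: Callable[[object, bytes], bytes]          # (private key, ciphertext) -> shared key


def _derive(shared: bytes) -> bytes:
    # Derive a fixed-length session key from the raw shared secret.
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"pq-agility-demo").derive(shared)


def _x25519_keypair():
    priv = X25519PrivateKey.generate()
    pub = priv.public_key().public_bytes(
        serialization.Encoding.Raw, serialization.PublicFormat.Raw)
    return priv, pub


def _x25519_encapsulate(peer_public: bytes):
    # DHKEM-style: an ephemeral public key plays the role of the KEM ciphertext.
    eph = X25519PrivateKey.generate()
    ciphertext = eph.public_key().public_bytes(
        serialization.Encoding.Raw, serialization.PublicFormat.Raw)
    shared = _derive(eph.exchange(X25519PublicKey.from_public_bytes(peer_public)))
    return ciphertext, shared


def _x25519_decapsulate(private: X25519PrivateKey, ciphertext: bytes) -> bytes:
    return _derive(private.exchange(X25519PublicKey.from_public_bytes(ciphertext)))


# Registry of available mechanisms; a quantum-safe KEM (e.g. ML-KEM) would be
# added here once a vetted implementation is available, with no change to callers.
KEMS: Dict[str, KEM] = {
    "x25519": KEM(_x25519_keypair, _x25519_encapsulate, _x25519_decapsulate),
    # "ml-kem-768": KEM(...)  # hypothetical post-quantum provider, plugged in when ready
}


if __name__ == "__main__":
    kem = KEMS["x25519"]                      # algorithm chosen by configuration
    private, public = kem.generate_keypair()  # receiver's key pair
    ct, sender_key = kem.encapsulate(public)  # sender establishes a session key
    receiver_key = kem.decapsulate(private, ct)
    assert sender_key == receiver_key
    print("shared session key established:", sender_key.hex())
```

The detail that matters here is the indirection: the migration cost is concentrated in the provider entry, and a hybrid mode combining a classical and a post-quantum mechanism could be added to the same registry in the same way.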