IBM plans to build its first fault-tolerant quantum computer by 2029

WHY THIS MATTERS IN BRIEF

Having ultra-powerful computers is great, but if they aren’t accurate then they’re just shelfware.

 

Love the Exponential Future? Join our XPotential Community, future-proof yourself with courses from XPotential University, read about exponential tech and trends, connect, watch a keynote, or browse my blog.

This week, IBM announced a pair of shiny new quantum computers. The company’s 1,121-qubit Condor processor is the first quantum chip of its kind with over 1,000 qubits, a feat that would have made big headlines just a few years ago. But earlier this year, a startup, Atom Computing, unveiled a 1,225-qubit quantum computer using a different approach. And although IBM says Condor demonstrates it can reliably produce high-quality qubits at scale, it’ll likely be the largest single chip the company makes until sometime next decade.

 


Instead of growing the number of qubits crammed onto each chip, IBM will focus on getting the most out of the qubits it has – which is why it recently built a deliberately noisy quantum computer, ironic as that sounds. In this respect, the second chip announced, Heron, is the future.

Though Heron has fewer qubits than Condor – just 133 – it’s significantly faster and much less error-prone than its cousin. The company plans to combine several of these smaller chips into increasingly powerful systems, a bit like the multicore processors powering smartphones. The first of these, System Two, also announced this week, contains three linked Heron chips.

 


IBM also updated its quantum roadmap, a timeline of key engineering milestones, through 2033. Notably, the company is aiming to complete a fault-tolerant quantum computer by 2029. The machine won’t be large enough to run complex quantum algorithms, like Shor’s algorithm, which is expected to one day break standard encryption. Still, it’s a bold promise.

 


Practical quantum computers will be able to tackle problems that can’t be solved using classical computers. But today’s systems are far too small and error-ridden to realize that dream. To get there, engineers are working on a solution called error-correction.

A qubit is the fundamental unit of a quantum computer. In your laptop, the basic unit of information is a 1 or 0 represented by a transistor that’s either on or off. In a quantum computer, the unit of information is 1, 0, or – thanks to quantum weirdness – some combination of the two. The physical component can be an atom, electron, or tiny superconducting loop of wire.
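
For readers who want the notation, a qubit’s state is conventionally written as a weighted blend – a superposition – of the two basis states, and measuring it collapses that blend to a plain 0 or 1:

```latex
% A qubit state as a superposition of the basis states |0> and |1>.
% Measurement yields 0 with probability |alpha|^2 and 1 with probability |beta|^2.
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
\qquad |\alpha|^2 + |\beta|^2 = 1
```

The amplitudes α and β are complex numbers, which is where the “some combination of the two” richness comes from.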

Opting for the latter, IBM makes its quantum computers by cooling superconducting circuits known as transmons to temperatures near absolute zero and placing them into quantum states. Here’s the problem: qubits are incredibly fragile, easily falling out of these quantum states mid-calculation. This introduces errors that make today’s machines unreliable.

One way to solve this problem is to minimize errors in the first place, and IBM, like Microsoft, has made progress here. Heron uses new hardware to significantly speed up how quickly the system places pairs of qubits into quantum states – an operation known as a “gate” – limiting the number of errors that crop up and spread to neighbouring qubits, an effect researchers call “crosstalk.”
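
To make “gate” concrete, here’s a minimal sketch using Qiskit, IBM’s open source quantum SDK. It’s purely illustrative – nothing Heron-specific – and simply builds a tiny circuit in which a two-qubit gate, a CNOT, entangles a pair of qubits:

```python
# A minimal, illustrative Qiskit circuit - not Heron-specific.
from qiskit import QuantumCircuit

qc = QuantumCircuit(2, 2)     # two qubits, two classical bits for readout
qc.h(0)                       # single-qubit gate: put qubit 0 into superposition
qc.cx(0, 1)                   # two-qubit gate (CNOT): entangle qubits 0 and 1
qc.measure([0, 1], [0, 1])    # measure both qubits

print(qc.draw())              # print a text drawing of the circuit
```

Every such two-qubit operation is a chance for errors to creep in, which is why speeding gates up and suppressing crosstalk matters so much.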

 


“It’s a beautiful device,” Jay Gambetta, vice president of IBM Quantum, told Ars Technica. “It’s five times better than the previous devices, the errors are way less, [and] crosstalk can’t really be measured.”

But you can’t totally eliminate errors. In the future, redundancy will also be key.

By spreading information between a group of qubits, you can reduce the impact of any one error and also check for and correct errors in the group. Because it takes multiple physical qubits to form one of these error-corrected “logical qubits,” you need an awful lot of them to complete useful calculations. This is why scale matters.
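
To see the redundancy idea in miniature, consider the classical analogue: a three-bit repetition code with majority voting. Real quantum error-correction is far subtler – qubits can’t simply be copied – but the spread-the-information-and-vote intuition carries over. A toy sketch:

```python
import random

def encode(bit):
    """Spread one logical bit across three physical bits."""
    return [bit, bit, bit]

def noisy_channel(bits, error_rate=0.05):
    """Flip each physical bit independently with some probability."""
    return [b ^ (random.random() < error_rate) for b in bits]

def decode(bits):
    """Majority vote: a single flipped bit is outvoted by the other two."""
    return int(sum(bits) >= 2)

# Any single physical-bit error no longer corrupts the logical bit.
received = noisy_channel(encode(1))
print(received, "->", decode(received))
```

The price is overhead – three physical bits per logical bit here, and far more physical qubits per logical qubit in real quantum schemes – which is exactly why scale matters.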

Software can also help. IBM is already employing a technique called error mitigation, announced earlier this year, in which it simulates likely errors and subtracts them from calculations. They’ve also identified a method of error-correction that reduces the number of physical qubits in a logical qubit by nearly an order of magnitude. But all this will require advanced forms of connectivity between qubits, which could be the biggest challenge ahead.
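
One widely used flavour of error mitigation is zero-noise extrapolation: run a circuit at deliberately amplified noise levels, then extrapolate the measured results back to an estimated “zero noise” answer. The sketch below uses made-up numbers purely to show the shape of the trick; it isn’t IBM’s exact method:

```python
import numpy as np

# Hypothetical expectation values measured at amplified noise levels.
noise_factors = np.array([1.0, 2.0, 3.0])     # 1x, 2x, 3x the native noise
measured      = np.array([0.82, 0.67, 0.54])  # made-up noisy results

# Fit a straight line and read it off at noise factor 0.
slope, intercept = np.polyfit(noise_factors, measured, 1)
print(f"Zero-noise estimate: {intercept:.3f}")  # ~0.957, vs 0.82 measured raw
```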

 


“You’re going to have to tie them together,” Dario Gil, senior vice president and director of research at IBM, told Reuters. “You’re going to have to do many of these things together to be practical about it. Because if not, it’s just a paper exercise.”

Something that makes IBM unique in the industry is that it publishes a roadmap looking a decade into the future.

This may seem risky, but to date, they’ve stuck to it. Alongside the Condor and Heron news, IBM also posted an updated version of its roadmap.

Next year, they’ll release an upgraded version of Heron capable of 5,000 gate operations. After Heron comes Flamingo. They’ll link seven of these Flamingo chips into a single system with over 1,000 qubits. They also plan to grow Flamingo’s gate count by roughly 50 percent a year until it hits 15,000 in 2028. In parallel, the company will work on error-correction, beginning with memory, then moving on to communication and gates.
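
As a back-of-envelope check on that compounding claim: growing 50 percent a year roughly triples a number in three years (1.5³ ≈ 3.4), so 5,000 gate operations lands in the neighbourhood of – in fact a little above – the stated 15,000 target, meaning “roughly” is doing some work:

```python
# Back-of-envelope check: 5,000 gate operations compounding at 50% a year.
# Literal 1.5x growth slightly overshoots the 15,000-by-2028 target,
# so the actual roadmap growth rate is a touch under 50 percent.
gates = 5_000
for year in range(2024, 2029):
    print(year, f"~{gates:,.0f} gate operations")
    gates *= 1.5
```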

 


All this will culminate in a 200-qubit, fault-tolerant chip called Starling in 2029 and a leap in gate operations to 100 million. Starling will give way to the bigger Blue Jay in 2033.

Though it may be the most open about them, IBM isn’t alone in its ambitions.

Google is pursuing the same type of quantum computer and has been focused on error-correction over scaling for a few years. Then there are other kinds of quantum computers entirely – some use charged ions as qubits, while others use photons, electrons, or, like Atom Computing, neutral atoms. Each approach has its trade-offs.

“When it comes down to it, there’s a simple set of metrics for you to compare the performance of the quantum processors,” Jerry Chow, director of quantum systems at IBM, told The Verge. “It’s scale: what number of qubits can you get to and build reliably? Quality: how long do those qubits live for you to perform operations and calculations on? And speed: how quickly can you actually run executions and problems through these quantum processors?”

 


Atom Computing favours neutral atoms because they’re identical – eliminating the possibility of manufacturing flaws – can be controlled wirelessly, and operate at room temperature. Chow agrees there are interesting things happening in the neutral atom space, but says speed is a drawback.

“It comes down to that speed,” he said. “Anytime you have these actual atomic items, either an ion or an atom, your clock rates end up hurting you.”

The truth is the race isn’t won yet, and won’t be for a while. New advances or unforeseen challenges could rework the landscape. But Chow said the company’s confidence in its approach is what allows them to look ahead 10 years.

“And to me it’s more that there are going to be innovations within that are going to continue to compound over those 10 years, that might make it even more attractive as time goes on. And that’s just the nature of technology,” he said.
