
What does it actually take to turn a qubit, a fragile quantum object operating at temperatures 100 times colder than outer space, into a reliable building block for a useful computer? And how do you go from controlling one qubit with a large beige box to orchestrating a million of them?
In the inaugural Qblox Quantum Builders conversation, Professor William D. Oliver of MIT offered a deeply practical tour of the engineering frontier of quantum computing. Oliver directs MIT's Center for Quantum Engineering, holds joint appointments in physics and electrical engineering, has shaped national quantum policy through the White House National Quantum Initiative Advisory Committee, and co-founded Atlantic Quantum to translate MIT breakthroughs into industry. He spent 20 years at MIT Lincoln Laboratory, a national lab that sits at the bridge between academic research and real-world systems.
His perspective is shaped by all of these roles, and the core message is clear: building a quantum computer is a systems engineering problem, and the hardest challenges ahead are not about any single component but about making everything work together at scale.
Oliver's career in quantum computing began more than 20 years ago with Terry Orlando at MIT, when superconducting qubits were, as he puts it, "kind of dogs."
"If you read Nielsen and Chuang's textbook on quantum, they wrote it more than 20 years ago, and in that textbook they would talk about cavity QED, they would talk about trapped ions, and then there was kind of an everything else chapter," he recalls. "Superconducting qubits were in that everything else, because at the time their coherence times were low and they weren't as performant, certainly not even close to the trapped ions."
From roughly 2003 to 2013, the field worked intensely just to get superconducting qubits to function well. A turning point came around 2011 and 2012, when Oliver's group, in collaboration with Yasunobu Nakamura's team at NEC in Japan, demonstrated qubits with coherence times of 10 to 20 microseconds. Around the same time, Yale's 3D transmon result from Hanhee Paik, Rob Schoelkopf, and collaborators revealed something critical: the surfaces of these qubits matter enormously.
"That really taught us that surfaces are important. The surfaces of these qubits is really critical."
Shortly after, Chad Rigetti and colleagues at IBM produced their own 3D transmon results, and Rigetti went on to found his own company. By that point, coherence times had reached the tens to roughly 100 microseconds. But as Oliver emphasizes, that was just the beginning.
Achieving good single-qubit and two-qubit gates satisfies David DiVincenzo's foundational criteria, but building a quantum computer requires far more. Since around 2013, the central question has shifted to scaling.
"You think about DiVincenzo's criteria. You need one and two qubit gates. Okay, great. But if you want to build a quantum computer, you need a lot more than that. And so then the question from 2013 maybe until today has been about how are we going to scale up."
Oliver identifies several threads that have driven progress. 3D integration, a major thrust at MIT Lincoln Laboratory from roughly 2012 to 2020, tackled the problem of getting wires and connectivity into processors by bringing signals in from the third dimension rather than routing everything from the edges. Fabrication engineering, advanced by companies like IBM, Google, and AWS alongside universities, has been equally important. And exploration of new materials, including work by Nathalie de Leon at Princeton and others, continues to push the field forward.
But Oliver's framing is important: none of these advances happens in isolation. Scaling is a systems optimization problem.
One of the most vivid moments in the conversation comes when Oliver describes the "brute force" approach to controlling qubits today.
"Today, that's basically done in what I would call a brute force manner. One wire per qubit. And in fact, often two or three wires per qubit. You DC bias and fast flux and microwave pulse and keep them separate."
This works at 100 qubits. Maybe at 1,000. If you're really determined, perhaps 10,000. But it can't hold.
"At some point, this just becomes a losing battle," Oliver says. He draws an analogy to vacuum tube computing in the 1940s and 1950s, where the sheer number of tubes meant that, on average, one was always failing and the machine couldn't run reliably. Classical error correction protocols were partly developed to handle exactly this kind of problem, but ultimately, a more robust technology had to come along.
The long-term solutions include collocating control electronics with the qubits themselves (using cryogenic CMOS or classical superconducting electronics) or multiplexing, where a small number of signal generators control a large number of qubits. Oliver notes that neutral atoms have a natural advantage here: one laser can control roughly 300 qubits from a single source.
"That's the kind of thing that we want to see: either through multiplexing or some other mechanism, with just a few generators, we can control a lot of qubits."
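The scaling gap Oliver describes can be made concrete with a back-of-the-envelope sketch. The wire counts and the roughly-300-qubits-per-laser figure come from his examples; the function names and loop are illustrative only:

```python
import math

def brute_force_channels(n_qubits: int, wires_per_qubit: int = 3) -> int:
    """Brute-force control: dedicated lines per qubit for DC bias,
    fast flux, and microwave pulses, all kept separate."""
    return n_qubits * wires_per_qubit

def multiplexed_channels(n_qubits: int, qubits_per_generator: int = 300) -> int:
    """Multiplexed control: each generator drives many qubits
    (e.g. one laser addressing roughly 300 neutral atoms)."""
    return math.ceil(n_qubits / qubits_per_generator)

for n in (100, 1_000, 10_000, 1_000_000):
    print(f"{n:>9,} qubits: {brute_force_channels(n):>9,} dedicated lines "
          f"vs {multiplexed_channels(n):>6,} multiplexed generators")
```

At a hundred qubits the dedicated-line count is merely awkward; at a million it is the "losing battle" Oliver describes, while the multiplexed count stays in the thousands.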
A decade ago, controlling qubits meant using large standalone instruments, one channel per box. Oliver describes the transformation that has taken place since then, and what still needs to happen.
"One major change is we went from the big beige box that held one channel to a chassis-based format, where just by sliding in cards, we go to multiple channels. Each card could have say four or eight channels. The backplane of that chassis is allowing the cards to talk to each other."
This chassis-based architecture, with embedded FPGAs, direct digital synthesis, and distributed feedback across backplanes and between chassis, has been critical for moving beyond small-scale experiments. It supports not just superconducting qubits but also semiconducting qubits and even some flavors of trapped ions.
When asked what's still missing, Oliver points to two things. First, the FPGA programming layer needs to become more accessible and flexible, with a clean interfacial layer where users can program in a reasonable language rather than wrestling with low-level FPGA code.
"Making that seamless and bug-free would be useful."
Second, and more fundamentally, the cost per channel has to come down.
"Even a thousand dollars per channel is pretty aggressive today. But if you think about it and you want to build a system of a million qubits and it's a thousand bucks per channel, that's a billion dollars for a quantum computer."
He acknowledges this is a market-driven problem: as volumes increase, costs will fall. But right now, the economics of control hardware are a real constraint on scaling.
Oliver frames the current state of quantum error correction through an analogy he returns to often: the Wright Flyer.
"We think back in time, you see the Wright brothers and one of them's flying the plane, and that was the dawn of commercial aviation. But that wasn't the end of it. Nobody goes on vacation getting on that plane. I wouldn't get on that plane. Didn't fly very far. Didn't fly very high either."
The point is that the Wright Flyer wasn't even the first time humans had flown. There were hot air balloons, aerodromes, and unmanned powered aircraft. What made the Wright brothers' demonstration matter was that it brought everything together, enabled in part by advances in classical control of the flight surfaces. Looking back, you could see that commercial aviation was going to come, even though it was still decades away.
"I think that what we're seeing today in the field are those types of demonstrations."
The quintessential example, in Oliver's view, is Google's Willow demonstration from late 2024, which showed that stepping from a distance-3 to a distance-5 to a distance-7 error-correcting code, each step adding more qubits, lowered the logical error rate at each stage.
"That's fantastic because that means I added more qubits, but the performance of the system got better. And that's what we need to do if we're going to realize the promise of quantum computing."
Oliver puts the challenge in numerical terms. His group at MIT has achieved single-qubit gates with error rates as low as one in 100,000. But commercially relevant quantum algorithms will require a billion or a trillion operations, meaning error rates need to reach one in a billion or one in a trillion. That's four to seven orders of magnitude beyond what physical qubits can deliver alone.
"We're going to work really hard to make our physical qubits better, but I don't think we're going to make it all up. What we can do is quantum error correction."
When Oliver is asked which quantum computing technology will ultimately prevail, his answer draws on the full history of classical computing.
"I don't think there's going to be a single winner. I think that technology doesn't evolve that way."
He traces the arc: mechanical machines gave way to vacuum tubes, then bipolar junction transistors, then emitter-coupled logic, then CMOS. CPUs have been running at roughly 4 gigahertz since Dennard-type scaling effectively stopped around 2006, but the field has continued optimizing through 3D integration, system-on-chip designs, GPUs, and tensor processing units.
"Technology evolves over time. Superconducting qubits are great today. They might carry us forward for quite a while. But at some point someone's going to come along with an innovation or a revolution that changes the game."
That revolution might still be superconducting but at higher temperatures, or it might be neutral atoms, semiconductors, ions, or something else entirely. Oliver's advice: keep your finger on the pulse, but don't bet on a single winner.
Oliver offers a candid assessment of where the field stands on near-term quantum computing without full error correction.
John Preskill coined the term NISQ (noisy intermediate-scale quantum) in 2018, and it captured an important idea: what can we do with the quantum computers we have today, before error correction? But Oliver notes that the community's thinking has shifted.
"It doesn't look like that is a viable path to quantum economic advantage," he says. "I think that the community is coalescing around one of two ideas."
The first is that you simply have to do error correction at scale to achieve universal fault-tolerant quantum computing. The second, which holds onto the spirit of NISQ, is quantum emulation: setting up qubits not as computational units but as physical systems that mimic the behavior of real quantum systems you want to understand.
"I call it or prefer to call it emulation, because I'm using one quantum system to emulate another quantum system. It goes back to Richard Feynman's original idea."
Beyond those two paths, Oliver stresses one more critical need: more quantum algorithms. The field simply doesn't have enough of them, and discovering new ones is a bootstrapping process that accelerates as hardware improves.
"Better hardware, okay, maybe there's some new algorithms. Better hardware, okay, new applications of those algorithms. We need people thinking about it."
Oliver has spent much of his career at the boundary between academic research and industrial systems, and he's clear-eyed about how difficult that translation is.
"What universities are very good at is trying a lot of things very quickly, failing fast, and publishing papers based on that," he explains. "But the motivation there is not to make something robust, reproducible, extensible, commercial. Not yet."
In academia, you're exploring the space: what's the right material, does the concept even work, what approaches are worth pursuing. The goal is to try many things and see what sticks. Industry, on the other hand, excels at taking ideas and making them work reliably at scale, every time.
"If you look at what companies like Intel or TSMC or Samsung can do, they make chips with billions and billions of transistors. A lot of that process development has to be done in industry because it's very expensive and it's not the kind of thing that lends itself necessarily to publication or to good training of students."
The gap between these two worlds is what Oliver calls the valley of death, and bridging it requires institutions like Lincoln Laboratory that have more engineering rigor than a university but more flexibility than a corporation.
He adds an important nuance about fabrication: the best material discovered in a lab may not be the best material for a commercial product, because fabrication is a multi-step process with complex interactions between steps.
"I might not take the absolute best material, and maybe I don't take the very best way to do step eight in my fab process, but I optimize over the whole process flow from materials to device. And what does best mean? It often comes down to yield, performance, robustness. It's a systems optimization problem."
Oliver introduces a term he credits to Jonathan Ruane, with whom he co-teaches a course at MIT's Sloan School of Management: quantum economic advantage.
"Quantum economic advantage is when a quantum computer is going to do something for me that's commercially relevant."
This is a higher bar than quantum advantage in the purely computational sense. It means the quantum computer solves a real industrial problem faster, cheaper, or better than any classical alternative. And it requires not just hardware performance but fully worked-out algorithms applied to specific real-world problems, including all the realistic overhead of error correction, classical co-processing, and data handling.
Oliver's advice to companies is practical: don't bet the farm, but don't ignore it either.
"It's too early to bet the farm. We need more performant quantum computers. We need more algorithms. This is a field that is nascent but growing and maturing. But it's important to be involved and aware."
He recommends that companies form small teams of one or two people who understand both the company's problems and the available quantum algorithms. Have them identify which algorithms apply, test them on today's small-scale machines, and estimate the size and performance of the quantum computer they'll eventually need.
"It probably won't bring you economic advantage yet, but it will bring you something valuable: I've got an algorithm that I know is going to solve a problem I care about, and it gives me a little bit of insight into the size of the quantum computer I'm going to need."
And whatever strategy a company takes, leader or fast follower: "Don't be a slow follower. This is a disruptive technology."
Oliver has invested heavily in workforce development, and his approach reflects a core conviction: the quantum industry needs people from every discipline, not just physics.
"If quantum computing is going to become an industry, we're going to need people from across the university doing it, whether it's Sloan School of Management, chemistry, biology, physics, electrical engineering, whatever."
At MIT, he and his colleagues developed a series of four professional development courses through MIT xPRO, each four weeks long and delivered online, designed specifically for people who are not quantum physicists. Thousands have enrolled since the program launched. They recently added a shorter course aimed at executives with only about 10 hours or a weekend to spare.
He also co-teaches a course at MIT Sloan with Jonathan Ruane on the global business of quantum computing, which is "always packed."
But Oliver sees a deeper challenge ahead: defining what quantum engineering actually is as a discipline.
"What are the textbooks that we need? What is the curriculum? Is it mostly just traditional engineering, microwave engineering let's say, with a little bit of quantum included? Is that quantum what's taught in the physics department, or is it a new type?"
He uses an analogy from classical computing: people who design transistor chips today don't need to be semiconductor physicists. The physics has been abstracted into engineering design rules. Quantum computing needs to undergo a similar abstraction, and defining what that looks like is an active area of work.
On the connection between academia and industry, Oliver sees well-structured internships as important but warns they need to be designed carefully: advancing the student's learning, not just optimizing one narrow thing. And the underlying research at universities needs to stay relevant to industry needs, which creates a healthy tension around IP and funding.
"If we don't do that and I've trained somebody on 20-years-ago physics for quantum computing and then you hire that person, how useful is that?"
When asked directly what the current bottleneck is for realizing an economically viable quantum computer, Oliver gives a two-part answer.
"One is we need larger, more performant hardware. We need more qubits and they've got to last longer than they do. Now, maybe not intrinsically, maybe with error correction, but that's number one."
"And number two is we need more quantum algorithms."
Not just the big theoretical ideas from university research, but fully specified algorithms applied to real problems, with all the classical co-processing accounted for, so that at the end of the day someone can answer: how big a quantum computer do I need, how performant does it need to be, and what is the real advantage I'm going to get?
"Not the theoretical asymptotic advantage, but to the best I can estimate, what is the real advantage I'm going to get? That's what we need to do if we're going to realize quantum economic advantage."
When an audience member asks what topic Oliver would choose if he were starting a PhD today, his answer sidesteps the question in a deliberate and revealing way.
"What's important is that you do what you're passionate about and what you enjoy doing."
His reasoning is practical. The quantum industry will need people from every discipline. The best strategy is to master a traditional field (chemistry, biology, electrical engineering, computer science, physics, even business) and then add quantum awareness on top of it.
"You become a quantum computing aware person who knows your field at the level of your terminal degree. That's very valuable."
He points out that if you look at the job listings at companies like IBM, Google, AWS, and Microsoft, a few positions call for quantum physicists, "but a much larger number are everything else."
And he is honest about what a PhD is really like: "It's a long haul. It's delayed gratification. And there will be dark days. But some days are dark and you just have to say, okay, get a coffee and say, yeah, but I know that I really love this."
Passion and motivation are what carry you through the marathon.
Quantum progress depends on collaboration across disciplines, institutions, and the boundaries between research and industry. At Qblox, we're proud to support that vision. Whether you're exploring new qubit modalities in the lab or scaling toward production, we're here to help you build what's next. Contact us today for more information.