
By Mark Wolverton

Quantum computing—or, more specifically, quantum information science and engineering—is a real, cutting-edge discipline focused on developing systems that will leave today’s fastest supercomputers in the dust.

In fact, it’s a whole ecosystem of technology based on quantum mechanics, a field of physics centered on how subatomic particles move and interact, according to Paola Cappellaro PhD ’06, the Korea Electric Power Company Professor in the Department of Nuclear Science and Engineering (NSE). Cappellaro is at the forefront of MIT’s quantum computing research as leader of the Quantum Engineering Group in the Research Laboratory of Electronics.

“In my group, we work not only on quantum computing but also on associated technologies,” Cappellaro says. “The common thread is quantum information science, how to manipulate, encode, and exploit information using quantum devices.”

The technology is still in its infancy, but approaching computing from the vanguard of physics promises a sea change in how computers tackle huge mathematical challenges, such as breaking cryptographic codes, and how they simulate intricate systems, such as complex chemical reactions.

More memory and power

While conventional computers operate by processing bits of data consisting of zeros and ones, generally encoded in electronic form as on/off, quantum computing is based on principles that permit subatomic particles to be in different states simultaneously, enabling quantum bits, or “qubits,” to hold more information.
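To make that concrete, here is the standard textbook notation (not spelled out in the article): a single qubit’s state is a weighted blend of the two classical values, and describing n qubits takes 2^n complex weights—the sense in which qubits “hold more information.”

```latex
% One qubit: a superposition of the classical states 0 and 1,
% with complex amplitudes alpha and beta.
\[ |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
   \qquad |\alpha|^2 + |\beta|^2 = 1 \]
% A register of n qubits: one amplitude for every n-bit string,
% so 2^n complex numbers are needed to describe the state.
\[ |\psi\rangle = \sum_{x \in \{0,1\}^n} c_x\,|x\rangle,
   \qquad \sum_{x} |c_x|^2 = 1 \]
```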

In theory, a quantum computer should outmatch even the most advanced supercomputer—but so far, no one has quite figured out the best way to build one. That’s because there are many possible ways to create qubits, each involving a different physical system and type of hardware. Qubits are also delicate, subject to what physicists call “decoherence”: the collapse of their fragile quantum state at the slightest vibration or change in temperature.
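Decoherence has a simple quantitative face. In the textbook model of pure dephasing (a generic illustration, not a description of any particular hardware; the coherence time below is made up), the “quantum” part of a superposition dies off exponentially while the ordinary probabilities survive:

```python
import numpy as np

# Minimal sketch of dephasing, the simplest model of decoherence.
# A qubit in an equal superposition has a density matrix rho whose
# off-diagonal ("coherence") terms start at magnitude 0.5. Under pure
# dephasing they decay as exp(-t / T2); the diagonal populations
# survive, but the quantum superposition is gone.

T2 = 100e-6  # coherence time, 100 microseconds (illustrative value)

def dephased_rho(t):
    """Density matrix of an equal superposition after time t."""
    coherence = 0.5 * np.exp(-t / T2)
    return np.array([[0.5, coherence],
                     [coherence, 0.5]])

for t in (0.0, 50e-6, 200e-6, 1e-3):
    rho = dephased_rho(t)
    print(f"t = {t * 1e6:7.1f} us  |rho01| = {abs(rho[0, 1]):.4f}")
```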

Another major challenge is handling errors, which today’s computers manage through redundancy. “Instead of just encoding information in one bit, you can encode it in a certain number of bits and then you take a majority vote,” Cappellaro says. This doesn’t work in the fuzzier realm of qubits, for a variety of reasons, including that the disturbance caused by measurement (“wave-function collapse”) rules out simply reading the qubits to take a majority vote.
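For readers who want to see the classical scheme she describes, here is a minimal sketch (illustrative, not from the article): triplicate each bit, add noise, and vote. The same trick fails for qubits because the readout step itself would destroy the state being protected.

```python
import random

# Classical redundancy as described above: copy each bit three times
# and take a majority vote on readout.

def encode(bit):
    return [bit, bit, bit]

def flip_noise(bits, p):
    """Flip each bit independently with probability p."""
    return [b ^ (random.random() < p) for b in bits]

def majority(bits):
    return int(sum(bits) > len(bits) / 2)

random.seed(0)
p = 0.1
trials = 100_000
raw_errors = sum(flip_noise([1], p)[0] != 1 for _ in range(trials))
coded_errors = sum(majority(flip_noise(encode(1), p)) != 1
                   for _ in range(trials))
print(f"unprotected error rate:  {raw_errors / trials:.3f}")    # ~0.100
print(f"majority-vote error rate: {coded_errors / trials:.3f}")  # ~0.028
```

The coded rate lands near 3p² − 2p³, since the vote fails only when two or more of the three copies flip.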

Cappellaro’s Quantum Engineering Group is using electron and nuclear spins to address this challenge. Their approach centers on a type of defect in the crystal lattice of diamond, called a nitrogen-vacancy (N-V) center, that can be harnessed to create qubits. “We came up with a way of characterizing the noise in our system and then came up with an efficient way of protecting it from errors,” she says. “What we hope is that … we can actually have a practical error correction system for today’s intermediate-scale quantum devices.”
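The group’s own noise-characterization methods aren’t detailed here, but the standard workaround to the measurement problem can be sketched. In the textbook three-qubit bit-flip code (a generic illustration, not the Quantum Engineering Group’s protocol), one measures two parities, the “syndrome,” which locate a flipped qubit without ever reading the encoded value:

```python
import random

# Textbook three-qubit bit-flip code. The trick quantum error
# correction uses: never read the data qubits themselves. Measure two
# parities ("syndromes") -- qubit 0 vs qubit 1, and qubit 1 vs
# qubit 2. For a single bit flip, the syndrome pinpoints the flipped
# qubit while saying nothing about the encoded logical value, so the
# superposition survives. Here we track only the error pattern,
# which is all the syndrome ever sees.

SYNDROME_TO_FLIPPED_QUBIT = {
    (0, 0): None,  # no error detected
    (1, 0): 0,     # qubits 0,1 disagree only -> qubit 0 flipped
    (1, 1): 1,     # both parities trip       -> qubit 1 flipped
    (0, 1): 2,     # qubits 1,2 disagree only -> qubit 2 flipped
}

def correct(errors):
    """errors[i] == 1 means qubit i suffered a bit flip."""
    syndrome = (errors[0] ^ errors[1], errors[1] ^ errors[2])
    flipped = SYNDROME_TO_FLIPPED_QUBIT[syndrome]
    if flipped is not None:
        errors[flipped] ^= 1  # apply the correcting flip
    return errors

random.seed(1)
for _ in range(5):
    errors = [0, 0, 0]
    errors[random.randrange(3)] = 1  # one random bit flip
    print(errors, "->", correct(list(errors)))
```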

Collaborations at MIT

Quantum computers are expected to be able to tackle the biggest of big data challenges, but the specific applications may depend on which systems prove most practical. “We’re still in the stage where we’re trying to pick the best technology,” Cappellaro says.

Making such choices means exploring many different options, reflected in the broad range of researchers involved in quantum computing across the MIT School of Science and the MIT School of Engineering, as well as many groups at MIT Lincoln Laboratory. The MIT Stephen A. Schwarzman College of Computing is expected to better unite the Institute’s quantum computing efforts.

“We have a long tradition in quantum computation,” Cappellaro observes. “But the Schwarzman College could position MIT even better to play a larger role both in the United States and on the world stage. It’s definitely an opportunity to be seized.”