The USC-Lockheed Martin Quantum Computing Center will be upgraded to the world’s most advanced commercially available quantum optimization processor in January.
The QCC – a super-cooled, magnetically shielded facility at the USC Viterbi School of Engineering’s Information Sciences Institute (ISI) – will receive its third processor in four years. “This one will take us a step closer to being able to test the true potential of quantum optimization,” said Daniel Lidar, electrical engineering professor at USC Viterbi and scientific director of the QCC.
For the past four years, Lidar’s team has been working to determine whether the D-Wave processors can, in fact, solve optimization problems faster than a traditional computer. This elusive property, called “quantum speedup,” is the linchpin of quantum computing as a whole – what will ultimately justify the expense and effort of creating an entirely new type of hardware for data processing.
Optimization problems require machines to find the best solution from a vast number of possibilities – the classic example being a package delivery service planning routes for its drivers. In scientific terms, this is described as “finding the lowest energy state”: if you imagine the problem as a 3D map of a landscape, the goal is to find the lowest elevation.
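The “lowest energy state” picture can be made concrete with a toy example. The sketch below brute-forces a tiny Ising-style spin problem (the kind of energy landscape quantum annealers are designed for); the coupling values are hypothetical, chosen only for illustration, and this is not D-Wave’s actual problem format.

```python
from itertools import product

# Toy Ising-style optimization: each "spin" is +1 or -1, and couplings
# between pairs of spins define an energy landscape. The goal is the
# assignment with the lowest total energy -- the "lowest elevation."
couplings = {(0, 1): 1.0, (1, 2): -0.5, (0, 2): 0.75}  # hypothetical values

def energy(spins):
    """Total energy of a spin configuration under the couplings."""
    return sum(j * spins[a] * spins[b] for (a, b), j in couplings.items())

# Brute force is fine for 3 spins (2**3 = 8 configurations), but the
# search space doubles with every spin added -- which is why hardware
# that can explore the landscape differently is interesting.
best = min(product([-1, 1], repeat=3), key=energy)
print(best, energy(best))
```

At 1,000 qubits, the same exhaustive search would face roughly 2^1000 configurations, which is where classical brute force becomes hopeless and smarter strategies (classical or quantum) are required.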
Quantum computers should, in theory, excel at these problems because they have the capacity for tunneling through energy barriers – drilling through the mountains – to find that lowest-energy solution. But it’s difficult to prove.
“Tests performed on the previous two generations of D-Wave processors were promising, but inconclusive. They just did not have enough qubits and had too much noise,” Lidar said.
Quantum bits, or “qubits,” can represent the two digits of one and zero at the same time – a property known as “superposition” – whereas traditional bits can encode distinctly only a one or a zero.
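Superposition can be sketched numerically: a qubit’s state is described by two amplitudes, and measurement collapses it to one value or the other with probabilities given by the squared amplitudes. The amplitudes below are hypothetical, chosen for an even split; this uses plain Python rather than any quantum library.

```python
import math

# A qubit state has two amplitudes: alpha for |0> and beta for |1>.
# Measurement yields 0 with probability alpha**2 and 1 with probability
# beta**2 (for real amplitudes). Equal amplitudes give a 50/50 split.
alpha = 1 / math.sqrt(2)  # amplitude for |0> (hypothetical choice)
beta = 1 / math.sqrt(2)   # amplitude for |1>

p0, p1 = alpha ** 2, beta ** 2
assert math.isclose(p0 + p1, 1.0)  # probabilities must sum to 1
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")  # prints P(0) = 0.50, P(1) = 0.50
```

Until it is measured, the qubit genuinely carries both possibilities, which is what lets n qubits represent 2^n classical states at once.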
Lidar also hopes to use the new D-Wave 2X’s larger supply of qubits to do a better job of error correction – grouping multiple qubits to redundantly encode the same information in order to improve the chance of a correct answer. “All quantum computers require error correction to function correctly at large problem scales,” Lidar said.
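The redundancy idea can be illustrated with a repetition-style decoder: several physical qubits carry the same logical bit, and a majority vote recovers the answer even if noise flips one of them. This is a simplified sketch of the general principle, not D-Wave’s actual error-correction scheme.

```python
from collections import Counter

# Repetition-style error suppression: several physical readouts encode
# one logical bit, and a majority vote decodes it. The readout values
# below are illustrative.
def majority_vote(readouts):
    """Decode a logical bit from noisy repeated readouts."""
    return Counter(readouts).most_common(1)[0][0]

# Three copies, one flipped by noise -- the vote still recovers the bit.
print(majority_vote([1, 1, 0]))  # prints 1
```

The trade-off is capacity: every logical bit consumes several physical qubits, which is why a surplus of qubits makes better error correction practical.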
The new processor isn’t just bigger, it’s also better – boasting smaller qubits and qubit junctions, redesigned couplers between qubits that boost the energy scale of the problems it can represent, and improved “noise” reduction. Noise is interference that prevents qubits from encoding two states at the same time, or that causes the optimization problem to be misrepresented by the quantum hardware.
Once the chip is installed in the QCC’s giant black box – one of the coldest places on earth, allowing the processor to operate at a temperature of 20 millikelvin – it will take about a month to properly bring the device online. By sometime in February, Lidar’s team plans to subject the 2X to a regimen of rigorous benchmarking, comparing the chip’s performance on hard optimization problems against that of classical computers.
This is one of only two D-Wave 2X systems worldwide operating outside D-Wave’s headquarters; the other was installed at the Google/NASA/Universities Space Research Association Quantum Artificial Intelligence Lab at Moffett Field.
Through a multi-year agreement between manufacturer D-Wave and Lockheed Martin, the QCC started with the 128-qubit D-Wave One processor when it opened in 2011, upgraded to the 512-qubit D-Wave Two in 2013, and in January will become home to the D-Wave 2X, which boasts more than 1,000 qubits.
“This is an exciting time to be in the field of quantum computing. This is a field that was purely theoretical until the ‘90s, and now is making huge leaps forward every year,” Lidar said.