Galan Moody Receives a New Grant to Pursue a Photonic-Based Platform for Quantum Processing
Classical computing is built upon the power of the bit, which is, in essence, a microscopic transistor on a chip that can be either on or off, representing a 1 or a 0 in binary code. The quantum computing equivalent is the qubit. Unlike bits, qubits can be in more than one “state” at a time, enabling quantum computers to perform certain computations exponentially faster than classical computers can.
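As a rough illustration of that difference (not drawn from Moody’s work, and with arbitrary numbers), a qubit can be written as a normalized two-component vector whose entries are the probability amplitudes for 0 and 1:

```python
import numpy as np

# A classical bit is either 0 or 1. A qubit is a normalized two-component
# complex vector, alpha*|0> + beta*|1>, with |alpha|^2 + |beta|^2 = 1.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# An equal superposition: the qubit carries both values at once until measured.
psi = (ket0 + ket1) / np.sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes.
print(np.abs(psi) ** 2)  # [0.5 0.5]
```

Measuring such a qubit collapses it to 0 or 1, here with equal probability; a classical bit can never hold those intermediate amplitudes.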
To date, most efforts to build quantum computers have relied on qubits created in superconducting wires chilled to near absolute zero or trapped ions held in place by lasers. But those approaches face certain challenges, most notably that the qubits are highly sensitive to environmental factors. As the number of qubits increases, those factors are more likely to compound and disrupt the entanglement of qubits required for a quantum computer to work.
A more recent approach is to use a photon as an optical qubit to encode quantum information and to integrate the components necessary for that process onto a photonic integrated circuit (PIC). Recently, Galan Moody, an assistant professor in the UC Santa Barbara College of Engineering’s Department of Electrical and Computer Engineering, received a Defense University Research Instrumentation Program (DURIP) Award from the U.S. Department of Defense and the Air Force Office of Scientific Research. The grant will allow him to build a quantum photonic computing testbed in a lab set aside for such research in recently completed Henley Hall, the new home of the CoE’s Institute for Energy Efficiency.
The grant supports the development or acquisition of new instrumentation to be used in fundamental and applied research across all areas of science and engineering. “In my field, it’s quantum photonics, so we’re working to develop new types of quantum light sources and ways to manipulate and detect quantum states of light for use in such applications as quantum photonic computing and quantum communications,” Moody says.
“At the high level,” he explains, the concept of quantum photonic computing is “exactly the same as what Google is doing with superconducting qubits or what other companies are doing with trapped ions. There are a lot of different platforms, and one of them is to use photonic integrated circuits to generate entangled photons, entanglement being the foundation for multiple quantum applications.”
To place an entire quantum photonics system onto a chip measuring about one square centimeter would be a tremendous achievement. Fortunately, the well-developed photonics infrastructure — including AIM Photonics, which has a center at UCSB led by professor and photonics pioneer John Bowers — lends itself to that pursuit and to scaling up whatever quantum photonics platform is most promising. Photonics for classical applications is a mature technology industry that, Moody says, “has basically mastered large-scale and wafer-scale processing and fabrication of devices.” It is reliable, so whatever Moody and his team design, they can fabricate themselves or even order from foundries, knowing that they will get exactly what they want.
The Photonic Edge
The process of creating photonic qubits begins with generating high-quality single photons or pairs of entangled photons. A qubit can then be defined in several different ways, most often in the spatial path that the photons travel. For example, a PIC might have two waveguides that confine photons and determine the path they travel on the chip. If a photon is traveling in the “lower” waveguide, this can be defined as state 0. Likewise, the “upper” waveguide can be defined as state 1. Moody and his team can create PICs that control whether photons travel in the upper path, the lower path, or in both simultaneously. These path-encoded photonic qubits become the carriers of quantum information and can be manipulated to perform logic operations.
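Mapping that description onto the same kind of sketch (illustrative only, with an arbitrary phase phi standing in for whatever an on-chip phase shifter would apply), the lower and upper waveguides become the two components of a path-encoded state:

```python
import numpy as np

# Path encoding (illustrative): the amplitude in the "lower" waveguide is the
# state-0 component, the amplitude in the "upper" waveguide the state-1 component.
lower = np.array([1, 0], dtype=complex)   # photon definitely in the lower path
upper = np.array([0, 1], dtype=complex)   # photon definitely in the upper path

# A photon routed into both waveguides at once is a path-encoded superposition;
# phi stands in for a relative phase set on-chip and is an arbitrary example.
phi = np.pi / 4
both_paths = (lower + np.exp(1j * phi) * upper) / np.sqrt(2)

print(np.abs(both_paths) ** 2)  # 50/50 chance of finding the photon in either guide
```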
The approach has several advantages over other methods that have been used to generate qubits. For instance, the aforementioned environmental effects that can cause qubits to lose their coherence do not affect coherence in photons, which, Moody says, “can maintain that entanglement for a very long time compared to the other quantum systems. The challenge is not coherence but, rather, getting the photons to become entangled in the first place.”
“That,” Moody notes, “is because photons don’t naturally interact but, rather, pass right through each other and go their separate ways. But they have to interact in some way to create an entangled state. We’re working on how to create PIC-based quantum light sources that produce high-quality photons as efficiently as possible and then how to get all the photons to interact in a way that allows us to build a scalable quantum processor or new devices to be used in long-distance quantum communications.”
Quantum computers can be dramatically faster than classical machines at certain tasks, and the photonics approach to quantum technologies promises to push that advantage further. When Google “demonstrated quantum supremacy” in fall 2019 using the quantum computer built in its Goleta laboratory under the leadership of UCSB physics professor John Martinis, the company claimed that its machine, named Sycamore, could do a series of test calculations in two hundred seconds that a supercomputer would need closer to ten thousand years to complete. Recently, a Chinese team using a laboratory-scale tabletop experiment claimed that, with a photon-based quantum computer, “You could do in two hundred seconds what would take a supercomputer more like 2.5 billion years to accomplish,” Moody says.
Another advantage is that photonics is naturally scalable to thousands and, eventually, millions of components, which can be achieved by leveraging the wafer-scale fabrication technologies developed for classical photonics. Today, the most advanced PICs comprise nearly five thousand components and could be expanded by a factor of two or four with existing fabrication technologies, putting them at a stage of development comparable to that of digital electronics in the 1960s and 1970s.
“But even a few hundred components is enough to perform important quantum computing operations with light, at least on a small scale between a few qubits,” says Moody. “With further development, quantum photonic chips can be scaled to tens or hundreds of qubits using the existing photonics infrastructure.”
Platform & Process
For this work, Moody’s team is developing a new materials platform, based on gallium arsenide and silicon dioxide, to generate single and entangled photons, and it promises to be much more efficient than comparable systems. In fact, a paper the team published in the March 4 issue of the journal PRX Quantum shows that their new quantum light source is nearly a thousand times more efficient than any other on-chip light source.
In terms of the process, Moody says, “At the macro level, we work on making better light sources and integrating many of them onto a chip. Then, we combine these with on-chip programmable processors, analogous to electronic transistors used for classical logic operations, to try to control photonic interactions as efficiently as possible. For more accessible applications, like communications, no computing needs to occur.
“The process involves taking a high-quality light source and manipulating the photon states to have some sort of property, then sending those off to some other chip that’s up in a satellite or in some other part of the world, and then that chip can do some kind of measurement and send a signal back that you can collect.”
The researchers take great pains to create high-quality quantum light. At one end of a PIC, they inject laser light into a waveguide. Some light will couple into a microring and start circulating around. As it does, some of that “pump” light is converted into two new photons through nonlinear interactions of the laser light with the waveguide material. “The goal is to design the microrings in a way that makes this process as efficient and as high-speed as possible without introducing undesirable optical noise into the waveguide,” says Moody.
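A deliberately simplified scaling sketch suggests why that efficiency matters. The function below assumes the nonlinear process is spontaneous four-wave mixing, in which two pump photons convert into a photon pair; the article says only “nonlinear interactions,” so that mechanism, the function name, and the lumped constant k are illustrative assumptions rather than the group’s actual model:

```python
def pair_generation_rate(pump_power, k=1.0):
    """Illustrative scaling only: if pairs arise from spontaneous four-wave
    mixing, the pair rate grows quadratically with the circulating pump power.
    k lumps together device-specific factors (ring quality factor, mode
    volume, material nonlinearity) that the design work aims to maximize."""
    return k * pump_power ** 2

# Doubling the pump power would then quadruple the expected pair rate.
print(pair_generation_rate(1.0), pair_generation_rate(2.0))  # 1.0 4.0
```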
Engineering interactions between photons is tricky. In the middle of the circuit, photons generated from the sources can be injected into a series of waveguides. Two of the waveguides are coupled by being brought close to each other. The light in one of them has some probability of continuing on in the waveguide along which it is traveling and some probability of switching to the adjacent one. An optical quantum processor consists of a number of these waveguides, couplers, and tunable elements that affect these probabilities.
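A pair of coupled waveguides of this kind is often modeled as a beamsplitter-like transformation acting on the two path amplitudes; the sketch below is that textbook model, not the team’s device design, and the coupling angle theta is an arbitrary illustrative parameter:

```python
import numpy as np

def coupler(theta):
    """Beamsplitter-like model of two coupled waveguides: theta sets the
    probability that a photon hops to the adjacent guide (illustrative)."""
    return np.array([[np.cos(theta), 1j * np.sin(theta)],
                     [1j * np.sin(theta), np.cos(theta)]])

# Photon enters in the lower waveguide (path-encoded state 0).
state_in = np.array([1, 0], dtype=complex)

# A 50/50 coupler: equal probability of staying put or crossing over.
state_out = coupler(np.pi / 4) @ state_in
print(np.abs(state_out) ** 2)  # [0.5 0.5]
```

Adjusting theta is, in this simplified picture, what the article’s “tunable elements” do: they dial those probabilities up or down.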
At the output side of the circuit, light from a subset of the waveguides is collected off-chip and sent to detectors to determine whether a photon reached the end of the waveguide. “If we get a click on two of the detectors, we’ll know that the photons in the other channels interacted in the way we wanted them to,” Moody notes. “So, for instance, if detectors five and six clicked within a specified time window, we would know that the photons remaining on the chip interacted as we wanted them to and are now prepared into a specific quantum state that can be used in the rest of the photonic circuit.
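The coincidence test Moody describes can be sketched as a simple time-window check on detector timestamps; the function name, the one-nanosecond window, and the example click times below are all illustrative, not the lab’s actual acquisition software:

```python
import numpy as np

def heralding_clicks(clicks_det5, clicks_det6, window_ns=1.0):
    """Return timestamps on detector five that have a partner click on
    detector six within the coincidence window (illustrative logic only)."""
    clicks_det6 = np.sort(clicks_det6)
    heralds = []
    for t in clicks_det5:
        i = np.searchsorted(clicks_det6, t)
        neighbors = clicks_det6[max(i - 1, 0):i + 1]
        if neighbors.size and np.min(np.abs(neighbors - t)) <= window_ns:
            heralds.append(t)
    return np.array(heralds)

# Three clicks on each detector; two pairs fall within the 1 ns window.
print(heralding_clicks(np.array([10.0, 55.3, 90.1]),
                       np.array([10.4, 70.0, 90.9])))  # [10.  90.1]
```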
“Making an analogy to digital electronics, this is the quantum version of a NOT gate based on transistors, which is an important logic operation for computers. If the state of a bit input into a NOT gate is 0, the gate flips this to a 1,” he adds. The quantum NOT gate operates in a similar way, with the caveat that whether one photonic qubit flips depends on the state of an adjacent qubit. Detection of photons on the two channels verifies that this process occurred and that the two adjacent qubits have become entangled.
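In matrix form, the comparison Moody draws looks like the sketch below. A two-qubit gate whose flip of one qubit is conditioned on a neighboring qubit is conventionally called a controlled-NOT; the article does not name the gate, so treating it as a controlled-NOT is an interpretation, and the input state is an arbitrary example:

```python
import numpy as np

# Classical NOT: 0 -> 1, 1 -> 0.
NOT = np.array([[0, 1],
                [1, 0]])

# Controlled-NOT on two qubits (basis order |00>, |01>, |10>, |11>):
# flip the second (target) qubit only when the first (control) qubit is 1.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Feed in a control qubit in superposition and a target in state 0.
control = np.array([1, 1]) / np.sqrt(2)
target = np.array([1, 0])
state_out = CNOT @ np.kron(control, target)
print(state_out)  # (|00> + |11>)/sqrt(2): the two qubits are now entangled
```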
One catch for now is that the detectors, which indicate whether entangled photons have been created, work with very high efficiency when they are on the chip; however, some of them work only if the chip is cooled to cryogenic temperatures. “If we want to integrate everything on chip and put detectors on chip as well, then we’re going to need to cool the whole thing down. We’re going to build a setup to be able to do that and test the various quantum photonic components designed and fabricated for this,” Moody says. “The DURIP award enables exactly this: developing the instrumentation to be able to test large-scale quantum photonic chips from cryogenic temperatures all the way up to room temperature.”
There are also challenges associated with cooling the chip to cryogenic temperatures, he explains. “It’s getting this whole platform up and running, interfacing the instrumentation, and making all the custom parts we need to be able to look at large foundry-scale custom photonic chips for quantum applications at cryogenic temperatures.”