‘Momentum Computing’ pushes the thermodynamic limits of technology

In case you haven’t noticed, computers run hot, literally. A laptop can pump out scorching heat, while data centers consume roughly 200 terawatt-hours of electricity each year, comparable to the annual energy consumption of some medium-sized countries. The carbon footprint of information and communication technologies as a whole is similar to that of fuel use in the aviation industry. And as computer circuits get smaller and more densely packed, they become more prone to melting because of the energy they dissipate as heat.

Now physicist James Crutchfield of the University of California, Davis, and his graduate student Kyle Ray have proposed a new way of performing calculations that would dissipate only a small fraction of the heat produced by conventional circuits. In fact, their approach, described in a recent preprint, could push heat dissipation even below the theoretical minimum that the laws of physics impose on today’s computers. That could greatly reduce the power needed to perform calculations and keep circuits cool. And it could all be done, the researchers say, using microelectronic devices that already exist.

In 1961, physicist Rolf Landauer of IBM’s Thomas J. Watson Research Center in Yorktown Heights, New York, showed that conventional computing incurs an unavoidable cost in the dissipation of energy, namely, in the generation of heat and entropy. This is because a conventional computer sometimes has to erase bits of information in its memory circuitry to make room for more. Every time a single bit is reset (whether its value is 1 or 0), a certain minimum amount of energy is dissipated, which Ray and Crutchfield have dubbed “the Landauer.” Its value depends on the ambient temperature: in your living room, a Landauer would be around 10⁻²¹ joule. (For comparison, a lit candle emits on the order of 10 joules of energy per second.)
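The figure is easy to check. Below is a minimal back-of-the-envelope sketch in Python of Landauer’s bound, k_B · T · ln 2; the 293-kelvin “living room” temperature is an assumption chosen for illustration:

```python
import math

# Landauer's bound: erasing one bit dissipates at least k_B * T * ln(2) of heat.
K_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin

def landauer_energy(temperature_kelvin: float) -> float:
    """Minimum heat, in joules, dissipated by erasing one bit at temperature T."""
    return K_B * temperature_kelvin * math.log(2)

# At a living-room temperature of roughly 293 K:
print(f"{landauer_energy(293.0):.2e} J")  # ~2.80e-21 J, on the order of 10^-21 joule
```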

Computer scientists have long recognized that the Landauer limit on the amount of heat a computation produces can be circumvented by not erasing any information. A computation done that way is fully reversible because not discarding information means that each step can be retraced. It might sound as if this process would quickly fill up a computer’s memory. But in the 1970s, Charles Bennett, also at the Thomas J. Watson Research Center, showed that instead of discarding information at the end of the calculation, a computer could be configured to “uncompute” intermediate results that are no longer needed by reversing its logical steps and returning the computer to its original state.
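Bennett’s trick is easier to see in code. Here is a toy sketch, not drawn from any paper, that uses a reversible Toffoli gate to compute an AND, copy out the answer, and then uncompute the scratch bit so that nothing ever has to be erased:

```python
# Toy illustration of Bennett's compute-copy-uncompute trick (names are
# illustrative). The Toffoli gate is reversible and is its own inverse,
# so every step below can be retraced.

def toffoli(a: int, b: int, c: int) -> tuple[int, int, int]:
    """Controlled-controlled-NOT: flips c exactly when a and b are both 1."""
    return a, b, c ^ (a & b)

def reversible_and(a: int, b: int) -> int:
    scratch = 0                                 # ancilla bit, starts at 0
    a, b, scratch = toffoli(a, b, scratch)      # compute: scratch = a AND b
    result = scratch                            # copy the answer into a fresh bit
    a, b, scratch = toffoli(a, b, scratch)      # uncompute: scratch returns to 0
    assert scratch == 0                         # no garbage left, nothing to erase
    return result

print([reversible_and(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]])  # [0, 0, 0, 1]
```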

The problem is that, to avoid transferring heat, that is, to be what physicists call an adiabatic process, the series of logical operations in the calculation must generally be carried out infinitely slowly. In a sense, this approach avoids any “frictional heating” in the process, but at the cost of taking infinitely long to complete the calculation.

It hardly seems like a practical solution, then. “The conventional wisdom has long been that power dissipation in reversible computing is proportional to speed,” says computer scientist Michael Frank of Sandia National Laboratories in Albuquerque, NM.

To the limit, and beyond

Silicon-based computing doesn’t come close to the Landauer limit anyway: currently such computing produces around a few thousand Landauers in heat per logical operation, and it’s hard to see how even some superefficient silicon chip of the future could ever get below 100 or so. But Ray and Crutchfield say it’s possible to do better by encoding information in electrical currents in a new way: not as pulses of charge but in the momentum of moving particles. They say this would allow computation to be done reversibly without sacrificing speed.

The two researchers and their co-workers introduced the basic idea of momentum computing last year. The key concept is that a bit-encoding particle’s momentum can provide a kind of “free” memory because it carries information about the particle’s past and future motion, not just its instantaneous state. “Previously, information was stored positionally: ‘Where is the particle?’” says Crutchfield. Is a given electron in this channel or that one? “Momentum computing uses information in both position and velocity,” he says.

This additional information can then be exploited for reversible computing. For the idea to work, the logical operations must occur much faster than the time it takes for the bit to reach thermal equilibrium with its surroundings, which would randomize the bit’s motion and scramble the information. In other words, “momentum computing requires that the device run at high speed,” says Crutchfield. For it to work, it “must compute fast,” that is, nonadiabatically.

The researchers considered how to use the idea to implement a logical operation called a bit swap, in which two bits simultaneously flip their values: 1 becomes 0 and vice versa. Here no information is discarded; it is merely reconfigured, which means that, in theory, there is no erasure cost.

If the information is encoded only in a particle’s position, however, a bit swap, say, switching particles between a left and a right channel, scrambles their identities, making the “before” and “after” states indistinguishable, he says. But if the particles have opposite momenta, they remain distinct, so the operation creates a genuine and reversible change.
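For concreteness, here is a minimal sketch of that reversibility, reading the bit swap as the two bits exchanging their values (as when particles switch channels); the names are illustrative, and the point is simply that the operation permutes the states and is its own inverse, so no information is lost:

```python
# Minimal sketch of a bit swap as a reversible operation (illustrative only).
# It permutes the four two-bit states, so distinct inputs stay distinct,
# and applying it twice is the identity: nothing is erased.

def bit_swap(a: int, b: int) -> tuple[int, int]:
    """Simultaneously exchange the values of two bits."""
    return b, a

for state in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    swapped = bit_swap(*state)
    assert bit_swap(*swapped) == state  # its own inverse: the step can be undone
    print(state, "->", swapped)
```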

A handy device

Ray and Crutchfield have described how this idea could be implemented in a practical device, specifically, in superconducting flux quantum bits, or qubits, which are the standard bits used for most quantum computers today. “We are being parasites on the quantum computing community!” Crutchfield cheerfully admits. These devices consist of loops of superconducting material interrupted by structures called Josephson junctions (JJs), where a thin layer of a non-superconducting material is sandwiched between two superconductors.

Information in JJ circuits is usually encoded in the direction of the circulating supercurrent, which can be switched by microwave radiation. But because supercurrents carry momentum, they can also be used for momentum computing. Ray and Crutchfield performed simulations suggesting that, under certain conditions, JJ circuits should be able to support their momentum computing approach. If cooled to liquid-helium temperatures, the circuit could perform a single bit-swap operation in less than 15 nanoseconds.

“While our proposal is based on a specific substrate to be as concrete as possible and accurately estimate the energies required,” says Crutchfield, “the proposal is much more general than that.” It should work, in principle, with normal electronic circuits (albeit cryogenically cooled) or even with small, carefully isolated mechanical devices that can carry momentum (and thus perform calculations) in their moving parts. However, an approach with superconducting bits might be particularly suitable, says Crutchfield, because “it’s a familiar microtechnology that’s known to scale up very well.”

Crutchfield should know: Working with Michael Roukes and his collaborators at the California Institute of Technology, Crutchfield previously measured the cost of erasing a bit in a JJ device and showed that it is close to the Landauer limit. In the 1980s, Crutchfield and Roukes even served as consultants for IBM’s attempt to build a reversible JJ computer, which was eventually abandoned due to what, at the time, were overly demanding manufacturing requirements.

Follow the bouncing ball

Harnessing the velocity of a particle for computation is not an entirely new idea. Momentum computing is closely related to a reversible computing concept called ballistic computing that was proposed in the 1980s: in it, information is encoded in objects or particles that move freely through circuits under their own inertia, carrying some signal that is used repeatedly to execute many logical operations. If a particle interacts elastically with others, it loses no energy in the process. In such a device, once the ballistic bits have been “launched,” they alone power the computation without any further energy input. The computation is reversible as long as the bits keep bouncing along their paths. Information is erased, and energy is dissipated, only when their states are read out.
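A toy model shows why such collisions cost nothing: in a one-dimensional elastic collision between equal-mass particles, the particles simply exchange velocities, kinetic energy is conserved, and running the motion backward recovers the initial state exactly. The numbers below are arbitrary illustrative values:

```python
# Toy model of a "ballistic" interaction (arbitrary illustrative values):
# a 1-D elastic collision between equal-mass particles exchanges their
# velocities, conserves kinetic energy, and can be run backward exactly.

def elastic_collision(v1: float, v2: float) -> tuple[float, float]:
    """Equal-mass, 1-D elastic collision: the particles swap velocities."""
    return v2, v1

v1, v2 = 3.0, -1.0                      # two ballistic bits launched toward each other
w1, w2 = elastic_collision(v1, v2)

assert v1**2 + v2**2 == w1**2 + w2**2   # kinetic energy unchanged: no heat dissipated
assert elastic_collision(-w1, -w2) == (-v1, -v2)  # time-reversed collision undoes the step
print(w1, w2)                           # -1.0 3.0
```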

Whereas in ballistic computing a particle’s velocity simply carries it through the device, allowing the particle to ferry information from input to output, says Crutchfield, in momentum computing a particle’s velocity and position together let it embody a unique and unambiguous sequence of states during a calculation. That property is the key to reversibility, and thus to low dissipation, he adds, because it reveals exactly where each particle has been.

Researchers, including Frank, have been working on ballistic reversible computing for decades. One challenge is that, in its original proposal, ballistic computing is dynamically unstable because, for example, particle collisions can be chaotic and therefore exquisitely sensitive to the smallest random fluctuations, which makes them irreversible. But researchers have made progress in tackling these problems. In a recent preprint, Kevin Osborn and Waltraut Wustmann, both at the University of Maryland, proposed that JJ circuits could be used to build a reversible ballistic logic circuit called a shift register, in which the output of one logic gate becomes the input of the next in a series of flip-flop operations.

“Superconducting circuits are a good platform for testing reversible circuits,” says Osborn. His JJ circuits, he adds, appear to be very close to those stipulated by Ray and Crutchfield and thus might be the best candidates for testing their idea.

“I would say that all of our groups have been working from the intuition that these methods can achieve a better trade-off between efficiency and speed than traditional reversible computing approaches,” says Frank. Ray and Crutchfield “have probably done the most comprehensive job yet of demonstrating this at the level of theory and simulation of individual devices.” Still, Frank cautions, all of the various approaches to ballistic and momentum computing “are still a long way from becoming a practical technology.”

Crutchfield is more optimistic. “It really depends on getting people to support the scale-up,” he says. He thinks small, low-dissipation momentum computing JJ circuits could be feasible within a couple of years, with full microprocessors debuting within this decade. Ultimately, he anticipates that consumer-grade momentum computing could deliver energy-efficiency gains of a factor of 1,000 or more over current approaches. “Imagine [if] your Google server farm housed in a giant warehouse and using 1,000 kilowatts for computing and cooling [was instead] reduced to just one kilowatt, equivalent to several incandescent light bulbs,” says Crutchfield.

But the benefits of the new approach, Crutchfield says, could go beyond a practical reduction in energy costs. “Momentum computing will lead to a conceptual shift in how we view information processing in the world,” he says, including how information is processed in biological systems.
