Balancing error and dissipation in computing
Published Web Location: https://doi.org/10.1103/physrevresearch.2.033524
Modern digital electronics support remarkably reliable computing, especially given the challenge of controlling nanoscale logical components that interact in fluctuating environments. However, we demonstrate that the high-reliability limit is subject to a fundamental error-energy-efficiency tradeoff that arises from time-symmetric control: Requiring a low probability of error causes energy consumption to diverge as the logarithm of the inverse error rate for nonreciprocal logical transitions. The reciprocity (self-invertibility) of a computation is a stricter condition for thermodynamic efficiency than logical reversibility (invertibility), the latter being the root of Landauer's work bound on erasing information. Beyond engineered computation, the results identify a generic error-dissipation tradeoff in steady-state transformations of genetic information carried out by biological organisms. The lesson is that computational dissipation under time-symmetric control cannot reach, and is often far above, the Landauer limit. In this way, time asymmetry becomes a design principle for thermodynamically efficient computing.
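The logarithmic divergence stated in the abstract can be sketched numerically. The snippet below, a minimal illustration rather than the paper's actual derivation, compares a dissipation cost scaling as ln(1/ε) against the Landauer bound of k_B T ln 2 per erased bit; the unit prefactor of k_B T on the time-symmetric cost is an assumption made purely for illustration.

```python
import math

def landauer_bound():
    # Minimum work to erase one bit (Landauer limit), in units of k_B T.
    return math.log(2)

def time_symmetric_cost(error_rate):
    # Illustrative scaling from the abstract: under time-symmetric control,
    # dissipation for a nonreciprocal transition grows as ln(1/error_rate).
    # The k_B T prefactor of 1 is an assumption for illustration only.
    return math.log(1.0 / error_rate)

# As the required error rate shrinks, the time-symmetric cost grows without
# bound, while the Landauer limit stays fixed at ln 2 ≈ 0.693 k_B T.
for eps in (1e-2, 1e-6, 1e-12):
    print(f"error rate {eps:.0e}: cost {time_symmetric_cost(eps):.2f} k_B T "
          f"vs Landauer {landauer_bound():.2f} k_B T")
```

Note how even a modest target error rate of 10^-6 already forces a cost an order of magnitude above the Landauer limit under this scaling.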