Landauer's Principle states that the energy cost of information processing must exceed the product of the temperature, Boltzmann's constant, and the change in Shannon entropy of the information-bearing degrees of freedom. However, this lower bound is achievable only for quasistatic, near-equilibrium computations; that is, only in the limit of infinite time. In practice, information processing takes place in finite time, resulting in dissipation and potentially unreliable logical outcomes. For overdamped Langevin dynamics, we show that counterdiabatic potentials can be crafted to guide systems rapidly and accurately along desired computational paths, providing shortcuts that allow for the precise design of finite-time computations. Such shortcuts require additional work, beyond Landauer's bound, that is irretrievably dissipated into the environment. We show that this dissipated work is proportional to the computation rate and to the square of the information-storing system's length scale. As a paradigmatic example, we design shortcuts to create, erase, and transfer a bit of information metastably stored in a double-well potential. Though dissipated work generally increases with operation fidelity, we show that it is possible to compute with perfect fidelity in finite time at finite work cost. We also show that the robustness of information storage affects an operation's energetic cost: specifically, the dissipated work scales as the information lifetime of the bistable system. Our analysis exposes a rich and nuanced relationship among work, speed, the size of the information-bearing degrees of freedom, storage robustness, and the difference between initial and final informational statistics.
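For concreteness, the following is a minimal sketch of how a counterdiabatic potential can be constructed for one-dimensional overdamped dynamics, using the standard flow-field formulation; the symbols $\gamma$ (friction coefficient), $U_0$ (reference information-storing potential), $\rho(x,t)$ (target distribution), $v(x,t)$ (flow velocity), $U_{\mathrm{CD}}$ (counterdiabatic potential), $\tau$ (protocol duration), and $L$ (transport length scale) are notation introduced here for illustration and need not match the conventions used in the body of the paper.

With the target distribution taken as the instantaneous equilibrium of the reference potential, $\rho(x,t) \propto e^{-U_0(x,t)/k_B T}$, define the flow velocity through the continuity equation,
\begin{equation}
  \partial_t \rho + \partial_x(\rho v) = 0
  \quad\Longrightarrow\quad
  v(x,t) = -\frac{1}{\rho(x,t)}\,\partial_t\!\int_{-\infty}^{x} \rho(x',t)\,\mathrm{d}x' .
\end{equation}
Evolving the system under the total potential $U_0 + U_{\mathrm{CD}}$ with
\begin{equation}
  \partial_x U_{\mathrm{CD}}(x,t) = -\gamma\, v(x,t)
\end{equation}
makes the overdamped Fokker--Planck dynamics, $\gamma\,\partial_t\rho = \partial_x\!\left[\rho\,\partial_x(U_0+U_{\mathrm{CD}}) + k_B T\,\partial_x\rho\right]$, reproduce $\rho(x,t)$ exactly at every instant. The accompanying mean dissipated work is
\begin{equation}
  \langle W_{\mathrm{diss}}\rangle = \gamma \int_0^{\tau}\!\mathrm{d}t \int\!\mathrm{d}x\;\rho(x,t)\,v(x,t)^2
  \;\sim\; \frac{\gamma L^2}{\tau} ,
\end{equation}
since transporting probability over a length $L$ in a time $\tau$ gives $v \sim L/\tau$. This recovers the scaling stated above: dissipated work linear in the computation rate $1/\tau$ and quadratic in the length scale $L$ of the information-bearing degrees of freedom.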