Many paradigms in cognitive science posit that human learning is characterized by a limited capacity to represent the information relevant to a given task. We argue that excess capacity -- using more representational resources than the task at hand requires -- is a plausible alternative paradigm for the study of human learning. Leveraging recent results from machine learning, we show that excess capacity can be consistent with high predictive ability. We also review extant findings from the cognitive science literature, demonstrating that excess capacity learning can account for a range of empirical phenomena, such as humans' simultaneous yet apparently contradictory tendencies to memorize individual observations and to capture higher-level patterns in them. We conclude by discussing promising directions for future inquiry under the excess capacity learning paradigm.