Modeling and inference are central to most areas of science, and especially to evolving and complex systems. Critically, the information we have is often uncertain and insufficient, resulting in an underdetermined inference problem: multiple inferences, models, and theories are consistent with the available information. Information theory, in particular the maximum information entropy formalism, provides a way to deal with such complexity. It has been applied to numerous problems, within and across many disciplines, over the last few decades. In this perspective, we review the historical development of this procedure, provide an overview of the many applications of maximum entropy and its extensions to complex systems, and discuss in more detail some recent advances in constructing comprehensive theory based on this inference procedure. We also discuss efforts at the frontier of information-theoretic inference: application to complex dynamic systems with time-varying constraints, such as highly disturbed ecosystems or rapidly changing economies.
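To make the maximum entropy formalism concrete, the sketch below works through Jaynes's classic dice illustration (not from this text): given only a constraint on the mean of a six-sided die, the least-biased distribution maximizes Shannon entropy subject to that constraint, which yields probabilities of exponential form p_i ∝ exp(-λ x_i), with the Lagrange multiplier λ chosen to satisfy the constraint. The target mean of 4.5 is an illustrative choice.

```python
import numpy as np
from scipy.optimize import brentq

# Jaynes's dice illustration: find the maximum-entropy distribution over
# faces 1..6 whose mean is constrained to 4.5 (a fair die would give 3.5).
# Maximizing entropy subject to a mean constraint gives p_i ∝ exp(-lam * x_i),
# where lam is the Lagrange multiplier enforcing the constraint.
x = np.arange(1, 7)
target_mean = 4.5  # illustrative constraint, not from the text

def mean_given(lam):
    """Mean of the exponential-family distribution for a given multiplier."""
    w = np.exp(-lam * x)
    p = w / w.sum()
    return p @ x

# Solve the one-dimensional constraint equation for lam;
# the bracket [-5, 5] safely contains the root for this target mean.
lam = brentq(lambda l: mean_given(l) - target_mean, -5.0, 5.0)
p = np.exp(-lam * x)
p /= p.sum()

print(np.round(p, 4))   # probabilities, skewed toward the high faces
print(p @ x)            # recovers the constrained mean, 4.5
```

Because the constraint (mean above 3.5) pulls probability toward higher faces, the resulting distribution is monotonically increasing in the face value; with no constraint beyond normalization, the same procedure would return the uniform distribution.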