Renewal processes are broadly used to model stochastic behavior consisting of
isolated events separated by periods of quiescence, whose durations are specified by a
given probability law. Here, we identify the minimal sufficient statistic for their
prediction (the set of causal states), calculate the historical memory capacity required to
store those states (statistical complexity), delineate what information is predictable
(excess entropy), and decompose the entropy of a single measurement into that shared with
the past, future, or both. The causal-state equivalence relation defines a new subclass of
renewal processes that have a finite number of causal states even though their interevent
count distribution has unbounded support. We use these formulae to analyze the output of
the parametrized Simple Nonunifilar Source, a process generated by a simple two-state
hidden Markov model but requiring an infinite-state epsilon-machine presentation. All in all, the results lay
the groundwork for analyzing processes with infinite statistical complexity and infinite
excess entropy.
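
As a rough illustration of the process named above, the following minimal Python sketch simulates one common parametrization of the Simple Nonunifilar Source as a two-state hidden Markov model and tabulates the interevent counts (runs of 0s between successive 1s) that make its output a renewal process. The transition and emission structure, the parameters p and q, and the helper names simulate_sns and interevent_counts are illustrative assumptions, not necessarily the parametrization used in the paper.

import random
from collections import Counter

def simulate_sns(p=0.5, q=0.5, n_steps=100_000, seed=0):
    """Simulate one common parametrization of the Simple Nonunifilar Source.

    Hidden states A and B. From A: emit 0 and stay in A (prob p), or emit 0
    and move to B (prob 1 - p); nonunifilar because symbol 0 from A does not
    determine the next hidden state. From B: emit 0 and stay in B (prob q),
    or emit 1 and return to A (prob 1 - q).
    """
    rng = random.Random(seed)
    state, symbols = "A", []
    for _ in range(n_steps):
        if state == "A":
            if rng.random() < p:
                symbols.append(0)        # A -> A on symbol 0
            else:
                symbols.append(0)        # A -> B on symbol 0
                state = "B"
        else:
            if rng.random() < q:
                symbols.append(0)        # B -> B on symbol 0
            else:
                symbols.append(1)        # B -> A on symbol 1
                state = "A"
    return symbols

def interevent_counts(symbols):
    """Numbers of 0s between successive 1s: the renewal process's interevent counts."""
    counts, run, started = [], 0, False
    for s in symbols:
        if s == 1:
            if started:
                counts.append(run)
            started, run = True, 0
        else:
            run += 1
    return counts

if __name__ == "__main__":
    seq = simulate_sns(p=0.5, q=0.5)
    counts = interevent_counts(seq)
    dist = Counter(counts)
    total = len(counts)
    print("Empirical interevent-count distribution (smallest counts):")
    for n in sorted(dist)[:8]:
        print(f"  F({n}) ~ {dist[n] / total:.4f}")

The empirical distribution produced this way is only a simulation-based check; the paper's formulae operate directly on the interevent count distribution itself.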