Identifying and quantifying memory are often critical steps in developing a
mechanistic understanding of stochastic processes. These are particularly challenging and
necessary when exploring processes that exhibit long-range correlations. The most common
signatures employed rely on second-order temporal statistics and lead, for example, to
identifying long memory in processes with a power-law autocorrelation function and a Hurst
exponent greater than $1/2$. However, most stochastic processes hide their memory in
higher-order temporal correlations. Information measures---specifically, divergences in the
mutual information between a process' past and future (excess entropy) and minimal
predictive memory stored in a process' causal states (statistical complexity)---provide a
different way to identify long memory in processes with higher-order temporal correlations.
However, there are no ergodic stationary processes with infinite excess entropy for which
information measures have been compared to autocorrelation functions and Hurst exponents.
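To make the excess entropy concrete: it is the mutual information between a process' past and future, and in practice it can be approximated by the mutual information between adjacent length-$L$ blocks as $L$ grows. The sketch below (our own illustration, not the paper's method; all function names are ours) computes a plug-in estimate on a simulated binary Markov chain, for which $L = 1$ already suffices.

```python
import numpy as np
from collections import Counter

def plugin_entropy(symbols):
    """Plug-in (maximum-likelihood) Shannon entropy in bits."""
    counts = np.array(list(Counter(symbols).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def block_mutual_info(x, L):
    """Estimate I(past L symbols; future L symbols) in bits.

    For well-behaved processes this converges to the excess entropy
    as L grows; for an order-1 Markov chain, L = 1 is enough."""
    past = [tuple(x[i - L:i]) for i in range(L, len(x) - L + 1)]
    future = [tuple(x[i:i + L]) for i in range(L, len(x) - L + 1)]
    joint = list(zip(past, future))
    return plugin_entropy(past) + plugin_entropy(future) - plugin_entropy(joint)

def sticky_markov_chain(p_stay, n, rng):
    """Binary Markov chain that repeats its last symbol w.p. p_stay."""
    x = np.empty(n, dtype=int)
    x[0] = 0
    flips = rng.random(n) >= p_stay
    for t in range(1, n):
        x[t] = 1 - x[t - 1] if flips[t] else x[t - 1]
    return x
```

For a fair coin the estimate is near $0$ bits, while for $p_\text{stay} = 0.9$ it approaches $1 - H_2(0.9) \approx 0.53$ bits; for the fractal renewal processes studied here the analogous block estimates fail to converge exactly when the excess entropy diverges.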
Here, we show that fractal renewal processes---those with interevent distribution tails
$\propto t^{-\alpha}$---exhibit long memory via a phase transition at $\alpha = 1$. The excess
entropy diverges only at $\alpha = 1$, whereas the statistical complexity diverges both there
and for all $\alpha < 1$. Even when these processes have a power-law autocorrelation function
and a Hurst exponent greater than $1/2$, their excess entropy does not diverge. This analysis breaks the
intuitive association between these different quantifications of memory. We hope that the
methods used here, based on causal states, offer a guide to constructing and analyzing
other long-memory processes.
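As a companion sketch (again our own illustration, with hypothetical function names, not the paper's construction), one can simulate a renewal process with Pareto interevent times, whose tail is $\propto t^{-\alpha}$, bin the events into counts, and estimate the Hurst exponent of the counting series with the aggregated-variance method; for $1 < \alpha < 2$ the counting process is expected to show $H = (3 - \alpha)/2 > 1/2$.

```python
import numpy as np

def renewal_counts(alpha, n_events, n_bins, rng):
    """Bin the events of a renewal process with shifted-Pareto
    interevent times (tail ~ t^{-alpha}) into per-bin counts."""
    gaps = rng.pareto(alpha, n_events) + 1.0  # minimum gap of 1 (an assumption)
    times = np.cumsum(gaps)
    edges = np.linspace(0.0, times[-1], n_bins + 1)
    counts, _ = np.histogram(times, edges)
    return counts

def hurst_aggregated_variance(x, scales):
    """Hurst estimate from the scaling Var(block means at scale m) ~ m^{2H - 2}."""
    log_var = []
    for m in scales:
        n = len(x) // m
        block_means = x[: n * m].reshape(n, m).mean(axis=1)
        log_var.append(np.log(block_means.var()))
    slope, _ = np.polyfit(np.log(scales), log_var, 1)
    return 1.0 + slope / 2.0
```

For independent counts (e.g., a Poisson sequence) the estimate sits near $1/2$; a heavy tail with $\alpha < 2$ pushes it above $1/2$. The finite-sample estimator is biased for heavy-tailed data, so the numbers should be read qualitatively, which is precisely why the causal-state analysis above is a sharper diagnostic of memory.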