The utility of limited feedback for coding over an individual sequence of DMCs is investigated. This study complements recent results showing how limited or noisy feedback can boost the reliability of communication. A strategy with fixed input distribution $P$ is given that asymptotically achieves rates arbitrarily close to the mutual information induced by $P$ and the state-averaged channel. When the capacity-achieving input distribution is the same over all channel states, this achieves rates at least as large as the capacity of the state-averaged channel, sometimes called the empirical capacity.
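For concreteness, a minimal sketch of the quantities involved, under the assumption that the channel at time $t$ is a DMC $W_{s_t}$ chosen by an arbitrary state sequence $s^n = (s_1,\dots,s_n)$; the symbols $\bar{W}$, $I(P,\bar{W})$, and $C_{\mathrm{emp}}$ are introduced here only for illustration and are not fixed by the text above:
\[
\bar{W}(y \mid x) \;=\; \frac{1}{n}\sum_{t=1}^{n} W_{s_t}(y \mid x),
\qquad
I(P,\bar{W}) \;=\; \sum_{x,y} P(x)\,\bar{W}(y \mid x)\,
\log \frac{\bar{W}(y \mid x)}{\sum_{x'} P(x')\,\bar{W}(y \mid x')} .
\]
Under this reading, the scheme attains rates approaching $I(P,\bar{W})$ for the chosen $P$, and when a single input distribution is capacity-achieving for every state, it attains at least $C_{\mathrm{emp}} = \max_P I(P,\bar{W})$, the capacity of the state-averaged channel.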