In this paper we present a method for synthesizing textures on animated liquid surfaces generated by a physically based fluid simulation system. Rather than advecting texture coordinates on the surface, our algorithm synthesizes a new texture for every frame using an optimization procedure that attempts to match the surface texture to an input sample texture. This per-frame synthesis allows our method to overcome the discontinuities and distortions that accumulate in an advected parameterization. We achieve temporal coherence by initializing the surface texture with color values advected from the surface at the previous frame and by including these colors in the energy function used during optimization.
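To make the two competing objectives concrete, the sketch below shows a toy energy of the kind described above: a texture term that pulls each surface point's color toward its nearest match in the input sample, plus a coherence term that penalizes deviation from the colors advected from the previous frame. The function names, the least-squares form, and the weight `lam` are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def synthesis_energy(surface_colors, sample_colors, advected_colors, lam=0.1):
    """Toy per-frame synthesis energy (illustrative, not the paper's exact form).

    surface_colors:  (n, 3) colors currently assigned to surface points
    sample_colors:   (m, 3) colors drawn from the input sample texture
    advected_colors: (n, 3) colors advected from the previous frame
    lam:             weight of the temporal coherence term
    """
    # Texture term: squared distance from each surface color to its
    # nearest neighbor in the sample texture.
    diffs = surface_colors[:, None, :] - sample_colors[None, :, :]
    nearest = np.min(np.sum(diffs ** 2, axis=2), axis=1)
    texture_term = np.sum(nearest)
    # Coherence term: penalize drifting away from the advected colors,
    # which also serve as the optimizer's initialization.
    coherence_term = np.sum((surface_colors - advected_colors) ** 2)
    return texture_term + lam * coherence_term
```

Initializing the optimization with `advected_colors` makes the coherence term zero at the start, so the optimizer only moves colors where doing so improves the match to the sample texture.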