Storing knowledge of an agent’s environment in the form of a probabilistic generative model has been established as a crucial ingredient in a multitude of cognitive tasks. Perception has been formalised as probabilistic inference over the state of latent variables, whereas in decision making the model of the environment is used to predict likely consequences of actions. Such generative models have earlier been proposed to underlie semantic memory, but it has remained unclear whether this model also underlies the efficient storage of experiences in episodic memory. We formalise the compression of episodes in the normative framework of information theory and argue that semantic memory provides the distortion function for the compression of experiences. Recent advances and insights from machine learning allow us to approximate semantic compression in naturalistic domains and to contrast the resulting deviations in compressed episodes with memory errors observed in the experimental literature on human memory.