Performance models of storage devices are an important component of simulations of large-scale computing systems. Storage devices are traditionally modeled with discrete event simulation, which is expensive in computation, memory, and configuration effort. Configuration alone can take months, and the model itself requires intimate knowledge of the device's internal layout. The difficulty of white-box model creation has led to the present situation, in which no up-to-date, precise models exist. Automatically learning device behavior is a far more attractive approach, requiring less expert knowledge, fewer assumptions, and less time. Other researchers have built behavioral models of storage device performance, but none have demonstrated low per-request errors. Using only a few high-level, domain-specific assumptions, such as spatial periodicity and the existence of a logical-to-physical mapping, neural nets can learn to predict access times with low per-request error. Providing the neural net with specific sinusoidal inputs lets it generate the periodic output this problem requires. Weight sharing in the neural net exploits regularities in the structure of the input, reducing both data requirements and error by cutting the number of free parameters. A trigonometric change of variables in the output of the neural net removes a discontinuity in the objective function, making the problem more amenable to neural nets and further reducing error. Combining these approaches, we demonstrate that a neural net can predict access times with a mean absolute error of about 0.187 ms over a small portion of a hard disk drive, and about 2.120 ms over half of a hard disk drive.
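As an illustration of two of the ideas above, the following is a minimal NumPy sketch of sinusoidal input features and a trigonometric output change of variables. The period, harmonic count, and revolution time used here are assumed placeholder values for illustration, not parameters taken from the paper's model.

```python
import numpy as np

PERIOD = 1024  # hypothetical spatial period in logical blocks (assumed)

def sinusoidal_features(lba, period=PERIOD, harmonics=3):
    """Encode a logical block address as sin/cos features so a neural
    net fed these inputs can represent functions that are periodic
    in the LBA."""
    ks = np.arange(1, harmonics + 1)
    phase = 2.0 * np.pi * np.outer(np.atleast_1d(lba).astype(float), ks) / period
    return np.concatenate([np.sin(phase), np.cos(phase)], axis=-1)

# Periodicity: an LBA and that LBA shifted by one period map to
# identical features, so the net's output repeats with the period.
f = sinusoidal_features([100, 100 + PERIOD])
assert np.allclose(f[0], f[1])

def phase_to_time(s, c, revolution=8.33e-3):
    """Recover a rotational-offset time from a predicted (sin, cos)
    pair. Predicting the pair instead of the offset itself avoids
    the discontinuity at the wrap-around point; revolution is an
    assumed ~7200 RPM period."""
    theta = np.arctan2(s, c) % (2.0 * np.pi)
    return theta / (2.0 * np.pi) * revolution
```

The two directions mirror each other: sinusoids on the input side make periodic targets expressible, while the (sin, cos) parameterization on the output side makes a periodic quantity continuous for the loss function.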