Smartphone sensors are becoming increasingly ubiquitous and accurate. In this paper, we aim to distinguish between four common positions or states a phone can be in: in the hand, in a pocket, in a backpack, or on a table. Using a custom-designed neural network and data from the accelerometer and the screen state, we achieve 92% accuracy when training and testing on the same phone. We also explore extending this approach to different phones and propose an acceleration calibration technique to do so.