The complex biomechanics of the human body is capable of generating a virtually unlimited repertoire of movements, which on the one hand yields highly versatile motor behavior but on the other presents a formidable control problem for the brain. Understanding the computational process that allows us to perform a wide range of motor tasks easily and with a high degree of coordination is of central interest to both neuroscience and robotic control. In recent decades, it has become widely accepted that many observed movement features can be understood as the result of the brain optimizing motor behavior. However, a major challenge for optimal control models is deciding how the motor task should be represented. In this thesis, I (1) compare the existing representations in the literature, (2) derive a novel formulation using principles and analytic tools from physics, (3) predict previously unreported features of human movements, and (4) confirm them in psychophysics experiments. I also explore implementing the optimal controller with artificial neural networks, which may potentially lead to a deeper understanding of neural computation. The obtained controller generates realistic target-directed hand movements in real time and reproduces many of the reported neurophysiological findings in the primary motor cortex.