The Dual of the Least-Squares Method
Abstract
The least-squares method was firmly established as a scientific approach by Gauss, Legendre and Laplace within the space of a decade, at the beginning of the nineteenth century. Legendre was the first author to name the approach, in 1805, the "méthode des moindres carrés," the "method of least squares." Gauss, however, is credited with having used it as early as 1795, when he was 18 years old. He subsequently applied it in 1801 to calculate the orbit of the newly discovered planet Ceres. Gauss published his way of looking at the least-squares approach in 1809 and gave several hints that the least-squares estimator was a minimum-variance linear estimator and that it was derivable from maximum likelihood considerations. Laplace wrote a very substantial chapter about the method in his fundamental treatise on probability theory published in 1812. Surprisingly, one aspect of the least-squares method remains unexplored: since the traditional formulation is stated as minimizing the sum of squared deviations subject to the linear (or nonlinear) specification of a regression model, this mathematical programming problem must have a dual counterpart. This note fills that gap and shows that the least-squares estimates of the unknown parameters and deviations can be obtained by maximizing the net value of sample information.
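For concreteness, the dual suggested by this formulation can be sketched as follows. This is a minimal derivation under the standard linear-model assumptions y = Xβ + e; the multiplier symbol λ and the exact form of the dual objective are illustrative and need not match the paper's own notation.

\[
\begin{aligned}
\text{Primal:}\quad & \min_{\beta,\,e}\ \tfrac{1}{2}\,e^{\top}e
\quad \text{subject to}\quad y = X\beta + e,\\[4pt]
\text{Lagrangean:}\quad & L(\beta,e,\lambda) \;=\; \tfrac{1}{2}\,e^{\top}e \;+\; \lambda^{\top}\!\left(y - X\beta - e\right),\\[4pt]
\text{Stationarity:}\quad & e = \lambda, \qquad X^{\top}\lambda = 0,\\[4pt]
\text{Dual:}\quad & \max_{\lambda}\ \lambda^{\top}y \;-\; \tfrac{1}{2}\,\lambda^{\top}\lambda
\quad \text{subject to}\quad X^{\top}\lambda = 0.
\end{aligned}
\]

On this reading, the dual multipliers coincide with the least-squares residuals at the optimum, and the dual objective λᵀy − ½λᵀλ is one way to interpret the "net value of sample information" mentioned in the abstract, with λᵀy playing the role of a gross value of the sample and ½λᵀλ its cost.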