The Least Squares Method
The least squares method is a mathematical optimization technique that finds the best-fitting function for a set of data by minimizing the sum of squared errors. It allows unknown parameters to be estimated quickly, such that the sum of squared errors between the fitted values and the actual data is as small as possible.
The form of the least squares method
The principle of the least squares method is:
Objective function = ∑ (observed value − theoretical value)²
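As a minimal sketch, the objective is just a sum of squared differences. The data, the candidate line f, and its parameters a and b below are all made up for illustration:

    import numpy as np

    # Hypothetical observed data (illustrative only).
    x = np.array([0.0, 1.0, 2.0, 3.0])
    observed = np.array([1.1, 2.9, 5.2, 6.8])

    def f(x, a, b):
        # A candidate theoretical model: a straight line.
        return a * x + b

    # Objective = sum of squared errors for this choice of parameters.
    residuals = observed - f(x, a=2.0, b=1.0)
    objective = np.sum(residuals ** 2)
    print(objective)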
The least squares method is the standard method for obtaining approximate solutions to overdetermined systems (systems with more equations than unknowns), as used in regression analysis. In the overall solution, each equation contributes a residual, and least squares selects the solution that minimizes the sum of the squares of these residuals.
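For example, a system of three equations in two unknowns generally has no exact solution; NumPy's np.linalg.lstsq returns the values that minimize the sum of squared residuals (the numbers below are made up for illustration):

    import numpy as np

    # Overdetermined system A @ w = b: three equations, two unknowns.
    A = np.array([[1.0, 1.0],
                  [1.0, 2.0],
                  [1.0, 3.0]])
    b = np.array([1.0, 2.1, 2.9])

    # The least squares solution minimizes ||A @ w - b||^2.
    w, residual_ss, rank, sv = np.linalg.lstsq(A, b, rcond=None)
    print(w)            # fitted unknowns
    print(residual_ss)  # sum of squared residuals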
Application of the least squares method
The least squares method is most often used for curve fitting: the least-squares best fit minimizes the sum of squared residuals, where a residual is the difference between an observed value and the fitted value provided by the model.
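As one common instance, fitting a polynomial to noisy points reduces to a linear least squares problem in the coefficients. A minimal sketch with synthetic data (the quadratic and the noise level are made up):

    import numpy as np

    # Synthetic noisy observations of a quadratic (illustrative only).
    rng = np.random.default_rng(0)
    x = np.linspace(0, 5, 20)
    y = 0.5 * x**2 - x + 2 + rng.normal(scale=0.3, size=x.size)

    # np.polyfit solves the linear least squares problem for the coefficients.
    coeffs = np.polyfit(x, y, deg=2)
    fitted = np.polyval(coeffs, x)
    print(coeffs)
    print(np.sum((y - fitted) ** 2))  # sum of squared residuals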
Least squares problems are usually divided into linear least squares and nonlinear least squares, depending on whether the residuals are linear in all of the unknowns.
Linear least squares problems often arise in statistical regression analysis and have a closed-form solution; nonlinear problems are usually solved by iterative refinement, with a linear approximation to the system at each iteration, so the core algorithm is the same in both cases.
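A minimal Gauss-Newton sketch makes this connection concrete: for a hypothetical nonlinear model y = a·exp(b·x), each iteration linearizes the residuals and solves an ordinary linear least squares problem for the parameter update (the data and starting values below are made up):

    import numpy as np

    # Made-up observations roughly following y = 2 * exp(0.5 * x).
    x = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
    y = np.array([2.0, 2.6, 3.3, 4.2, 5.5])

    a, b = 1.0, 1.0  # initial guess
    for _ in range(20):
        pred = a * np.exp(b * x)
        r = y - pred                       # residuals
        # Jacobian of the residuals with respect to (a, b).
        J = np.column_stack([-np.exp(b * x), -a * x * np.exp(b * x)])
        # Each Gauss-Newton step is itself a linear least squares solve.
        step, *_ = np.linalg.lstsq(J, -r, rcond=None)
        a, b = a + step[0], b + step[1]

    print(a, b)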
When the observations are from an exponential family and mild conditions are satisfied, the least squares and maximum likelihood estimates are identical.
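The Gaussian case shows why (a standard derivation, with independent errors and known variance σ² assumed here, not stated in this article):

    \log L(\theta) = -\frac{n}{2}\log(2\pi\sigma^2)
                     - \frac{1}{2\sigma^2}\sum_{i=1}^{n}\bigl(y_i - f(x_i;\theta)\bigr)^2

Since the first term does not depend on θ, maximizing the likelihood over θ is exactly minimizing the sum of squared residuals ∑ᵢ (yᵢ − f(xᵢ; θ))².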
Limitations of the least squares method
Problems with simple regression and least squares arise when the independent variables are measured with large uncertainty. In that case, an approach designed for errors-in-variables models, rather than ordinary least squares, should be considered for fitting the model.
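One such alternative is total least squares (orthogonal regression), which accounts for noise in x as well as y. A minimal SVD-based sketch for a straight-line fit, on synthetic data made up for illustration:

    import numpy as np

    # Synthetic data with noise in both x and y (illustrative only).
    rng = np.random.default_rng(1)
    t = np.linspace(0, 4, 30)
    x = t + rng.normal(scale=0.2, size=t.size)
    y = 1.5 * t + 0.5 + rng.normal(scale=0.2, size=t.size)

    # Total least squares line fit: the best line passes through the centroid,
    # and its normal is the right singular vector with the smallest singular value.
    xc, yc = x - x.mean(), y - y.mean()
    _, _, Vt = np.linalg.svd(np.column_stack([xc, yc]))
    nx, ny = Vt[-1]                    # normal direction of the fitted line
    slope = -nx / ny
    intercept = y.mean() - slope * x.mean()
    print(slope, intercept)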