In 1805 Legendre published the method of least squares: given a set of observations and a proposed mathematical relationship, find the parameters that minimise the sum of squared differences between observed and predicted values. This was the first systematic approach to model fitting.
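The objective Legendre described can be sketched in a few lines. This is an illustrative example (the data and parameter values are invented): fitting a line y = a + b·x by minimising the sum of squared differences between observed and predicted values, using NumPy's least-squares solver.

```python
import numpy as np

# Invented observations lying exactly on y = 2 + 3x, so least squares
# should recover the parameters up to floating-point error.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2.0 + 3.0 * x

# Design matrix: a column of ones for the intercept, plus x.
A = np.column_stack([np.ones_like(x), x])

# Solve min over beta of ||A @ beta - y||^2.
beta, _, _, _ = np.linalg.lstsq(A, y, rcond=None)
intercept, slope = beta
print(intercept, slope)  # approximately 2.0 and 3.0
```

With noisy data the recovered parameters are no longer exact, but they remain the unique minimisers of the squared-error criterion.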
Legendre developed the method to determine the orbits of comets from telescopic observations. Within decades it had spread to surveying, physics, and economics. Today, linear regression — a direct application of least squares — is arguably the most widely used statistical technique across the quantitative sciences.
Least squares is an optimisation problem: find the point in parameter space that minimises a quadratic loss function. This geometric perspective — models as surfaces, solutions as minima — generalises to neural networks, gradient descent, and modern machine learning. Legendre's method is the ancestor of every optimisation-based model.
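The link to gradient descent can be made concrete. The sketch below (illustrative data; the learning rate and iteration count are arbitrary choices) minimises the same quadratic loss two ways: once in closed form via the normal equations, and once by iteratively stepping down the loss surface — the approach that generalises to neural networks, where no closed form exists.

```python
import numpy as np

# Invented noisy observations of y = 1.5 + 0.5x.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 1.5 + 0.5 * x + 0.01 * rng.standard_normal(50)
A = np.column_stack([np.ones_like(x), x])

# Closed-form minimiser: solve the normal equations A^T A beta = A^T y.
beta_closed = np.linalg.solve(A.T @ A, A.T @ y)

# Gradient descent on the mean squared error.
# The gradient of ||A beta - y||^2 / n is (2/n) A^T (A beta - y).
beta = np.zeros(2)
lr = 0.1
for _ in range(20_000):
    beta -= lr * 2.0 * A.T @ (A @ beta - y) / len(y)

# Both routes land at the same minimum of the quadratic loss surface.
print(np.allclose(beta, beta_closed, atol=1e-6))  # True
```

Because the loss is quadratic, the surface is a bowl with a single minimum, so gradient descent with a small enough step size is guaranteed to converge to the closed-form solution.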
Factor models, regression-based scoring, and signal calibration all use least-squares estimation. When our systems fit models to financial data, they are performing the optimisation Legendre invented two centuries ago.