Linear least-squares problem.
Let A be an n×m matrix, c an m-component vector and b an
n-component vector. For n>m the equation
Ac = b
generally has no solution. However, one can find the "best possible"
least-squares solution, which minimizes the Euclidean norm of the
difference between the right- and the left-hand sides.
The problem can be solved by QR decomposition. The matrix A factorizes
as A = QR, where Q is an n×m matrix with orthonormal columns (QᵀQ = 1) and R is
an m×m upper triangular matrix. We can then minimize

  |Ac - b|² = |QRc - b|² = |Rc - Qᵀb|² + |(1 - QQᵀ)b|² ,

where only the first term depends on c, by solving the m×m system of linear equations

  Rc = Qᵀb

by simple back-substitution.
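Back-substitution itself is only a few lines; here is a minimal sketch in plain Python (the function name and the list-of-rows matrix representation are my own choices):

```python
def back_substitute(R, v):
    """Solve R c = v for c, where R is an m×m upper triangular matrix
    given as a list of rows. Proceeds from the last row upwards."""
    m = len(R)
    c = [0.0] * m
    for i in range(m - 1, -1, -1):
        # subtract the contributions of the already-known components c[i+1..m-1]
        s = v[i] - sum(R[i][k] * c[k] for k in range(i + 1, m))
        c[i] = s / R[i][i]
    return c
```

For example, back_substitute([[2.0, 1.0], [0.0, 3.0]], [4.0, 6.0]) returns [1.0, 2.0].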
The QR decomposition can be performed by the modified Gram-Schmidt
orthogonalization method with pivoting.
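A sketch of the modified Gram-Schmidt factorization in plain Python, under the assumption n ≥ m and full column rank; pivoting, mentioned above, is omitted here for brevity:

```python
def qr_gram_schmidt(A):
    """Modified Gram-Schmidt QR of an n×m matrix A (list of rows), n >= m.
    Returns Q (n×m, orthonormal columns) and R (m×m upper triangular)."""
    n, m = len(A), len(A[0])
    Q = [[A[i][j] for j in range(m)] for i in range(n)]  # work on a copy
    R = [[0.0] * m for _ in range(m)]
    for j in range(m):
        # normalize column j
        R[j][j] = sum(Q[i][j] ** 2 for i in range(n)) ** 0.5
        for i in range(n):
            Q[i][j] /= R[j][j]
        # orthogonalize the remaining columns against column j
        for k in range(j + 1, m):
            R[j][k] = sum(Q[i][j] * Q[i][k] for i in range(n))
            for i in range(n):
                Q[i][k] -= R[j][k] * Q[i][j]
    return Q, R
```

The "modified" variant projects each later column against the already-orthonormalized column immediately, which is numerically more stable than the classical version.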
This problem arises, for example, if n data points yᵢ with errors
σᵢ are to be fitted by a linear combination of m functions,
F(x) = ∑ₖ cₖ fₖ(x).
The best fit is the one which minimizes the squared deviation (χ², chi squared),

  χ² = ∑ᵢ ( [yᵢ - F(xᵢ)] / σᵢ )² .
One can recognise here the above problem Ac = b with

  Aᵢₖ = fₖ(xᵢ)/σᵢ ,   bᵢ = yᵢ/σᵢ ,

with the formal solution c = R⁻¹Qᵀb. The error Δcᵢ of the coefficient
cᵢ can then be estimated by propagating the errors σₖ of the data,

  Δcᵢ² = ∑ₖ (∂cᵢ/∂yₖ)² σₖ² = ∑ₖ (∂cᵢ/∂bₖ)² ,

where the last equality follows from bₖ = yₖ/σₖ.
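To make the construction of A and b concrete, here is a small sketch with made-up data and the basis {1, x} (all the numerical values are hypothetical):

```python
# hypothetical data points (x_i, y_i) with errors sigma_i
xs     = [0.0, 1.0, 2.0, 3.0]
ys     = [1.1, 1.9, 3.2, 3.9]
sigmas = [0.1, 0.1, 0.2, 0.2]

# basis functions f_k of the linear model F(x) = c_1*f_1(x) + c_2*f_2(x)
fs = [lambda x: 1.0, lambda x: x]

# A_ik = f_k(x_i)/sigma_i  and  b_i = y_i/sigma_i
A = [[f(x) / s for f in fs] for x, s in zip(xs, sigmas)]
b = [y / s for y, s in zip(ys, sigmas)]
```

Dividing each row by σᵢ is what makes minimizing |Ac - b|² identical to minimizing χ².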
The covariance matrix ⟨ΔcᵢΔcⱼ⟩ is then simply

  ⟨ΔcᵢΔcⱼ⟩ = [ (∂c/∂b)(∂c/∂b)ᵀ ]ᵢⱼ = [ R⁻¹R⁻ᵀ ]ᵢⱼ = [ (AᵀA)⁻¹ ]ᵢⱼ ,

since ∂c/∂b = R⁻¹Qᵀ and QᵀQ = 1.
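Given R⁻¹, the covariance matrix is a single matrix product. A sketch in plain Python (the 2×2 example values are made up):

```python
def covariance(Rinv):
    """Sigma = R^-1 R^-T, i.e. Sigma_ij = sum_k (R^-1)_ik (R^-1)_jk,
    for Rinv given as a list of rows."""
    m = len(Rinv)
    return [[sum(Rinv[i][k] * Rinv[j][k] for k in range(m)) for j in range(m)]
            for i in range(m)]

# example: R = [[2, 1], [0, 3]] has inverse [[1/2, -1/6], [0, 1/3]]
Sigma = covariance([[0.5, -1.0 / 6.0], [0.0, 1.0 / 3.0]])
```

The diagonal entries Sigma[i][i] are the squared errors Δcᵢ² of the fitted coefficients.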
The inverse R⁻¹ of a triangular matrix R can be calculated as

  (R⁻¹)ᵢᵢ = 1/Rᵢᵢ ,
  (R⁻¹)ᵢⱼ = -(R⁻¹)ᵢᵢ ∑ₖ₌ᵢ₊₁…ⱼ Rᵢₖ (R⁻¹)ₖⱼ ,  for j > i.
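The recursion above can be sketched in a few lines of Python, processing rows from the bottom up so that the entries (R⁻¹)ₖⱼ on the right-hand side are already known:

```python
def invert_upper_triangular(R):
    """Invert an m×m upper triangular matrix R (list of rows) using
    (R^-1)_ii = 1/R_ii and
    (R^-1)_ij = -(R^-1)_ii * sum_{k=i+1..j} R_ik (R^-1)_kj  for j > i."""
    m = len(R)
    S = [[0.0] * m for _ in range(m)]
    for i in range(m - 1, -1, -1):          # rows from bottom to top
        S[i][i] = 1.0 / R[i][i]
        for j in range(i + 1, m):
            S[i][j] = -S[i][i] * sum(R[i][k] * S[k][j]
                                     for k in range(i + 1, j + 1))
    return S
```

The result is again upper triangular, so only the entries with j ≥ i are ever filled in.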