Non-linear Optimization
- Least squares method
  - 1. Generalized pseudo-inverse
  - 2. Singular value decomposition (SVD)
- Reference
Least squares method
1. Generalized pseudo-inverse
Consider the linear system:

$$Ax=b,\quad A \in \mathbb{R}^{m\times n}$$
The solution can be written in general as:

$$x=A^+b$$
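As a quick numerical check (a sketch using NumPy; the matrix and right-hand side here are arbitrary examples), `np.linalg.pinv` computes $A^+$ directly and $x = A^+b$ matches NumPy's least-squares solver:

```python
import numpy as np

# Example system Ax = b with an arbitrary 3x2 matrix (overdetermined).
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
b = np.array([1.0, 2.0, 3.0])

# x = A^+ b: the least-squares solution via the pseudo-inverse.
x = np.linalg.pinv(A) @ b

# It agrees with NumPy's dedicated least-squares routine.
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x, x_lstsq))  # True
```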
Assume $\mathrm{rank}(A) = r$; then apply the full-rank (BC) decomposition:

$$A_{m\times n} = B_{m\times r}C_{r\times n}$$
The generalized pseudo-inverse of $A$ is:

$$A^{+} = C^T(CC^T)^{-1}(B^TB)^{-1}B^T$$
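The formula can be verified numerically. In this sketch, $B$ and $C$ are arbitrary full-rank factors chosen for illustration; the result is compared against `np.linalg.pinv`:

```python
import numpy as np

# Full-rank (BC) factorization example: A = B C with rank(A) = r = 2.
B = np.array([[1.0, 0.0],
              [2.0, 1.0],
              [0.0, 3.0]])            # 3x2, column full rank
C = np.array([[1.0, 1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0, 0.0]])  # 2x4, row full rank
A = B @ C                             # 3x4, rank 2

# A^+ = C^T (C C^T)^{-1} (B^T B)^{-1} B^T
A_pinv = C.T @ np.linalg.inv(C @ C.T) @ np.linalg.inv(B.T @ B) @ B.T

print(np.allclose(A_pinv, np.linalg.pinv(A)))  # True
```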
If $r = m$ (row full rank), one can take $B = E$ (the identity), so that $A = EA$.
If $r = n$ (column full rank), one can take $C = E$, so that $A = AE$.
Then, we can deduce that if matrix $A$ is row full rank, the generalized pseudo-inverse of $A$ is:

$$A^+=A^T(AA^T)^{-1}$$
If $A$ is column full rank, $A^+$ is:

$$A^+=(A^TA)^{-1}A^T$$
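Both special cases reduce to a single matrix inversion. A minimal sketch, using an arbitrary example matrix and its transpose to cover the two cases:

```python
import numpy as np

# Column-full-rank case: A^+ = (A^T A)^{-1} A^T.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])  # 3x2, rank 2 (column full rank)
A_pinv_col = np.linalg.inv(A.T @ A) @ A.T
print(np.allclose(A_pinv_col, np.linalg.pinv(A)))  # True

# Row-full-rank case: A^+ = A^T (A A^T)^{-1}; use the transpose (2x3, rank 2).
M = A.T
M_pinv_row = M.T @ np.linalg.inv(M @ M.T)
print(np.allclose(M_pinv_row, np.linalg.pinv(M)))  # True
```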
2. Singular value decomposition (SVD)
Define:

$$A=U\Sigma V^T,\quad A \in \mathbb{R}^{m\times n}$$
The least squares problem is:

$$\min_x \| Ax-b \|^2_2$$
When $m > n = \mathrm{rank}(A)$, the system is overdetermined:

$$A = (U_1,U_2) \begin{pmatrix} \Sigma_1 \\ 0 \end{pmatrix} V^T$$
where $U_1 \in \mathbb{R}^{m\times r}$. Then $x$ is given by:

$$x = V\Sigma_1^{-1}U_1^T b$$
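The overdetermined formula maps directly onto NumPy's thin SVD (a sketch with an arbitrary full-column-rank example; `full_matrices=False` returns exactly $U_1$, the singular values of $\Sigma_1$, and $V^T$):

```python
import numpy as np

# Overdetermined least squares via SVD: m > n = rank(A).
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])  # 4x2, rank 2
b = np.array([6.0, 5.0, 7.0, 10.0])

# Thin SVD: U1 is m x r, s holds the diagonal of Sigma1, Vt is r x n.
U1, s, Vt = np.linalg.svd(A, full_matrices=False)

# x = V Sigma1^{-1} U1^T b
x = Vt.T @ np.diag(1.0 / s) @ U1.T @ b

x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x, x_ref))  # True
```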
When $n > m = \mathrm{rank}(A)$, the system is underdetermined:

$$A = U \begin{pmatrix} \Sigma_1 & 0 \end{pmatrix} \begin{pmatrix} V_1^T \\ V_2^T \end{pmatrix}$$
where $V_1^T \in \mathbb{R}^{r\times n}$; thus the minimum-norm solution is:

$$x = V_1\Sigma_1^{-1}U^T b$$
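In the underdetermined case the thin SVD yields $V_1$ directly. A sketch with an arbitrary full-row-rank example, checking both that the solution is exact and that it equals $A^+b$:

```python
import numpy as np

# Underdetermined system: n > m = rank(A); SVD gives the minimum-norm solution.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0]])  # 2x3, rank 2
b = np.array([3.0, 2.0])

# Thin SVD: U is m x r, s holds the diagonal of Sigma1, V1t is r x n.
U, s, V1t = np.linalg.svd(A, full_matrices=False)

# x = V1 Sigma1^{-1} U^T b  (minimum-norm solution among all exact solutions)
x = V1t.T @ np.diag(1.0 / s) @ U.T @ b

print(np.allclose(A @ x, b))                  # x solves the system exactly
print(np.allclose(x, np.linalg.pinv(A) @ b))  # and matches the pseudo-inverse
```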
Reference:
- Matrix SVD decomposition (矩阵SVD分解)