Jan 29, 2013

SVD for homogeneous least-square problem

References:
  • 3DCV_svd_000.pdf
  • weighted-least-squares-and-locally-weighted-linear-regression
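The homogeneous least-squares problem named in the title is: minimize ||Ax|| subject to ||x|| = 1. The minimizer is the right singular vector of A associated with the smallest singular value. A minimal numpy sketch (my own illustration, not taken from the referenced PDF):

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((10, 4))     # a hypothetical overdetermined system A x ~ 0

    U, s, Vt = np.linalg.svd(A)          # A = U diag(s) V', singular values in descending order
    x = Vt[-1]                           # last row of V' = right singular vector of smallest s

    print(np.linalg.norm(A @ x))         # minimal residual ||Ax||, equals the smallest singular value
    print(np.linalg.norm(x))             # ||x|| = 1 by construction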

 
Updated note (03/10/21):

Relationship with Polar Decomposition

  • A Polar Decomposition factors any (square) matrix A into an orthogonal matrix Q and a symmetric positive semi-definite matrix P.
    • A = Q P 
    • Figure (image not preserved): the effect of applying only P or only Q to the object
    • Q is an orthogonal matrix (rotation/reflection)
      • rotates the object around the origin by some angle, but applies no shear or scaling
    • P is a symmetric positive semi-definite matrix
      • scales along an orthogonal set of axes, but applies no rotation
      • the spectral theorem says P can be decomposed into an orthogonal matrix and a diagonal matrix
      • P = VDV'
      • Geometrically this means P scales (by the non-negative, possibly non-uniform diagonal of D) along an orthogonal set of axes (i.e., the eigenvectors of P, the columns of V).
    • A = Q (VDV')
          = (QV) DV'
          = UDV'

    •  U = QV, so U is also orthogonal, since the product of two orthogonal matrices is another orthogonal matrix.

    •  This is the singular value decomposition. It applies the following three operations in sequence (see the numpy sketch below):
      • rotation/reflection (V') --> axis-aligned scaling (D) --> another rotation/reflection (U)
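A minimal numpy sketch of the decomposition above (my own illustrative code, not from the referenced PDF): take the SVD of A, then form Q = U V' and P = V D V', and verify the polar decomposition A = Q P.

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((3, 3))          # a hypothetical square matrix

    U, s, Vt = np.linalg.svd(A)              # A = U diag(s) V'
    D = np.diag(s)

    Q = U @ Vt                               # orthogonal factor Q = U V'
    P = Vt.T @ D @ Vt                        # symmetric PSD factor P = V D V'

    print(np.allclose(A, Q @ P))             # polar decomposition: A = Q P
    print(np.allclose(Q.T @ Q, np.eye(3)))   # Q is orthogonal
    print(np.allclose(P, P.T))               # P is symmetric

(scipy.linalg.polar computes this same factorization directly.)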

Relationship with Eigen-Decomposition:

  • Eigen-decomposition can be applied when a matrix is diagonalizable; e.g., any symmetric matrix can be diagonalized into an orthogonal set of eigenbasis vectors.
  • A = UDV'

       A'A = (UDV')'(UDV')
           = VD'U'UDV'
           = VD'(U'U)DV'
           = VD'(I)DV'
           = V(D'D)V'
           = VEV'          where E = D'D

 

So the orthonormal basis matrix V ([n x n]) can be estimated by eigen-decomposition of the [n x n] symmetric matrix A'A. The only difference is that the eigenvalue matrix E holds the squares of the singular values of the original matrix A, so the singular values are recovered as the square roots of the diagonal of E.
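A quick numpy check of this relationship (again an illustrative sketch, not from the original post): the eigenvalues of A'A equal the squared singular values of A, and the eigenvectors match the right singular vectors up to sign.

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((6, 4))

    U, s, Vt = np.linalg.svd(A)              # singular values s, right singular vectors as rows of Vt
    evals, evecs = np.linalg.eigh(A.T @ A)   # eigen-decomposition of symmetric A'A (ascending order)

    print(np.allclose(np.sort(s**2), evals)) # eigenvalues of A'A = squared singular values of A

    v_top = Vt[0]                            # right singular vector of the largest singular value
    e_top = evecs[:, -1]                     # eigenvector of the largest eigenvalue
    print(np.allclose(abs(v_top @ e_top), 1.0))  # same basis vector, up to sign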
