The way I've been doing it is to use the Singular Value Decomposition, so that M = U * S * V.transpose, and then take R = U * V.transpose. Essentially this replaces the singular values with ones, although I do not know exactly what that operation means.
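Concretely, here is a minimal NumPy sketch of what I mean by the SVD route (the function name nearest_rotation_svd is just mine for illustration):

import numpy as np

def nearest_rotation_svd(M):
    # Decompose M = U * diag(S) * Vt, then drop the singular values: R = U * Vt
    U, S, Vt = np.linalg.svd(M)      # NumPy returns V already transposed
    return U @ Vt                    # orthogonal; det can come out as +1 or -1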

However, if I follow the Gram-Schmidt process, I also get an orthonormal matrix that looks like a rotation matrix, but it is a different matrix from the one I get with the SVD.
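By Gram-Schmidt I mean orthonormalizing the columns of M from left to right, roughly like this sketch (classical Gram-Schmidt written out by hand, not a library QR call):

import numpy as np

def gram_schmidt(M):
    # Orthonormalize the columns of M from left to right (classical Gram-Schmidt)
    M = np.asarray(M, dtype=float)
    Q = np.zeros_like(M)
    for j in range(M.shape[1]):
        v = M[:, j].copy()
        for i in range(j):
            v -= (Q[:, i] @ M[:, j]) * Q[:, i]   # subtract the projection onto earlier columns
        Q[:, j] = v / np.linalg.norm(v)          # normalize (assumes the residual is nonzero)
    return Q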

The questions are: why? And, more importantly, what makes one method right and the other wrong?

Example:

M = 1 2 3
    4 5 6
    7 8 9

Use the SVD method to get:

R = -0.419386, -0.277519, 0.864349
    -0.277519,  0.945739, 0.168998
     0.864349,  0.168998, 0.473647

But use Gram-Schmidt and I get:

R = 0.123091,  0.904534,  0.408248
    0.492366,  0.301511, -0.816497
    0.86164,  -0.301511,  0.408248

Both R matrices are orthonormal and appear to be good rotation matrices, but they are different matrices. One method has to be correct, but which one, and why? I suppose both could be incorrect, but then what is the correct method for this conversion?
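For reference, this is a small check comparing the two on the example above, assuming the two sketch functions from earlier are pasted into the same file:

import numpy as np

M = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [7., 8., 9.]])

R_svd = nearest_rotation_svd(M)   # SVD route from the first sketch
R_gs  = gram_schmidt(M)           # Gram-Schmidt route from the second sketch

print(R_svd)
print(R_gs)
print(np.linalg.det(R_svd), np.linalg.det(R_gs))   # a proper rotation matrix has det = +1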
