Thanks everybody. I used the pseudoinverse and compared the result with the exact solution. Unfortunately the relative L2-norm error exceeds 50%. What I have done is:
Now, pseudoinv(inv(B)) = B if B is invertible. Is it? Did you check? And above all, does trans(A) satisfy the orthonormality condition among its columns? Did you check that?
How could it, since A is not square? The columns (and, due to the second application of the product, the rows) of A must be orthonormal vectors. Are they? If not, the above property of the pseudoinverse cannot be applied and you must look for something else.
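A quick sanity check for the orthonormality condition discussed above is to test whether A^T A is close to the identity. A minimal NumPy sketch (the matrix A here is an illustrative placeholder, not the poster's actual data):

```python
import numpy as np

# Illustrative tall, non-square matrix standing in for the A in this thread
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))

# Columns of A are orthonormal iff the Gram matrix A^T A equals the identity
gram = A.T @ A
print(np.allclose(gram, np.eye(A.shape[1]), atol=1e-10))  # random columns: False

# By contrast, the Q factor from a QR decomposition has orthonormal columns
Q, _ = np.linalg.qr(A)
print(np.allclose(Q.T @ Q, np.eye(Q.shape[1])))  # True
```

If this check fails for A (or for trans(A), whichever appears in your formula), the identity pinv(A) = trans(A) does not hold and the pseudoinverse-based shortcut is invalid.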
You rarely need to explicitly evaluate the inverse of a matrix. Have you considered the problem B*x_{i} = a_{i}, where a_{i} is the ith column of A and x_{i} is the ith column of some solution matrix X? This takes the form of a linear least squares problem BX = A. By solving for X this way, you 1) never actually invert B, because you solve for B^{-1}*A directly, and 2) gain numerical stability, since the LLS algorithm is more stable and less expensive than explicit inversion.