Is there a detailed proof of how a matrix inverse is derived, and of why a matrix multiplied by its inverse yields the identity matrix?

If row reduction to reduced row echelon form is performed on a matrix A augmented with an identity matrix U (i.e. on [A | U]), then by the time A has been transformed into the identity matrix I, the block U has been transformed into the inverse of A, namely A⁻¹. It's somewhat amazing. Afterwards A × A⁻¹ = I. Is there a clear proof of how A⁻¹ is derived this way, and of why A × A⁻¹ = I holds?
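For concreteness, here is a minimal sketch in Python of the procedure described above: row-reduce the augmented matrix [A | I] until the left block becomes I, then read the inverse off the right block, and check numerically that A × A⁻¹ = I. The function name `inverse_via_gauss_jordan` and the 2×2 test matrix are illustrative choices, not something from the original question; the sketch assumes A is square and invertible.

```python
import numpy as np

def inverse_via_gauss_jordan(A):
    """Invert A by row-reducing the augmented matrix [A | I] to [I | A^-1].

    A minimal sketch of the procedure described in the question; assumes A
    is square and invertible (a nonzero pivot is found in every column).
    """
    A = np.array(A, dtype=float)
    n = A.shape[0]
    # Augment A with the identity matrix: [A | I]
    aug = np.hstack([A, np.eye(n)])

    for col in range(n):
        # Partial pivoting: bring the row with the largest entry in this
        # column up to the pivot position to avoid dividing by (near) zero.
        pivot_row = col + np.argmax(np.abs(aug[col:, col]))
        if np.isclose(aug[pivot_row, col], 0.0):
            raise ValueError("Matrix is singular; no inverse exists.")
        aug[[col, pivot_row]] = aug[[pivot_row, col]]

        # Scale the pivot row so the pivot entry becomes 1.
        aug[col] /= aug[col, col]

        # Eliminate the pivot column from every other row.
        for row in range(n):
            if row != col:
                aug[row] -= aug[row, col] * aug[col]

    # The left block is now I, so the right block is A^-1.
    return aug[:, n:]

A = np.array([[2.0, 1.0],
              [5.0, 3.0]])
A_inv = inverse_via_gauss_jordan(A)
print(A_inv)                               # [[ 3. -1.], [-5.  2.]]
print(np.allclose(A @ A_inv, np.eye(2)))   # True: A x A^-1 = I
```

Each elementary row operation corresponds to multiplying on the left by an elementary matrix, which is why the same sequence of operations that turns A into I turns I into A⁻¹.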
