Is it possible to have a matrix $A$ which is invertible, has a repeated eigenvalue at, say, $1$, and still has linearly independent eigenvectors corresponding to the repeated eigenvalue? I am looking for any example other than the identity matrix.
Of course it's possible. Construct a diagonal matrix $D$ with your eigenvalues and a nonsingular matrix $X$ whose columns have norm $1$, and set $A = XDX^{-1}$. This gives you a diagonalizable matrix whose eigenvectors are the columns of $X$. If you take the columns of $X$ to be orthonormal, that is, $X^*X = I$, you will get a normal matrix.
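For a concrete instance of this construction, one possible choice is $D=\operatorname{diag}(1,1,2)$ together with
$$X=\begin{pmatrix}1&0&1\\0&1&1\\0&0&1\end{pmatrix},\qquad X^{-1}=\begin{pmatrix}1&0&-1\\0&1&-1\\0&0&1\end{pmatrix},$$
which gives
$$A=XDX^{-1}=\begin{pmatrix}1&0&1\\0&1&1\\0&0&2\end{pmatrix}.$$
This $A$ is invertible, is not the identity, and has the eigenvalue $1$ with multiplicity $2$; its eigenvectors (the columns of $X$) are linearly independent: $(1,0,0)^T$ and $(0,1,0)^T$ for the eigenvalue $1$, and $(1,1,1)^T$ for the eigenvalue $2$.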
Of course there are infinitely many such matrices: just take a diagonal matrix with repeated diagonal entries, e.g. $\operatorname{diag}(1,1,2)$, which is invertible, is not the identity, and has the standard basis vectors as eigenvectors. That is why the criterion for diagonalizability of a matrix is stated in terms of the eigenvectors, not the eigenvalues:
Theorem: An $n\times n$ matrix is diagonalizable if and only if it has $n$ linearly independent eigenvectors (i.e. its eigenvectors form a basis of the vector space).
You can learn more on this subject by studying the diagonalization of square matrices (endomorphisms) and the Jordan normal form. There you will find many useful tools, such as the relation between the dimensions of the eigenspaces and the multiplicities of the eigenvalues in the characteristic polynomial and the minimal polynomial of the given matrix, the number of Jordan blocks in the matrix, etc.
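To see why the condition must be placed on the eigenvectors rather than the eigenvalues, one small illustrative example is the Jordan block
$$J=\begin{pmatrix}1&1\\0&1\end{pmatrix}.$$
It has the eigenvalue $1$ with algebraic multiplicity $2$, but its eigenspace $\ker(J-I)$ is only one-dimensional (spanned by $(1,0)^T$), so $J$ is not diagonalizable. By contrast, $\operatorname{diag}(1,1,2)$ has the same repeated eigenvalue $1$ but a full set of linearly independent eigenvectors.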
The multiplicity of an eigenvalue is strongly related to the dimension of the eigenspace it generates. If you get the eigenvalue $-2$, for example, three times (multiplicity $3$), it means that $-2$ generates an eigenspace of dimension $3$. Consequently, the basis of that eigenspace will consist of $3$ linearly independent eigenvectors. So it is possible.
Thank you, David, for the precision. You are right. At the time I only had in mind the goal of expressing that this is possible, and I forgot to state the conditions under which it holds.