If we think of a matrix as a transformation, then in simple terms an eigenvalue is the strength of that transformation in a particular direction, known as an eigenvector.
I generally think of eigenvectors as a natural basis (orthogonal for Hermitian operators) of the input and output spaces. The eigenvalues can be thought of as (complex) "gains" for the eigenvectors.
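The "gain" picture above can be checked directly in a couple of lines. Here is a minimal sketch (the matrix is a made-up example) showing that applying a symmetric matrix to one of its eigenvectors just scales it by the corresponding eigenvalue:

```python
import numpy as np

# A symmetric (Hermitian) matrix: its eigenvectors form an orthogonal basis.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh is the NumPy routine for symmetric/Hermitian matrices;
# it returns eigenvalues in ascending order.
eigvals, eigvecs = np.linalg.eigh(A)

# Applying A to an eigenvector only scales it by its eigenvalue ("gain").
v = eigvecs[:, 0]
print(np.allclose(A @ v, eigvals[0] * v))  # True
```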
In data analysis, one usually computes the eigenvectors of a covariance (or correlation) matrix. The eigenvectors are the set of basis functions that describe the data variability most efficiently. They also define the coordinate system in which the covariance matrix becomes diagonal, so the new variables referenced to this coordinate system are uncorrelated. The eigenvalues measure the data variance explained by each of the new coordinate axes. They are used to reduce the dimension of large data sets, by keeping only the few modes with significant eigenvalues, and to find new variables that are uncorrelated; this is very helpful for least-squares regressions of badly conditioned systems. It should be noted that the link between these statistical modes and the true dynamical modes of a system is not always straightforward because of sampling problems. cheers, arthur
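The decorrelation property is easy to demonstrate numerically. A minimal sketch with synthetic correlated data (the data set is invented for illustration): projecting onto the eigenvectors of the sample covariance matrix yields new variables whose covariance matrix is diagonal, with the eigenvalues on the diagonal.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two correlated variables (hypothetical data, purely illustrative).
x = rng.normal(size=1000)
data = np.column_stack([x, 0.8 * x + 0.3 * rng.normal(size=1000)])

cov = np.cov(data, rowvar=False)            # 2x2 sample covariance
eigvals, eigvecs = np.linalg.eigh(cov)      # ascending eigenvalues

# Project the data onto the eigenvector basis: the new variables are
# uncorrelated, and their covariance matrix is diag(eigvals).
projected = data @ eigvecs
new_cov = np.cov(projected, rowvar=False)
print(np.round(new_cov, 3))                 # off-diagonals are ~0
```

Dimension reduction then amounts to keeping only the columns of `eigvecs` whose eigenvalues are large.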
Consider a rubber band and stretch it at the two ends: the directions along which the stretching deforms it are the eigenvectors, and the amount of stretch in each of those directions is the corresponding eigenvalue.
With respect to signal processing, eigendecompositions are usually applied to covariance matrices. In this case, the eigenvectors are vectors that point in the direction of the largest variance, while the eigenvalues define the magnitude of this variance.
This can be understood intuitively by considering the fact that a covariance matrix is nothing more than the square of a linear transformation matrix, i.e. a rotation and a scaling of the data. The rotation is defined by the eigenvectors, while the scale is defined by the eigenvalues.
A nice explanation of this concept can be found here: http://www.visiondummy.com/2014/04/geometric-interpretation-covariance-matrix/
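The "covariance is the square of a linear transformation" idea can be sketched as follows (the rotation angle and scales are made-up values): if white, unit-variance data are transformed by a rotation R and a scaling S, the resulting covariance is C = (RS)(RS)^T, whose eigenvectors are the columns of R and whose eigenvalues are the squared scales.

```python
import numpy as np

theta = np.pi / 6                                 # hypothetical rotation angle
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation
S = np.diag([3.0, 1.0])                           # scaling (hypothetical)

T = R @ S                  # the linear transformation applied to white data
C = T @ T.T                # covariance of the transformed data

eigvals, eigvecs = np.linalg.eigh(C)
print(np.sort(eigvals))    # the squared scales: [1., 9.]

# The dominant eigenvector is (up to sign) the first column of R,
# i.e. the rotated direction of largest variance.
print(abs(eigvecs[:, 1] @ R[:, 0]))   # ~1.0
```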
Considering the particular example of a transformation applied to an image, the directions of the transformation are given by the eigenvectors, and the amount of "expansion" or "compression" along those directions is given by the corresponding eigenvalues (magnitudes greater than one expand, less than one compress).
Eigenvalues and eigenvectors have major significance in stress-strain analysis and in vibration-related problems.
Eigenvalues are independent of the coordinate system. This feature makes our life easy in stress-strain calculations: when we express a stress or strain problem in the form of a characteristic equation (an eigenvalue problem), the eigenvalues give us invariants, the principal stresses or strains, which do not depend on the choice of coordinates, and the corresponding directions in which they act are the eigenvectors (the principal directions).
In vibration problems, when we solve the mass-stiffness relation as an eigenvalue problem, the eigenvalues give us the natural frequencies of the system, which are likewise independent of the coordinate system, and the corresponding eigenvectors are the mode shapes, i.e. the relative amplitudes of vibration.
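A minimal sketch of the vibration case, using an invented 2-degree-of-freedom spring-mass chain (all masses and stiffnesses are hypothetical values): the generalized eigenproblem K v = w² M v yields the squared natural frequencies as eigenvalues and the mode shapes as eigenvectors. With a diagonal mass matrix it reduces to a standard symmetric eigenproblem via M^(-1/2).

```python
import numpy as np

# Hypothetical 2-DOF spring-mass chain (unit masses and stiffnesses).
M = np.diag([1.0, 1.0])                    # mass matrix
K = np.array([[ 2.0, -1.0],
              [-1.0,  2.0]])               # stiffness matrix

# Reduce K v = w^2 M v to a standard symmetric eigenproblem:
# since M is diagonal, M^(-1/2) is just 1/sqrt of its diagonal.
Mi = np.diag(1.0 / np.sqrt(np.diag(M)))
w2, modes = np.linalg.eigh(Mi @ K @ Mi)    # eigenvalues = squared frequencies

freqs = np.sqrt(w2)                        # natural frequencies (rad/s)
print(freqs)                               # coordinate-independent
mode_shapes = Mi @ modes                   # eigenvectors: vibration mode shapes
```

The first mode (both masses moving in phase) has the lower frequency; the second (out of phase) the higher one, which matches physical intuition for this chain.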