With multivariate research we come to eigenvalues and eigenvectors
Eigenvalues
Conceptually, an eigenvalue can be thought of as a measure of the strength (relative length) of an axis in N-dimensional space
Derived via eigenanalysis of a square symmetric matrix, in this context the covariance or correlation matrix
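As a rough sketch of how that eigenanalysis can be carried out (not tied to the dataset discussed here; the random data and variable names below are placeholders), using numpy:

    import numpy as np

    # Placeholder data: 100 observations on 6 variables (purely illustrative)
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 6))

    # Correlation matrix (square, symmetric), then its eigenanalysis
    R = np.corrcoef(X, rowvar=False)
    eigenvalues, eigenvectors = np.linalg.eigh(R)  # eigh handles symmetric matrices

    # Sort from largest to smallest eigenvalue
    order = np.argsort(eigenvalues)[::-1]
    eigenvalues = eigenvalues[order]
    eigenvectors = eigenvectors[:, order]

    print(eigenvalues)  # the strength (relative length) of each axis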
Eigenvector
Each eigenvalue has an associated eigenvector. While an eigenvalue is
the length of an axis, the eigenvector determines its orientation in space.
The values in an eigenvector are not unique, because any set of coordinates describing the same orientation would be acceptable (only the direction of the axis is determined, so the vector can be rescaled or sign-flipped).
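That non-uniqueness is easy to check by continuing the hypothetical sketch above (R, eigenvalues and eigenvectors are the names assumed there): rescaling or sign-flipping an eigenvector leaves it an eigenvector with the same eigenvalue.

    v = eigenvalues_check = None  # noqa: placeholder line removed below
    v = eigenvectors[:, 0]   # first eigenvector (returned with unit length)
    lam = eigenvalues[0]

    # R @ (c*v) == lam * (c*v) for any nonzero scalar c,
    # so only the orientation of the axis is pinned down
    for c in (1.0, -1.0, 2.5):
        print(np.allclose(R @ (c * v), lam * (c * v)))  # True each time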
In most cases, a factor whose eigenvalue is less than 1.0 will not be retained for interpretation, unless it is very close to 1.0 or has a readily understood and interesting meaning.
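As a small illustration of that retention rule (the eigenvalue-greater-than-1 cutoff), still using the placeholder names from the sketch above:

    retained = eigenvalues[eigenvalues >= 1.0]
    print(f"kept {retained.size} of {eigenvalues.size} components for interpretation")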
Assessing the variance accounted for
An eigenvalue is an index of the strength of a component, i.e., the amount of variance it accounts for. It is also the sum of the squared loadings for that component.
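One way to see the sum-of-squared-loadings identity, assuming the usual convention that a component's loadings are its unit-length eigenvector scaled by the square root of its eigenvalue (names again from the sketch above):

    # Loadings: each eigenvector column scaled by the square root of its eigenvalue
    loadings = eigenvectors * np.sqrt(eigenvalues)

    # Summing the squared loadings down each column recovers that component's eigenvalue
    print(np.allclose((loadings ** 2).sum(axis=0), eigenvalues))  # True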
In my output, the first three components (out of six) have eigenvalues greater than 1 (ranging from 1.1 to 1.7), and their cumulative proportion of variance is 0.74.
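For a correlation matrix the eigenvalues sum to the number of variables, so with six variables the cumulative proportion is just the running sum of the eigenvalues divided by 6. A sketch with made-up eigenvalues (not the actual values from the output above):

    evals = np.array([1.7, 1.6, 1.1, 0.7, 0.5, 0.4])  # hypothetical; sums to 6
    cumulative = np.cumsum(evals) / evals.sum()
    print(cumulative[:3])  # cumulative proportion after the first three components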