Simply put, an eigenvalue is a measure of the variance explained by one component (or factor). Eigenvalues of a correlation matrix are used in exploratory factor analysis (FA) and exploratory principal components analysis (PCA) to determine the number of factors that can be kept without losing too much information. The question here is: Can the correlation matrix be approximated (reproduced) with fewer factors than the number of variables it contains?
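As a quick illustration, here is a minimal sketch (the 3 x 3 correlation matrix is made up for this example) showing that the eigenvalues of a correlation matrix sum to the number of variables, with the first eigenvalue telling you how much variance the first component captures:

```python
import numpy as np

# a small, hypothetical correlation matrix for 3 variables
R = np.array([[1.0, 0.6, 0.5],
              [0.6, 1.0, 0.4],
              [0.5, 0.4, 1.0]])

eigenvalues = np.linalg.eigvalsh(R)[::-1]  # sorted largest first
print(eigenvalues)        # first value = variance captured by component 1
print(eigenvalues.sum())  # sums to 3, the number of variables
```

Dividing each eigenvalue by the number of variables gives the proportion of total variance that component explains.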
There are different criteria to determine the number of factors in FA or components in PCA, for example the drop in size between successive eigenvalues (scree plot), the absolute size of an eigenvalue (e.g. > 0.0 for a factor, or > 1.0 for a component, the Kaiser criterion), or (presumably the best available method) parallel analysis. A good paper describing the latter (assuming basic knowledge of FA and PCA) is:
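For intuition, a PCA-based parallel analysis can be sketched in a few lines: compare the observed eigenvalues against the 95th percentile of eigenvalues obtained from random data of the same shape, and retain only the components that beat the random benchmark. This is a minimal illustration with simulated toy data (the sample sizes, loadings, and cutoffs here are made-up assumptions, not recommendations):

```python
import numpy as np

rng = np.random.default_rng(0)

def parallel_analysis(data, n_sims=100):
    """Count components whose observed eigenvalues exceed the
    95th percentile of eigenvalues from random normal data."""
    n, k = data.shape
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    sims = np.empty((n_sims, k))
    for i in range(n_sims):
        rand = rng.standard_normal((n, k))
        sims[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(rand, rowvar=False)))[::-1]
    threshold = np.percentile(sims, 95, axis=0)
    return int(np.sum(obs > threshold)), obs, threshold

# toy data: 2 latent factors driving 6 observed variables
latent = rng.standard_normal((500, 2))
loadings = np.array([[.8, 0], [.7, 0], [.6, 0],
                     [0, .8], [0, .7], [0, .6]])
data = latent @ loadings.T + 0.5 * rng.standard_normal((500, 6))

n_keep, obs, thr = parallel_analysis(data)
print(n_keep)  # -> 2 components retained
```

In practice you would use an established implementation (e.g. the `fa.parallel` function in the R package `psych`) rather than a hand-rolled version, since published methods differ in details such as percentile choice and whether random data or resampling is used.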
Crawford, A. V., Green, S. B., Levy, R., Lo, W.-J., Scott, L., Svetina, D., & Thompson, M. S. (2010). Evaluation of parallel analysis methods for determining the number of factors. Educational and Psychological Measurement, 70, 885--901.
If you are looking for a practical introduction into FA and PCA see: https://stats.oarc.ucla.edu/spss/seminars/introduction-to-factor-analysis/a-practical-introduction-to-factor-analysis/
A good paper explaining the decisions to be made in factor analysis is:
Preacher, K. J., & MacCallum, R. C. (2003). Repairing Tom Swift's electric factor analysis machine. Understanding Statistics, 2, 13--43.
For more information about eigenvalues and eigenvectors, see some contributions here (especially by Sanket Tilekar and by Viktor Toth): https://www.quora.com/What-is-the-meaning-of-eigenvalues-in-factor-analysis
or here: https://stats.stackexchange.com/questions/2691/making-sense-of-principal-component-analysis-eigenvectors-eigenvalues
or for some intuitive "understanding" here (my favorite; the graphics are fantastic): http://www.alanfielding.co.uk/multivar/eigen.htm
In addition to the reply by Dirk Enzmann, let me add this note:
In a correlation matrix (the usual starting point for factor analysis and PCA), if all variance is common, then the sum of the eigenvalues of the full set of extracted factors or components will equal k, where k is the number of variables involved.
So, if you had 20 variables and the first extracted factor's eigenvalue was 8.543, that would indicate that the first factor "captured" or "accounted for" about 42.7% (8.543/20) of the variance in the data set, and that the remaining potential 19 factors would collectively explain the other 57.3%. As each successively extracted factor or component necessarily accounts for a smaller portion of the variance, the first factor extracted will always have the highest eigenvalue for your chosen solution (before any rotation occurs, of course).
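The arithmetic above can be checked in a couple of lines (the numbers are the hypothetical ones from the example):

```python
# eigenvalue / k = proportion of total variance explained
k = 20                    # number of variables in the example
first_eigenvalue = 8.543  # eigenvalue of the first extracted factor

share = first_eigenvalue / k  # variance accounted for by factor 1
remaining = 1 - share         # left for the other 19 potential factors

print(f"{share:.1%}")      # -> 42.7%
print(f"{remaining:.1%}")  # -> 57.3%
```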
If this is still mysterious to you, it might be a good idea to consult with someone at your institution who is familiar with the procedure. That might be helpful for better understanding what the results allow you to infer about the structure of your data set.
... then you are not looking at the eigenvalues of the original correlation matrix, and you are probably not running a principal components analysis but (for example) a principal axis or maximum likelihood factor analysis; how eigenvalues are displayed depends on the software you are using. Without knowing the software, the full commands used, and the complete output, it is hard to tell what you are seeing. You should consult the manual of the software you are using.