This approach persists even though there are published papers offering alternative procedures, which are more useful because they focus on the variance of the construct rather than on error variance, as PCA does.
I would highly recommend the chapter below on the use of the Kaiser criterion and other general issues in factor extraction using PCA. The authors view the Kaiser rule as arbitrary and unsuitable for deciding how many factors to extract: it often results in over-extraction of components, and they recommend a parallel analysis procedure to calculate an eigenvalue cut-off point for a given analysis (a rough sketch follows the reference).
Velicer, W. F., Eaton, C. A., & Fava, J. L. (2000). Chapter 3: Construct explication through factor or component analysis: A review and evaluation of alternative procedures for determining the number of factors or components. In R. D. Goffin & E. Helmes (Eds.), Problems and Solutions in Human Assessment: Honoring Douglas N. Jackson at Seventy (1st ed., pp. 41-71). New York: Springer-Verlag.
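For anyone who wants to experiment: here is a minimal sketch of Horn-style parallel analysis in plain Python/numpy. The function names are my own, and this only illustrates the general procedure the chapter evaluates, not the authors' code.

```python
import numpy as np

def parallel_analysis_cutoffs(X, n_sims=1000, percentile=95, seed=0):
    """Eigenvalue cut-offs from random data (Horn's parallel analysis).

    For each simulation, draw uncorrelated normal data with the same
    shape as X, compute the eigenvalues of its correlation matrix, and
    take the given percentile at each eigenvalue rank.
    """
    rng = np.random.default_rng(seed)
    n_obs, n_items = X.shape
    sim_eigs = np.empty((n_sims, n_items))
    for i in range(n_sims):
        Z = rng.standard_normal((n_obs, n_items))
        sim_eigs[i] = np.linalg.eigvalsh(np.corrcoef(Z, rowvar=False))[::-1]
    return np.percentile(sim_eigs, percentile, axis=0)

def n_components_to_retain(X, **kwargs):
    """Retain components whose observed eigenvalue beats the random cut-off."""
    obs = np.linalg.eigvalsh(np.corrcoef(X, rowvar=False))[::-1]
    return int(np.sum(obs > parallel_analysis_cutoffs(X, **kwargs)))
```

The point of the procedure is that each observed eigenvalue is compared against what random noise of the same dimensions would produce, which replaces the fixed cut-off of 1 with one calibrated to the sample size and number of items.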
I think Jonathan has 'hit the nail on the head', and the fact that PCA is the default in SPSS just compounds the problem!
The only reason I know anything about this at all is because one of our MSc lecturers is a psychometrician and was particularly annoyed by this very issue - thankfully.
Yes, both the SPSS default and Andy Field's book use PCA, so I think that goes some way to explaining why this approach is so widely adopted.
Regarding the Kaiser criterion: personally, I think it best to use a number of approaches to identify a range of potential solutions and then examine the pattern matrices etc.
There are some quite innovative ways to find an optimal solution in the free program FACTOR, such as using polychoric correlations (to account for the categorical nature of Likert-scale items) and subsequent parallel analysis based on factors (rather than components) and on polychoric correlations (if suitable); see the sketch below. I've also used the Hull method (paper below) to help me decide on a sensible range of factor solutions to examine further.
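FACTOR handles the polychoric estimation internally, and computing polychoric correlations by hand is involved, so the sketch below (my own naming, with a Pearson correlation matrix standing in where FACTOR would use a polychoric one) only illustrates the factor-based half of the idea: the eigenvalues are taken from the reduced correlation matrix, with squared multiple correlations on the diagonal as communality estimates, rather than from the full matrix as in the component-based version above.

```python
import numpy as np

def reduced_correlation_eigenvalues(R):
    """Eigenvalues used in factor-based (not component-based) parallel
    analysis: the diagonal of the correlation matrix R is replaced by
    squared multiple correlations (SMCs), a standard communality estimate.

    R is assumed nonsingular; Pearson here, polychoric where appropriate.
    Some eigenvalues of the reduced matrix may be negative; that is normal.
    """
    R = np.asarray(R, dtype=float)
    smc = 1.0 - 1.0 / np.diag(np.linalg.inv(R))
    R_reduced = R.copy()
    np.fill_diagonal(R_reduced, smc)
    return np.linalg.eigvalsh(R_reduced)[::-1]
```

Comparing these against the same quantity computed on random data gives the factor-analysis analogue of the component-based cut-offs.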
I completely agree with all previous contributions. Another point: even if there are no latent factors (i.e., the items are uncorrelated), you can find a principal component apparently explaining a large amount of variance which merely bundles together test-specific variance (e.g., longer questions take more time to answer than shorter questions in a survey...). And isn't that something every researcher is glad about? Finding a factor with simple structure...
One could indeed ask what the right reasoning would be, although that was not your question... In PCA, if every item correlated only with itself, then every factor/component you find would represent a single item (explaining itself), and each factor/component would have an eigenvalue of exactly one (in PCA, an eigenvalue of one means: explaining variance equal to one item). Now let the correlations vary by random error, and you will obtain quite a number of factors/components with eigenvalues above 1. Again, how nice to find factors when using the Kaiser criterion! (It is good for publication.)
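That over-extraction is easy to see in a simulation: with purely random, uncorrelated data, a sizeable share of sample eigenvalues lands above 1 from sampling error alone. A quick numpy check (the exact count varies with sample size and item count):

```python
import numpy as np

rng = np.random.default_rng(42)
n_obs, n_items = 200, 30

# Uncorrelated data: every population eigenvalue is exactly 1.
Z = rng.standard_normal((n_obs, n_items))
eigs = np.linalg.eigvalsh(np.corrcoef(Z, rowvar=False))

# Sampling error alone pushes many sample eigenvalues above 1, so the
# Kaiser rule "finds" components where none exist.
print(np.sum(eigs > 1))  # typically around 13 or 14 of the 30 here
```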
And rotation: the misunderstanding here is simple but revealing. PCA is by definition already rotated... When you see someone rotating the dimensions obtained in a PCA: do - not - trust - him/her. :)
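A small demonstration of that point, using a plain-numpy varimax of my own: rotating PCA loadings redistributes the explained variance across the axes, so the rotated axes no longer maximize variance in sequence, i.e. they are no longer principal components.

```python
import numpy as np

def varimax(L, max_iter=100, tol=1e-6):
    """Varimax rotation of a loading matrix L (items x components)."""
    p, k = L.shape
    R, d = np.eye(k), 0.0
    for _ in range(max_iter):
        Lr = L @ R
        u, s, vt = np.linalg.svd(
            L.T @ (Lr ** 3 - Lr @ np.diag(np.sum(Lr ** 2, axis=0)) / p)
        )
        R = u @ vt
        if s.sum() < d * (1 + tol):
            break
        d = s.sum()
    return L @ R

rng = np.random.default_rng(1)
X = rng.standard_normal((300, 8))
eigvals, eigvecs = np.linalg.eigh(np.corrcoef(X, rowvar=False))
order = np.argsort(eigvals)[::-1][:3]
loadings = eigvecs[:, order] * np.sqrt(eigvals[order])  # first 3 PCA loadings

# Sum of squared loadings per column = variance explained by each axis.
print(np.sum(loadings ** 2, axis=0))            # strictly decreasing
print(np.sum(varimax(loadings) ** 2, axis=0))   # redistributed, no longer "principal"
```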
I have two hypotheses: 1) in addition to the statistics, there are deductive criteria that help in choosing factors and items, which may be why the Kaiser rule plus a deductive argument is considered enough to give a good, reliable explanation of the phenomena; 2) there must be a balance between statistical complexity and rigour on the one hand and pragmatic answers on the other... Statistics for statistics' sake can be dangerous...