The Kaiser criterion suggests retaining those factors with eigenvalues greater than or equal to 1. Then rotate the factor loadings; just type rotate to get a final solution. Note that the default rotation is varimax, which produces orthogonal factors (factors that are not correlated); it is recommended when you want to identify variables to create indexes or new variables without inter-correlated components. For the opposite, an oblique solution in which factors are allowed to correlate, type rotate, promax instead.
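A minimal sketch of that workflow in Stata, assuming hypothetical items v1-v10:

    factor v1-v10, pf mineigen(1)   // principal factoring; keep eigenvalues >= 1 (Kaiser)
    rotate                          // default: orthogonal varimax rotation
    rotate, promax                  // oblique promax instead, if factors may correlate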
When factors are estimated by maximum likelihood (ML), there may be more than one local maximum. Multiple maxima are especially likely when there is more than one group of variables. Set the random-number seed to ensure that your results are reproducible.
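For example (the seed value below is arbitrary; the point is only that it is fixed):

    set seed 12345                  // any fixed seed makes the run reproducible
    factor v1-v10, ml factors(3)    // maximum likelihood extraction

If you are worried about local maxima, ml also accepts a protect() option that reruns the optimization from several random starting values and reports the best solution, e.g. factor v1-v10, ml factors(3) protect(50).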
The question, as I understand it, is how one can assign individual items to factors. Once that is done, it is possible to create an index for each respondent on each latent factor. In exploratory factor analysis, a common rule of thumb is to treat an absolute loading of |.40| as the cutoff for deciding whether an item is relevant to a particular factor. Typically, the majority of items load on only one factor; they are good measures of the underlying "latent factor." But often a few items will load on more than one factor. It is always interesting to contemplate why these items represent both factors, which in theory are supposed to be independent (orthogonal, uncorrelated). In practice, it is common to discard items that do not load cleanly on a single factor, because usually the researcher wants to create an index score for each respondent on each factor. Ideally, each cluster of items should represent a single idea, a latent factor, rather than a mishmash of different ideas.
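As a sketch of that last step, suppose (hypothetically) that items v1, v2, v3, and v5 all load above |.40| on the first factor and on nothing else. In Stata one might check the cluster's internal consistency and then average the items into an index (reverse-code any negatively loading items first):

    alpha v1 v2 v3 v5, item              // Cronbach's alpha for the item cluster
    egen index1 = rowmean(v1 v2 v3 v5)   // one index score per respondent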
If the researcher wants to avoid these judgments about which items belong to which factor, they can use a "factor score" approach. That is, every item is weighted according to how strongly it represents the latent factor. For each respondent, the output consists of one factor score per factor; these scores are the variables that go into the analysis of group differences.
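In Stata this is a one-liner after extraction and rotation; the score variable names below are made up:

    factor v1-v10, ml factors(2)
    rotate
    predict fs1 fs2                 // regression-method factor scores, one variable per factor

(predict after factor uses regression scoring by default; add the bartlett option if you prefer Bartlett scores.)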
As Pradeep recommended, after extracting the factors it is wise to do a Kaiser varimax rotation of the results. Use the rotated loadings to figure out which items cluster together and represent one concept. The rotation does not alter the relationships among the items; rather, it rotates the axes to make the relationships between items and factors easier to see. The idea is that an item should load strongly on one factor and minimally on all others. To see the impact of rotation, consider two factors at a time: plot the items in this two-dimensional space using the original factor loadings, and then again using the rotated loadings. The latter should be easier to interpret; items should cluster tightly around one factor or the other.
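Stata's loadingplot postestimation command draws exactly this picture; a sketch, again with hypothetical items:

    factor v1-v10, ml factors(2)
    loadingplot                     // unrotated loadings in the factor-1 x factor-2 plane
    rotate
    loadingplot                     // rotated loadings; item clusters should now hug the axes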
If you are asking what threshold of variable-factor correlation ("loading") one should use to declare a variable salient for a factor, there is no fixed threshold. You can find publications with values as low as |.15| (Raymond Cattell), through values like |.30| (selected, I think, because it's close to 10% shared variation), |.40|, |.50|, or even higher (people who worry a lot about average variance extracted or other comparable indicators may suggest thresholds of |.70| or more). Henry Kaiser once suggested a rolling threshold, namely the RMS loading for a factor (I've never seen this used in practice!).
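If you wanted to try Kaiser's rolling threshold, it is easy to compute from the stored loading matrix. A sketch, run right after extraction (e(L) holds the unrotated loadings):

    factor v1-v10, ml factors(2)
    mata: sqrt(mean(st_matrix("e(L)"):^2))   // RMS loading of each retained factor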
What will happen as you set the threshold higher is: (a) fewer variables will tend to be retained as having made the "cut"; (b) you are more likely to have latent variables with fewer salient variables (and therefore possibly narrower in focus than conceptualized); and (c) the resulting internal consistency reliability estimates will tend to be biased upwards.
I disagree with a couple of points made by earlier respondents to this thread (though these weren't really addressing your query, as I understood it):
1. The eigenvalue > 1 criterion is probably one of the worst standards to use in choosing how many factors to retain in a solution. Both the minimum average partial (MAP) correlation method and parallel analysis have been shown to be superior for this purpose. If there is a theoretical basis for an a priori decision about the number of factors, then that's a different case entirely (and you should probably be looking at confirmatory, rather than exploratory, factoring).
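Parallel analysis isn't built into official Stata, but the user-written paran command (by Alexis Dinno, on SSC) implements Horn's procedure; a sketch, assuming the same hypothetical items and with option names worth double-checking in its help file:

    ssc install paran
    paran v1-v10, iterations(5000)  // retain factors whose eigenvalues exceed those from random data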
2. Unless you're working with physical measurements (or the like), oblique rotation is far more likely to be a realistic way to evaluate relationships among factors than orthogonal rotation. If the estimated correlations turn out to be of negligible magnitude, then, yes: orthogonal solutions are simpler.
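In Stata you can inspect those estimated correlations directly after an oblique rotation:

    rotate, promax                  // oblique rotation
    estat common                    // correlation matrix of the common factors

If the off-diagonal entries there are negligible, falling back to an orthogonal varimax solution costs you little.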