CFA (confirmatory factor analysis) implies that there is a good reason in the first place for each variable to be part of the factor model, so I don't understand why one would abandon variables prior to the CFA rather than assessing how well that model accounts for the observed relationships among the full set of manifest variables.
If you were using EFA and were uncertain about the nature of the factor structure, then guidelines about communality, loading strength, and a host of other decision points could usefully be applied. But not so much in a CFA, unless you find that the proposed model fails in some tangible way(s).
I suspect that the source of such advice is a felt need to maximize variance accounted for, or some other ersatz indicator of "validity." A .50 communality guideline would imply that variable-factor loadings average about .70 or better. Please note that such suggestions are guidelines, not commandments.
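(A quick derivation of that implication, assuming the idealized case of an item with one salient loading and negligible cross-loadings:

$$ h^2 = \lambda^2 \ge .50 \quad\Longrightarrow\quad |\lambda| \ge \sqrt{.50} \approx .71 $$

So a .50 communality floor translates into a minimum loading of roughly .71 for a cleanly loading item.)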
I think about item communalities as indicating how much variance in each item is being explained by all the factors (assuming there is more than one factor in the solution). So if an item loads on just one factor at .70 and has zero loadings on all other factors, then its communality would be .49 (the squared factor loading), and according to the .50 rule it would have to be dropped. However, this looks to me like a great item: it loads strongly on one factor and not on any others, so it's contributing to a nice 'simple structure'. Next let's think about another item that loads .40 on four different factors - this is a terrible item, as it doesn't know what it wants to measure. Yet it would have a communality of .64 (.16 + .16 + .16 + .16), and by the .50 rule it would be hailed as an item to keep, when in fact it's a junk item.
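To make that concrete, here is a minimal sketch in Python (NumPy) that computes communalities for the two hypothetical items above. The loading matrix is invented purely for illustration, and the sum-of-squared-loadings formula assumes orthogonal factors (with correlated factors the calculation differs):

```python
import numpy as np

# Hypothetical pattern of loadings for two items across four factors,
# taken from the example above (illustrative values only).
loadings = np.array([
    [0.70, 0.00, 0.00, 0.00],  # item 1: loads cleanly on one factor
    [0.40, 0.40, 0.40, 0.40],  # item 2: splits across all four factors
])

# Communality = sum of squared loadings across factors
# (valid under orthogonal factors).
communalities = (loadings ** 2).sum(axis=1)
print(communalities)  # [0.49 0.64]

# The .50 rule would drop item 1 (0.49 < .50) and keep item 2 (0.64),
# even though item 1 shows clean simple structure and item 2 does not.
```

The point of the sketch is just that the communality is blind to *where* the variance comes from: one clean loading and four muddy ones can produce similar totals.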
This is a long-winded way of saying that communalities can be useful to have a glance at - if there are any that are close to zero or close to one, then I'd keep an eye on those items when I'm looking at the pattern matrix. But I certainly wouldn't be making any decisions about retaining or dropping items based on the communalities alone. In SPSS this information is at the very top of the output, the first thing you'll see. So I think of it as a brief introduction, an opportunity to say 'Howdy', have a quick look, and get some first impressions - nothing too serious, no commitments!!
So the main point I'm trying to make is that these 'greater than .50', 'lower than whatever' type rules of thumb never really work in a consistent way. They do, however, give me an opportunity to come on RG and say 'Forget the rule of thumb' - which isn't always true, but it's a good rule of thumb.