Such an arbitrary rule about correlations would not make sense since some or all of the factors may be uncorrelated. As a consequence, there could be sets of observed variables in a correlation matrix that are fully or nearly uncorrelated, and factor analysis could still make sense (as long as some of the other correlations are substantial).
One situation where it would not be meaningful to run a factor analysis is when all observed variables are uncorrelated in the population (i.e., when there are only chance correlations in your sample data). You can test this formally by fitting an independence (null) model to your sample data and looking at the chi-square test of model fit. If the chi-square value for the independence model is non-significant (i.e., if the independence model is not rejected for the data), then factor analysis would not make sense, because this would indicate that there are no substantial associations to model in the data. An independence model chi-square is available in many programs for structural equation modeling, such as Mplus and lavaan.
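To make this concrete, the same idea can be sketched outside an SEM program. Bartlett's test of sphericity tests exactly this null hypothesis: that the population correlation matrix is an identity matrix, i.e. that all observed variables are uncorrelated. This is a hand-rolled Python sketch of that test (the chi-square statistic is computed from the determinant of the sample correlation matrix); in practice you would use Mplus or lavaan as described above.

```python
# Bartlett's test of sphericity: tests H0 that the population correlation
# matrix is an identity matrix, i.e. that all observed variables are
# uncorrelated (the "independence"/null model). If H0 is NOT rejected,
# there is little evidence of associations to model with factor analysis.
import numpy as np
from scipy import stats

def bartlett_sphericity(data):
    """Return (chi2, df, p) for Bartlett's test on an (n, p) data matrix."""
    n, p = data.shape
    R = np.corrcoef(data, rowvar=False)            # sample correlation matrix
    chi2 = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    df = p * (p - 1) // 2                          # number of correlations
    p_value = stats.chi2.sf(chi2, df)
    return chi2, df, p_value

# Example: pure noise, so the test should typically be non-significant
rng = np.random.default_rng(1)
noise = rng.normal(size=(100, 5))
chi2, df, p = bartlett_sphericity(noise)
print(f"chi2 = {chi2:.2f}, df = {df}, p = {p:.3f}")
```

For data generated with a real common factor, the same function returns a very large chi-square and a p-value near zero, which is the situation where factor analysis does make sense.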
Just to provide example data for Christian Geiser's excellent answer: here are data with clearly three factors and not a single correlation above 0 between indicators of different factors (I searched for a seed so that all the small cross-factor correlations came out negative). I did this in the free software R, so anyone can replicate it.
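The same kind of demonstration can be sketched in Python (the original demo was in R; the loadings, sample size, and seed here are my own illustrative choices, without the seed search for all-negative cross-factor correlations). Three population factors are mutually uncorrelated, each measured by three indicators, so within-factor correlations come out substantial while between-factor correlations hover near zero:

```python
# Simulate three uncorrelated factors, each with three indicators.
# Within-factor correlations are substantial; between-factor correlations
# are near zero, yet factor analysis on these data is perfectly sensible.
import numpy as np

rng = np.random.default_rng(42)
n = 1000
factors = rng.normal(size=(n, 3))                # three independent factors
loadings = np.zeros((3, 9))
for f in range(3):                               # each factor loads on 3 items
    loadings[f, 3 * f:3 * f + 3] = 0.8
data = factors @ loadings + 0.6 * rng.normal(size=(n, 9))
R = np.corrcoef(data, rowvar=False)

within = R[0, 1]                                 # two indicators of factor 1
between = R[0, 3]                                # indicators of factors 1 and 2
print(f"within-factor r = {within:.2f}, between-factor r = {between:.2f}")
```

With these loadings the population within-factor correlation is 0.64 (0.8 x 0.8 over unit variance), while the population between-factor correlations are exactly zero, which is the pattern the answer above describes.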