Sure, you can give it a go, but you need to check that you have met the assumptions behind the analysis. This can be done using a range of indicators: the KMO measure and Bartlett's test of sphericity are the most common for checking sampling adequacy. See a text for guidance on these - I'd recommend Andy Field's Discovering Statistics.
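To make the two checks concrete, here is a minimal sketch of Bartlett's test of sphericity and the overall KMO measure computed from scratch with numpy/scipy. The simulated `items` data is purely illustrative (not from any real questionnaire), and this is a bare-bones implementation, not a substitute for the routines in a statistics package.

```python
import numpy as np
from scipy.stats import chi2

def bartlett_sphericity(data):
    """Bartlett's test: H0 is that the correlation matrix is an
    identity matrix (i.e., the variables are uncorrelated)."""
    n, p = data.shape
    R = np.corrcoef(data, rowvar=False)
    statistic = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    df = p * (p - 1) / 2
    return statistic, chi2.sf(statistic, df)

def kmo(data):
    """Overall Kaiser-Meyer-Olkin measure of sampling adequacy:
    ratio of squared correlations to squared correlations plus
    squared partial (anti-image) correlations."""
    R = np.corrcoef(data, rowvar=False)
    inv_R = np.linalg.inv(R)
    d = np.sqrt(np.diag(inv_R))
    partial = -inv_R / np.outer(d, d)   # partial correlations
    np.fill_diagonal(partial, 0.0)
    np.fill_diagonal(R, 0.0)
    r2, q2 = (R ** 2).sum(), (partial ** 2).sum()
    return r2 / (r2 + q2)

# Illustrative data: six items driven by one common factor
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
items = latent + 0.5 * rng.normal(size=(200, 6))

stat, p_value = bartlett_sphericity(items)
overall_kmo = kmo(items)
```

With strongly correlated items like these, Bartlett's p-value should be near zero and the KMO well above the conventional 0.5 floor; values below about 0.5 suggest the sample is not adequate for factoring.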
Following Samantha Curle's suggestion, you can group the data by simplifying the model. If you can group some of the items (i.e., eliminate their loadings on other factors), there will be fewer parameters to estimate. This turns the analysis into a CFA, though depending on why you are doing the EFA the n may still be too small. What are your goals? Are you interested in the latent structure of the covariations, or just doing data reduction to use the factor scores elsewhere?
In my experience you can remove some statements to reduce the data. If Cronbach's alpha increases when those statements are dropped, you can use EFA for a preliminary investigation with the reduced set.
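The item-dropping check above can be sketched as follows: compute Cronbach's alpha for the full item set and again after removing candidate statements, keeping the reduced set only if alpha improves. The data here is simulated for illustration (five items sharing a common factor plus two unrelated "bad" items), not real questionnaire data.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances
    divided by the variance of the total score)."""
    items = np.asarray(items)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Illustrative data: five coherent items plus two pure-noise items
rng = np.random.default_rng(1)
latent = rng.normal(size=(150, 1))
coherent = latent + 0.6 * rng.normal(size=(150, 5))
full_set = np.hstack([coherent, rng.normal(size=(150, 2))])

alpha_full = cronbach_alpha(full_set)
alpha_reduced = cronbach_alpha(coherent)   # after dropping the noise items
```

Here dropping the two unrelated items raises alpha, which is the signal the answer above describes for trimming statements before a preliminary EFA.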
Yes, you can apply EFA to your data. Kline (2011) and Gorsuch (1983) suggest a minimum sample size of 100. Additionally, simulation studies (e.g., MacCallum, Widaman, Zhang, & Hong, 1999) have shown that higher loadings accompanied by high communalities (greater than .6) tend to yield valid factor analyses even with smaller samples.
It depends on how large the correlations are, and hence how large the communalities are. The number of variables per factor also matters: if you have 10 variables per factor you can get away with a smaller sample than if you have (say) four variables per factor. Read MacCallum et al. (1999), who provide evidence to back up these recommendations - which is always better than rules of thumb.