After doing EFA, the cumulative % of variance explained is 49%. Is anything below 60% unacceptable, or can we use lower values as well? If so, what is the minimum threshold level?
Having a low level of variance accounted for implies that there is considerable random error left in your measures. Since that random error cannot correlate with anything, it will lower the observed correlations with the scale you create.
Two things you can do. First, eliminate items with very low loadings, because you are not accounting for their variance. Of course, you should take theoretical and substantive considerations into account, but if you have variables that aren't performing as you predicted, then it might well be reasonable to eliminate them.
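To make the first suggestion concrete, here is a minimal numpy sketch of screening items by their loadings on the first factor. The data, the 0.40 cutoff, and the principal-axis-style loading computation (eigenvector scaled by the square root of its eigenvalue) are illustrative assumptions, not a prescription; real EFA software applies extraction and rotation choices on top of this.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Hypothetical data: three items load on one latent factor,
# a fourth item is mostly noise (a low-loading candidate for removal).
factor = rng.normal(size=n)
X = np.column_stack([
    factor + rng.normal(scale=0.5, size=n),   # strong item
    factor + rng.normal(scale=0.6, size=n),   # strong item
    factor + rng.normal(scale=0.7, size=n),   # strong item
    rng.normal(size=n),                       # weak item (pure noise)
])

# Loadings on the first factor of the correlation matrix:
# loading = eigenvector * sqrt(eigenvalue).
R = np.corrcoef(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(R)          # eigenvalues in ascending order
loadings = np.abs(eigvecs[:, -1] * np.sqrt(eigvals[-1]))

keep = loadings >= 0.40                       # common rule-of-thumb cutoff
print("loadings:", np.round(loadings, 2))
print("items kept:", np.flatnonzero(keep))
```

Substantive judgment still comes first: an item just under the cutoff that is theoretically central may be worth keeping, while a marginal item with no theoretical role is an easy cut.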
Second, examine the Cronbach's alpha for the scales you are trying to create, because the reliability of a scale is a more direct measure of the extent to which it reflects shared variance rather than random error. In particular, if you have an option to obtain "alpha if item deleted," be sure to check it, because it will show you whether you can raise your reliability by deleting items that are only weakly related to your overall scale.
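Both statistics are easy to compute by hand if your software does not report them. This is a sketch on synthetic data (all values hypothetical): alpha from the standard formula, and "alpha if item deleted" by simply recomputing alpha with each item left out in turn.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(1)
n = 500
trait = rng.normal(size=n)

# Hypothetical scale: three coherent items plus one weakly related item.
X = np.column_stack([
    trait + rng.normal(scale=0.5, size=n),
    trait + rng.normal(scale=0.5, size=n),
    trait + rng.normal(scale=0.6, size=n),
    0.1 * trait + rng.normal(size=n),     # weak item drags alpha down
])

alpha_full = cronbach_alpha(X)
# "Alpha if item deleted": recompute alpha leaving out each item in turn.
alpha_if_deleted = [cronbach_alpha(np.delete(X, j, axis=1))
                    for j in range(X.shape[1])]

print(f"alpha (all items): {alpha_full:.3f}")
for j, a in enumerate(alpha_if_deleted):
    print(f"alpha if item {j} deleted: {a:.3f}")
```

In this toy example, deleting the weak fourth item raises alpha, while deleting any of the three coherent items lowers it, which is exactly the pattern to look for in real output.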
Tahir, I'm not sure what you are researching. The cumulative percentage of variance required is a point of disagreement in factor analysis, particularly across disciplines, for example, the natural sciences, psychology, and the humanities. No fixed threshold exists, although certain percentages have been suggested. In the natural sciences, it is suggested that factor extraction should stop when around 95% of the variance is explained. In the humanities, however, explained variance of less than 60% is common.
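For reference, the cumulative percentage being discussed comes from the eigenvalues of the correlation matrix: each factor's share is its eigenvalue divided by the number of variables. A minimal numpy sketch on synthetic data (the two-factor setup is a hypothetical example):

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 400, 6

# Hypothetical data: two latent factors drive six observed variables.
f1, f2 = rng.normal(size=(2, n))
X = np.column_stack([
    f1 + rng.normal(scale=0.8, size=n),
    f1 + rng.normal(scale=0.8, size=n),
    f1 + rng.normal(scale=0.9, size=n),
    f2 + rng.normal(scale=0.8, size=n),
    f2 + rng.normal(scale=0.9, size=n),
    rng.normal(size=n),                  # noise variable
])

R = np.corrcoef(X, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]   # largest first
pct = eigvals / k * 100                          # each factor's % of variance
cum = np.cumsum(pct)                             # cumulative % of variance
print("cumulative %:", np.round(cum, 1))
```

Reading off where the cumulative percentage crosses 60% (or 95%, depending on your field's convention) is one way these stopping rules are applied, alongside eigenvalue-greater-than-one and scree-plot criteria.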