Class imbalance is not a big issue in general. It is, however, a particular issue for some algorithms. I would therefore like to suggest some reading: https://pdfs.semanticscholar.org/907b/02c6322d0e7dff6b0201b03e3d2c6bc1d38f.pdf
In imbalanced learning, the usual cost functions are biased toward the class with the most samples, the majority class. Imagine a binary classification problem where 90% of the samples are positive: simply predicting every test sample as positive already gives you 90% accuracy. Cost-sensitive learning assigns a higher cost to errors on the minority class in the loss function to mitigate this issue.
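If it helps, here is a minimal sketch of the cost-sensitive idea using scikit-learn's class_weight parameter (the library, the synthetic data, and the estimator are my own illustration, not from the paper above):

```python
# Minimal sketch of cost-sensitive learning via class weights (scikit-learn).
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import balanced_accuracy_score

# Synthetic data: roughly 90% positive vs. 10% negative, as in the example above.
X, y = make_classification(n_samples=2000, weights=[0.1, 0.9], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# class_weight="balanced" weights each class by n_samples / (n_classes * n_class_samples),
# so mistakes on the minority class cost more in the loss.
clf = LogisticRegression(class_weight="balanced").fit(X_tr, y_tr)
print(balanced_accuracy_score(y_te, clf.predict(X_te)))
```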
Alternatively, sampling techniques can be used, such as SMOTE (synthetic minority oversampling) and RUS (random undersampling).
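For instance, both techniques are available in the imbalanced-learn package; a minimal sketch (my own example, so treat the exact API as something to verify against the package documentation):

```python
# Minimal sketch of SMOTE and RUS with imbalanced-learn (pip install imbalanced-learn).
from collections import Counter
from sklearn.datasets import make_classification
from imblearn.over_sampling import SMOTE
from imblearn.under_sampling import RandomUnderSampler

X, y = make_classification(n_samples=2000, weights=[0.1, 0.9], random_state=0)
print("original:", Counter(y))

# SMOTE synthesizes new minority samples by interpolating between nearest neighbours.
X_sm, y_sm = SMOTE(random_state=0).fit_resample(X, y)
print("after SMOTE:", Counter(y_sm))

# RUS instead randomly discards majority samples until the classes are balanced.
X_rus, y_rus = RandomUnderSampler(random_state=0).fit_resample(X, y)
print("after RUS:", Counter(y_rus))
```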
If you are interested in multi-objective optimization for imbalanced learning, please take a look at my recent paper: "Ensemble learning by means of a multi-objective optimization..."
Another strategy to handle the class imbalance problem, instead of using sampling techniques, is to modify your decision threshold, i.e. the probability above which a case is classified into the minority class.
This value is then not the default of 50% but is calculated automatically from your data, and it generally reflects the proportion of the minority class.
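A minimal sketch of this threshold-moving idea (the estimator, the metric, and the use of the training-set minority proportion as the threshold are my own illustrative choices):

```python
# Minimal sketch of threshold moving: predict the minority class when its
# probability exceeds the minority-class proportion rather than the default 0.5.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

# Class 1 is the minority class here (~10% of samples).
X, y = make_classification(n_samples=2000, weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

clf = LogisticRegression().fit(X_tr, y_tr)
proba = clf.predict_proba(X_te)[:, 1]   # P(class = 1) for each test case

threshold = y_tr.mean()                 # minority-class proportion, ~0.1
print("default 0.5 :", f1_score(y_te, (proba >= 0.5).astype(int)))
print("moved thresh:", f1_score(y_te, (proba >= threshold).astype(int)))
```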