Padmaja, T. M., Dhulipalla, N., Bapi, R. S., & Krishna, P. R. (2007, December). Unbalanced data classification using extreme outlier elimination and sampling techniques for fraud detection. In 2007 International Conference on Advanced Computing and Communications (ADCOM 2007) (pp. 511-516). IEEE.
Bhowan, U., Johnston, M., Zhang, M., & Yao, X. (2013). Evolving diverse ensembles using genetic programming for classification with unbalanced data. IEEE Transactions on Evolutionary Computation, 17(3), 368-386.
Sun, Y., Kamel, M. S., Wong, A. K., & Wang, Y. (2007). Cost-sensitive boosting for classification of imbalanced data. Pattern Recognition, 40(12), 3358-3378.
If you use an SVM, you can simply adjust the weight parameter for the positive and negative classes (see, e.g., the MATLAB documentation). In your task, Amount_Pos = 55 and Amount_Neg = 803. You could start from W_Pos = 1 and W_Neg = 0.1, and then select the optimal value of W_Neg by cross-validation, as in the sketch below.
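A minimal sketch of this class-weighted SVM idea in Python/scikit-learn (the answer above refers to MATLAB, but the principle is the same). The arrays X and y, the kernel, the candidate weights, and the F1 scoring choice are illustrative assumptions, not part of the original advice.

```python
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV, StratifiedKFold

# Fix the positive-class weight at 1 and search over the negative-class weight,
# starting around 0.1 as suggested above (candidate values are assumptions).
param_grid = {"class_weight": [{1: 1.0, 0: w} for w in (0.05, 0.1, 0.2, 0.5, 1.0)]}

search = GridSearchCV(
    SVC(kernel="rbf"),
    param_grid,
    scoring="f1",                    # score with respect to the minority (positive) class
    cv=StratifiedKFold(n_splits=5),  # stratified folds keep both classes in every split
)

# X: (858, n_features) feature matrix, y in {0, 1} with 55 positives and 803 negatives
# search.fit(X, y)
# print(search.best_params_)
```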
You can also simply duplicate the minority class several times with some low-amplitude noise added (a basic form of oversampling); see the sketch below. In this case, however, you should adjust how you compute output metrics such as TPR, FPR, precision, F-measure, etc., so that they are evaluated on the original data rather than on the duplicated examples.
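A sketch of the "duplicate the minority class with low noise" idea (jittered oversampling). The helper name, the noise scale, the number of copies, and the train/test split are illustrative assumptions.

```python
import numpy as np
from sklearn.model_selection import train_test_split

def oversample_with_noise(X, y, minority_label=1, n_copies=5, noise_scale=0.01, seed=None):
    """Append n_copies jittered duplicates of the minority class."""
    rng = np.random.default_rng(seed)
    X_min = X[y == minority_label]
    copies = [X_min + rng.normal(0.0, noise_scale, size=X_min.shape) for _ in range(n_copies)]
    X_aug = np.vstack([X] + copies)
    y_aug = np.concatenate([y, np.full(n_copies * len(X_min), minority_label)])
    return X_aug, y_aug

# Important: oversample only the training split; compute TPR, FPR, precision,
# F-measure, etc. on the untouched test split so the metrics are not inflated
# by the duplicated minority examples.
# X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, test_size=0.3)
# X_train_aug, y_train_aug = oversample_with_noise(X_train, y_train)
```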