Hi,

I know that some Support Vector Machine approaches, and other machine learning approaches, reduce the number of samples taken from the training set in order to reduce the computational run-time. This method can work very well on large training sets whose instances can be represented by a small portion (a small sample) of the data. However, it will not perform as well on training sets that have a lot of variation among their instances.
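To make the idea concrete, here is a minimal sketch of the subsampling approach described above, using a hypothetical toy training set (the data and the 5% sampling fraction are illustrative assumptions, not from any specific SVM library):

```python
import random

# Hypothetical training set of (features, label) pairs.
training_set = [([i, 2 * i], i % 2) for i in range(10000)]

# Subsampling: train only on a small random fraction of the data
# to cut run-time. This works well only when the subset is
# representative of the full training set.
random.seed(0)
fraction = 0.05  # illustrative choice
subset = random.sample(training_set, k=int(fraction * len(training_set)))

print(len(subset))  # a learner would now be fit on these 500 examples
```

If the full set contains rare but important variations, a random subset of this kind can easily miss them, which is exactly the weakness noted above.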

Please, is there any machine learning method that reduces the computational run-time while still involving all samples in the learning process?

Thanks

Osman
