You will have to study your dataset to determine the appropriate algorithm to use, but as stated, Bayesian optimization is usually the fastest, while genetic or graph-based algorithms tend to find the most optimal configurations.
Classification is itself an optimization problem. To understand such tasks, I suggest you read the attached book, which covers all your current queries.
I think you are talking about hyperparameter optimization, which involves exploring a search space for the hyperparameter values of a classification tree that optimize model performance. It is a tuning process that essentially aims to minimize impurity (and thereby maximize purity) in the tree's splits. First, identify the parameters and hyperparameters of the classification tree that need tuning. Optimization can then be done with either a manual or an automatic approach. The manual method is done by hand, more or less a trial-and-error strategy, until you reach optimal performance. The automatic approach includes grid search, which is exhaustive, and random search, which improves on grid search. More advanced automatic methods can also be used, such as Bayesian optimization, Genetic Algorithms, and Particle Swarm Optimization (PSO). Note that each of these methods has its strengths and weaknesses; choosing an appropriate approach is problem- and data-dependent. A minimal sketch of the automatic approach is given below.
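For illustration, here is a minimal sketch of the automatic approach using scikit-learn's GridSearchCV and RandomizedSearchCV on a DecisionTreeClassifier. The hyperparameter ranges and the Iris dataset are placeholder assumptions, not values taken from the question.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)  # placeholder dataset

# Hyperparameter search space for the classification tree (illustrative ranges)
param_grid = {
    "criterion": ["gini", "entropy"],   # impurity measure to minimize
    "max_depth": [2, 4, 6, 8, None],
    "min_samples_leaf": [1, 2, 5, 10],
}

# Exhaustive grid search: evaluates every combination in param_grid
grid = GridSearchCV(DecisionTreeClassifier(random_state=0),
                    param_grid, cv=5, scoring="accuracy")
grid.fit(X, y)
print("Grid search best:", grid.best_params_, grid.best_score_)

# Random search: samples a fixed budget of combinations, cheaper than the full grid
rand = RandomizedSearchCV(DecisionTreeClassifier(random_state=0),
                          param_grid, n_iter=10, cv=5,
                          scoring="accuracy", random_state=0)
rand.fit(X, y)
print("Random search best:", rand.best_params_, rand.best_score_)
```

Grid search cost grows exponentially with the number of hyperparameters, which is why random search (or the advanced methods mentioned above) is often preferred when the search space is large.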
If you are talking about hyperparameter optimization (as stated by Taiwo in a previous answer), I recommend checking automated algorithm configuration tools such as SMAC (https://www.automl.org/automated-algorithm-design/algorithm-configuration/smac) or irace (https://iridia.ulb.ac.be/irace); a rough sketch with SMAC is given below. I would also point you to the techniques used in so-called automated machine learning. More information is available in the book by Hutter, Kotthoff and Vanschoren (https://www.automl.org/book).
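To give a flavour of what such a tool looks like in practice, below is a rough sketch of tuning a classification tree with SMAC3's Python interface. The exact class names and arguments depend on the installed SMAC version, and the search ranges and dataset are only assumptions for illustration.

```python
from ConfigSpace import ConfigurationSpace
from smac import HyperparameterOptimizationFacade, Scenario
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)  # placeholder dataset

# Illustrative search space: the integer ranges are assumptions, not prescribed values
cs = ConfigurationSpace({"max_depth": (1, 20), "min_samples_leaf": (1, 20)})

def train(config, seed: int = 0) -> float:
    clf = DecisionTreeClassifier(max_depth=config["max_depth"],
                                 min_samples_leaf=config["min_samples_leaf"],
                                 random_state=seed)
    # SMAC minimizes the returned value, so return 1 - accuracy
    return 1.0 - cross_val_score(clf, X, y, cv=5).mean()

scenario = Scenario(cs, n_trials=50)             # budget of 50 configurations
smac = HyperparameterOptimizationFacade(scenario, train)
incumbent = smac.optimize()                      # best configuration found
print(incumbent)
```

The advantage over plain grid or random search is that SMAC models the relationship between hyperparameter values and performance, so it spends the evaluation budget on the most promising regions of the search space.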