Which is better between a Radial Basis Function Neural Network (RBFNN) and a Support Vector Machine (SVM) in terms of function approximation? Has anyone already conducted such a comparison for function approximation? Thanks
I agree with Rahman Peimanker's answer based on the NFL theorem (published 21 years ago). No one can claim to have the best algorithm for regression or classification, although many people still do so too frequently (especially, I regret to say, in ResearchGate Q&As).
But there is an additional issue: do you want to do function approximation, or nonlinear regression? In other words, do you want to approximate a known function, or do you want to build a model from experimental data? In the latter case, you should estimate the standard deviation of the noise on your data and compare it to the root mean square error (RMSE) on your test set. If the RMSE is significantly larger than the standard deviation of the noise, it is worthwhile to improve your model or to try another one. If the RMSE is of the same order of magnitude as the standard deviation of the noise, you have done the best that anyone can do. If the RMSE is significantly smaller than the standard deviation of the noise, something is wrong (a test set too small to provide an accurate estimate of the generalization error, a test set that does not sample the data distribution adequately, a software bug, ...).
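To illustrate that diagnostic, here is a minimal sketch in Python with scikit-learn, on made-up data where the noise level is known by construction (in a real experiment you would have to estimate it, e.g. from replicated measurements). The data, model, and hyperparameters are all assumptions for illustration, not a recipe:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR

# Synthetic noisy data: y = sin(x) + Gaussian noise with known std sigma_noise
rng = np.random.default_rng(0)
sigma_noise = 0.1
x = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(x).ravel() + rng.normal(0.0, sigma_noise, size=500)

x_train, x_test, y_train, y_test = train_test_split(
    x, y, test_size=0.3, random_state=0
)

# Any regression model would do here; SVR with a Gaussian kernel as an example
model = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(x_train, y_train)
rmse = np.sqrt(np.mean((model.predict(x_test) - y_test) ** 2))

# rmse >> sigma_noise: the model can still be improved;
# rmse ~  sigma_noise: near the best achievable;
# rmse << sigma_noise: suspect the evaluation (test set size/coverage, bug).
print(f"test RMSE = {rmse:.3f}, noise std = {sigma_noise}")
```

The comparison is only meaningful between quantities in the same units, which is why the RMSE is set against the noise standard deviation rather than the variance.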
The structure of an RBFNN is almost the same as that of an SVM, and both are good at function approximation. Which one is better depends largely on how well one optimizes the hidden layer in the RBFNN and the kernel in the SVM. With an SVM, you can assign the activation function for each kernel differently, but this results in a more complex structure.
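The structural similarity can be seen by putting both models side by side: a minimal RBF network (k-means centres, Gaussian features, linear readout) and an SVR with the same Gaussian kernel. This is a sketch under assumed toy data and hyperparameters (centre count, gamma, C), not a tuned comparison:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import Ridge
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVR

# Toy regression problem (assumed data, for illustration only)
rng = np.random.default_rng(1)
x = rng.uniform(-3, 3, size=(400, 1))
y = np.sinc(x).ravel() + rng.normal(0.0, 0.05, size=400)
x_tr, x_te, y_tr, y_te = x[:300], x[300:], y[:300], y[300:]

# Minimal RBF network: unsupervised centres + Gaussian hidden layer + linear output
centers = KMeans(n_clusters=20, n_init=10, random_state=0).fit(x_tr).cluster_centers_
gamma = 1.0
phi_tr = rbf_kernel(x_tr, centers, gamma=gamma)
phi_te = rbf_kernel(x_te, centers, gamma=gamma)
rbfnn = Ridge(alpha=1e-3).fit(phi_tr, y_tr)
rmse_rbfnn = np.sqrt(np.mean((rbfnn.predict(phi_te) - y_te) ** 2))

# SVM regression with the same Gaussian kernel; here the "centres" are the
# support vectors chosen by the epsilon-insensitive optimization
svr = SVR(kernel="rbf", gamma=gamma, C=10.0, epsilon=0.01).fit(x_tr, y_tr)
rmse_svr = np.sqrt(np.mean((svr.predict(x_te) - y_te) ** 2))

print(f"RBF network RMSE: {rmse_rbfnn:.3f}, SVR RMSE: {rmse_svr:.3f}")
```

Both models are weighted sums of Gaussian bumps; the main difference is how the bump locations and weights are chosen (clustering plus least squares versus margin-based optimization), which is exactly where the tuning effort mentioned above goes.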
Thanks for your answer. For my problem, I have already applied a Bayesian Regularization ANN (BRANN) to time-series estimation. However, its robustness is questionable, although it is robust for predicting a single peak value. I then succeeded in integrating an optimization technique into the BRANN, and the integration indeed improves model robustness. In addition, I learned that RBFNNs and SVMs are also robust for function approximation, so I would like to compare these algorithms.
Thanks for your answer; I agree with your ideas. I had a brief look at your papers, and they are very useful. For my specific problem, i.e. function approximation, I have already compared the performance of single NN algorithms and integrated algorithms. I found that the performance depends on the dataset: there is indeed no universally superior algorithm.