What are the strengths of classifiers such as SVM, random forest, etc. in the field of regression?
https://elitedatascience.com/machine-learning-algorithms
https://towardsdatascience.com/comparative-study-on-classic-machine-learning-algorithms-24f9ff6ab222
Article Advantages and Disadvantages of Using Artificial Neural Netw...
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.47.8153&rep=rep1&type=pdf
https://ieeexplore.ieee.org/abstract/document/5997308/
https://www.pnas.org/content/88/24/11426.short
Article Comparing regression, naive Bayes, and random forest methods...
You can use SVR:
Support Vector Regression (SVR) using linear and non-linear kernels
A toy example of 1D regression using linear, polynomial, and RBF kernels:
import numpy as np
import matplotlib.pyplot as plt
from sklearn.svm import SVR

# Generate sample data
X = np.sort(5 * np.random.rand(40, 1), axis=0)
y = np.sin(X).ravel()

# Add noise to every fifth target
y[::5] += 3 * (0.5 - np.random.rand(8))

# Fit regression models with three different kernels
svr_rbf = SVR(kernel='rbf', C=100, gamma=0.1, epsilon=0.1)
svr_lin = SVR(kernel='linear', C=100, gamma='auto')
svr_poly = SVR(kernel='poly', C=100, gamma='auto', degree=3, epsilon=0.1, coef0=1)

# Look at the results
lw = 2
svrs = [svr_rbf, svr_lin, svr_poly]
kernel_label = ['RBF', 'Linear', 'Polynomial']
model_color = ['m', 'c', 'g']
fig, axes = plt.subplots(nrows=1, ncols=3, figsize=(15, 10), sharey=True)
for ix, svr in enumerate(svrs):
    axes[ix].plot(X, svr.fit(X, y).predict(X), color=model_color[ix], lw=lw,
                  label='{} model'.format(kernel_label[ix]))
    axes[ix].scatter(X[svr.support_], y[svr.support_], facecolor="none",
                     edgecolor=model_color[ix], s=50,
                     label='{} support vectors'.format(kernel_label[ix]))
    axes[ix].scatter(X[np.setdiff1d(np.arange(len(X)), svr.support_)],
                     y[np.setdiff1d(np.arange(len(X)), svr.support_)],
                     facecolor="none", edgecolor="k", s=50,
                     label='other training data')
    axes[ix].legend(loc='upper center', bbox_to_anchor=(0.5, 1.1), ncol=1,
                    fancybox=True, shadow=True)
fig.text(0.5, 0.04, 'data', ha='center', va='center')
fig.text(0.06, 0.5, 'target', ha='center', va='center', rotation='vertical')
fig.suptitle("Support Vector Regression", fontsize=14)
plt.show()
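Since the question also asks about random forests for regression, here is a minimal sketch (my own addition, not taken from the linked articles) using scikit-learn's RandomForestRegressor on the same toy data, plus a rough cross-validated comparison. All hyperparameters here (n_estimators, max_depth, C, gamma) are illustrative assumptions, not tuned values.

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import KFold, cross_val_score
from sklearn.svm import SVR

# Same toy data as the SVR example above
X = np.sort(5 * np.random.rand(40, 1), axis=0)
y = np.sin(X).ravel()
y[::5] += 3 * (0.5 - np.random.rand(8))

# Hyperparameters are illustrative assumptions, not tuned values
rf = RandomForestRegressor(n_estimators=100, max_depth=5, random_state=0)
rf.fit(X, y)

# 5-fold cross-validated R^2 as a rough comparison of the two models;
# shuffle so each fold covers the whole input range, since X is sorted
cv = KFold(n_splits=5, shuffle=True, random_state=0)
for name, model in [('SVR (RBF)', SVR(kernel='rbf', C=100, gamma=0.1)),
                    ('Random forest', rf)]:
    scores = cross_val_score(model, X, y, cv=cv, scoring='r2')
    print('{}: R^2 = {:.2f} +/- {:.2f}'.format(name, scores.mean(), scores.std()))

One practical difference this illustrates: the random forest needs no kernel choice or feature scaling, while SVR produces a smooth prediction curve whose shape depends strongly on C, gamma, and epsilon.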