A. The activation function is pre-defined before training begins. For example, in wavelet neural networks you can choose the Mexican hat, Gaussian, or Morlet wavelet as the activation function.
B. The interconnection (synaptic) weights between the input and hidden layers, and/or the hidden and output layers, as well as the bias terms, are adjusted during the learning process through gradient-based techniques or metaheuristic algorithms.
C. Input values remain unchanged, except in k-fold cross-validation, where a different fold of the data is held out in each run to serve as the testing set.
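Points A–C can be illustrated with a minimal training loop. This is just a sketch under illustrative assumptions (a toy sine-regression dataset, one hidden layer, plain gradient descent): the Mexican hat activation is fixed before training, the inputs never change, and only the weights and biases are updated.

```python
# Minimal sketch of points A-C. The dataset, network size, learning rate,
# and iteration count are all illustrative assumptions, not a recommendation.
import numpy as np

rng = np.random.default_rng(0)

def mexican_hat(x):
    # Point A: the activation is chosen and fixed before training.
    return (1.0 - x**2) * np.exp(-x**2 / 2.0)

# Point C: the input values are given by the dataset and never change.
X = rng.uniform(-2, 2, size=(50, 1))
y = np.sin(X)

# Point B: the trainable parameters are the weights and biases only.
W1, b1 = rng.normal(size=(1, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

lr = 0.01
losses = []
for _ in range(1000):
    # Forward pass through one hidden layer with the fixed activation.
    z = X @ W1 + b1
    h = mexican_hat(z)
    pred = h @ W2 + b2
    err = pred - y
    losses.append(float(np.mean(err**2)))
    # Backward pass: gradients w.r.t. weights and biases, never the inputs.
    dpred = 2 * err / len(X)
    dW2, db2 = h.T @ dpred, dpred.sum(axis=0)
    dh = dpred @ W2.T
    # Derivative of the Mexican hat wavelet: (x^3 - 3x) * exp(-x^2 / 2).
    dz = dh * (z**3 - 3 * z) * np.exp(-z**2 / 2.0)
    dW1, db1 = X.T @ dz, dz.sum(axis=0)
    for p, g in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        p -= lr * g
```

The loop only ever writes to `W1`, `b1`, `W2`, `b2`; both `X` and the activation function stay exactly as defined before training.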
B. The interconnection weights are the parameters you can tune during the training process. You may also have some free parameters inside the activation functions (for instance, the variance of a Gaussian function in an RBF network), but this is not very common. The values of the input signals do not change; they are given by your dataset.
To summarize, all the weights and biases are updated during the training process, along with any self-tuning parameters of the training algorithm itself. In addition, depending on the network design, the activation functions may include some trainable parameters of their own.
As far as I know, the activation function and the values of the input signals remain constant during training and model evolution, since you assign them at the beginning of the process. But the interconnection weight of each input parameter changes as the model evolves.