When designing a convolutional neural network (CNN), several design criteria can be adjusted to suit the specific problem at hand. These criteria can be grouped into a number of categories and subcategories.
The criteria that shape a customized CNN's architecture and performance include the number of layers; the types of layers (convolutional, pooling, or fully connected); the choice of activation functions (e.g., ReLU or sigmoid for hidden layers, softmax for classification outputs); the optimization algorithm (e.g., SGD, Adam, or RMSprop); and regularization and normalization techniques (e.g., dropout and batch normalization). In addition, hyperparameters such as the learning rate, batch size, and input preprocessing method can be tuned, making CNNs a highly versatile and adaptable framework for a wide range of applications.
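To make the layer types above concrete, the following is a minimal NumPy-only sketch (not a full CNN, and not tied to any particular framework) of the three core compute units mentioned: a convolution, a ReLU activation, and a max-pooling step. The input image and kernel values are hypothetical, chosen only for illustration.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2D cross-correlation of a single-channel image with a kernel."""
    h, w = image.shape
    kh, kw = kernel.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Elementwise ReLU activation: max(x, 0)."""
    return np.maximum(x, 0.0)

def max_pool(x, size=2):
    """Non-overlapping max pooling with a size x size window."""
    h, w = x.shape
    h, w = h - h % size, w - w % size  # truncate to a multiple of the pool size
    return x[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

# Hypothetical 6x6 input and a 3x3 vertical-edge kernel (illustrative values).
img = np.arange(36, dtype=float).reshape(6, 6)
kern = np.array([[-1.0, 0.0, 1.0]] * 3)

# One "layer" of each unit: 6x6 -> conv -> 4x4 -> ReLU -> 4x4 -> pool -> 2x2
feat = max_pool(relu(conv2d(img, kern)))
print(feat.shape)  # (2, 2)
```

In a real CNN these operations are stacked many times, the kernel values are learned by the chosen optimizer rather than hand-set, and the pooled feature maps eventually feed into fully connected layers.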