The complexity of an algorithm is usually expressed in Big O notation, which plays an important role in finding efficient algorithms. What is the time complexity of an MLP and of other ANNs?
This depends on the architecture of the network. For a trained MLP the complexity of classification (the forward propagation) is roughly:
1. The number of multiplications needed to compute the activations of all neurons in the i-th layer (one weight-vector dot product per neuron) is NumberOfNodesInLayer(i) * NumberOfNodesInLayer(i-1).
Since separating concave regions is possible with 3 layers, let's assume we have at most 3. Given that the last layer contains only 1 neuron, which outputs the classification of the input, we have 2 layers left over which to distribute the remaining n-1 neurons. The worst case (regarding complexity) is that the neurons are distributed equally, which yields ((n-1)/2)^2 \in O(n^2) multiplications (a code sketch of this counting follows the list).
2. Since the output (activation) functions of the neurons are in general very cheap to compute, I would assume their cost to be constant per neuron. Given a network with n neurons, this step is in O(n).
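To make the counting concrete, here is a minimal sketch in Python/NumPy, assuming a hypothetical layout of 4 inputs, two hidden layers of 8 neurons, and 1 output neuron; the names forward_pass_cost and forward are illustrative, not taken from any library. It performs one plain forward pass and counts the multiplications per the estimate in point 1, with the O(1)-per-neuron activation from point 2.

```python
import numpy as np

def forward_pass_cost(layer_sizes):
    """Number of scalar multiplications for one forward pass."""
    return sum(n_prev * n_cur for n_prev, n_cur in zip(layer_sizes, layer_sizes[1:]))

def forward(x, weights, biases):
    """Plain forward propagation; the activation (tanh) costs O(1) per neuron."""
    a = x
    for W, b in zip(weights, biases):
        # W @ a performs NumberOfNodesInLayer(i) * NumberOfNodesInLayer(i-1) multiplications
        a = np.tanh(W @ a + b)
    return a

# Hypothetical layout: 4 inputs, two hidden layers of 8 neurons, 1 output neuron
sizes = [4, 8, 8, 1]
rng = np.random.default_rng(0)
weights = [rng.standard_normal((n_out, n_in)) for n_in, n_out in zip(sizes, sizes[1:])]
biases = [rng.standard_normal(n_out) for n_out in sizes[1:]]

print(forward_pass_cost(sizes))                       # 4*8 + 8*8 + 8*1 = 104
print(forward(rng.standard_normal(4), weights, biases))
```

Note that for a fixed trained network these layer sizes never change, so the multiplication count is a fixed number per classified input.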
Given that the number of neurons n for a given trained network is fixed, it can be regarded as a constant, so the overall O(n^2) cost reduces to O(1) per classified input.
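As a quick numerical check of the worst case (the figures are only illustrative): with n = 9 neurons, reserving 1 for the output leaves 8 to split evenly into two hidden layers of 4, so the dominant cost is 4*4 = ((9-1)/2)^2 = 16 multiplications between those layers. Doubling n roughly quadruples that count, which is the O(n^2) growth, yet for any single fixed network the count never changes, hence the constant cost per input.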