It has been argued that both the capacity and the speed of a neural network depend on the computational cost of its main building block. But is there an objective (or at least quantitative) criterion for measuring and comparing the computational cost of the traditional neuron models (integrate-and-fire, Izhikevich's, Hodgkin and Huxley's)?
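One candidate criterion, used for example in Izhikevich's 2004 comparison of spiking neuron models, is the computational cost per millisecond of simulated time: either the number of floating-point operations per update step times the number of steps the model needs, or, as a rough proxy, measured wall-clock time. The sketch below illustrates the idea with one Euler step of each of the three models; the parameter values, time steps, and input currents are illustrative assumptions, not canonical settings.

```python
import timeit
from math import exp


def lif_step(v, I, dt=0.1):
    """Leaky integrate-and-fire: one linear ODE plus a threshold test."""
    v += dt * (-(v + 65.0) / 10.0 + I)
    if v >= -50.0:          # spike threshold (illustrative value)
        v = -65.0           # reset
    return v


def izh_step(v, u, I, dt=0.1):
    """Izhikevich model: two coupled ODEs, a handful of multiply-adds."""
    v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
    u += dt * 0.02 * (0.2 * v - u)
    if v >= 30.0:           # spike cutoff
        v, u = -65.0, u + 8.0
    return v, u


def hh_step(v, m, h, n, I, dt=0.01):
    """Hodgkin-Huxley: four ODEs with exponential rate functions,
    and a ~10x smaller stable time step than the simpler models."""
    am = 0.1 * (v + 40.0) / (1.0 - exp(-(v + 40.0) / 10.0))
    bm = 4.0 * exp(-(v + 65.0) / 18.0)
    ah = 0.07 * exp(-(v + 65.0) / 20.0)
    bh = 1.0 / (1.0 + exp(-(v + 35.0) / 10.0))
    an = 0.01 * (v + 55.0) / (1.0 - exp(-(v + 55.0) / 10.0))
    bn = 0.125 * exp(-(v + 65.0) / 80.0)
    m += dt * (am * (1.0 - m) - bm * m)
    h += dt * (ah * (1.0 - h) - bh * h)
    n += dt * (an * (1.0 - n) - bn * n)
    i_na = 120.0 * m ** 3 * h * (v - 50.0)
    i_k = 36.0 * n ** 4 * (v + 77.0)
    i_l = 0.3 * (v + 54.4)
    v += dt * (I - i_na - i_k - i_l)
    return v, m, h, n


def run_lif(ms, I=2.0, dt=0.1):
    v = -65.0
    for _ in range(int(ms / dt)):
        v = lif_step(v, I, dt)
    return v


def run_izh(ms, I=10.0, dt=0.1):
    v, u = -65.0, -13.0
    for _ in range(int(ms / dt)):
        v, u = izh_step(v, u, I, dt)
    return v


def run_hh(ms, I=2.0, dt=0.01):
    v, m, h, n = -65.0, 0.05, 0.6, 0.32
    for _ in range(int(ms / dt)):
        v, m, h, n = hh_step(v, m, h, n, I, dt)
    return v


# Wall-clock time to simulate the same 100 ms of activity with each model.
t_lif = timeit.timeit(lambda: run_lif(100), number=10)
t_izh = timeit.timeit(lambda: run_izh(100), number=10)
t_hh = timeit.timeit(lambda: run_hh(100), number=10)
print(f"LIF: {t_lif:.4f}s  Izhikevich: {t_izh:.4f}s  HH: {t_hh:.4f}s")
```

Counting operations per step gives the same ordering in a hardware-independent way: LIF needs a few multiply-adds per step, Izhikevich roughly a dozen, and Hodgkin-Huxley dozens of multiply-adds plus several exponentials, at a much finer time step. Whether that fully captures "computational cost" (e.g. memory traffic, parallelizability) is exactly the open part of the question.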
