I am trying to determine the robustness of a regular square network of NxN nodes. I am doing this by means of the average node connectivity, which is the average of local node connectivity over all pairs of nodes.

I am doing this in Python, using the NetworkX function linked at the bottom of this question.

The problem: if N is relatively small, the calculation is pretty fast, but as N grows it becomes computationally demanding.

My question: is there a faster way of measuring the robustness of a network, other than using the average node connectivity? 

These are the timings I get when computing the average node connectivity for regular networks of varying size:

  • N=10 (100 nodes) -> 6.81 seconds per iteration;
  • N=20 (400 nodes) -> 486 seconds per iteration (circa 8 min).
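For reference, this is a minimal sketch of the kind of setup that produces the timings above. It assumes the square lattice is built with `nx.grid_2d_graph` (my reconstruction; the original graph construction is not shown) and times NetworkX's `average_node_connectivity`:

```python
import time
import networkx as nx

def time_avg_connectivity(n):
    """Build an n x n square grid graph and time average_node_connectivity."""
    G = nx.grid_2d_graph(n, n)  # regular square lattice with n*n nodes
    start = time.perf_counter()
    # averages local node connectivity over all pairs of nodes,
    # solving a max-flow problem per pair -- hence the rapid slowdown
    k_avg = nx.average_node_connectivity(G)
    elapsed = time.perf_counter() - start
    return k_avg, elapsed

# small example; n=10 already takes several seconds on my machine
k_avg, t = time_avg_connectivity(4)
print(f"average node connectivity = {k_avg:.3f} ({t:.2f} s)")
```

The per-pair max-flow computation is what makes the cost grow so quickly with N, since the number of node pairs scales as O(N^4) for an N×N grid.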

https://networkx.github.io/documentation/latest/reference/generated/networkx.algorithms.connectivity.connectivity.average_node_connectivity.html
