Complex network data sets are now reaching sizes at which their analysis strains algorithms, software, and hardware. One possible way to cope is to reduce the amount of data while preserving the important information. I'm specifically interested in methods that, given a complex network as input, filter out a significant fraction of the edges while preserving structural properties such as the degree distribution, connected components, clustering coefficients, community structure, and centrality. Another term used in this context is "backbone": a subset of important edges that represents the network's structure.
There are methods that sparsify/filter/sample edges while preserving one specific property. But are there any methods that aim to preserve a large and diverse set of properties simultaneously?
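To make concrete what I mean by "filter and then check", here is a minimal sketch of the kind of before/after comparison I have in mind. It uses networkx (an assumption about tooling) and uniform random edge sampling purely as a naive baseline; a real backbone/sparsification method would presumably do much better on these measures. The graph, retention ratio, and function names are illustrative only.

```python
# Sketch: random edge sampling as a baseline sparsifier, then a quick
# comparison of a few structural properties before and after.
import random
import networkx as nx

def random_edge_sparsify(G, keep_fraction=0.5, seed=42):
    """Keep a uniform random subset of edges; all nodes are retained."""
    rng = random.Random(seed)
    edges = list(G.edges())
    kept = rng.sample(edges, int(keep_fraction * len(edges)))
    H = nx.Graph()
    H.add_nodes_from(G.nodes())
    H.add_edges_from(kept)
    return H

def summarize(G, label):
    """Print a few of the properties a good sparsifier should preserve."""
    degrees = [d for _, d in G.degree()]
    print(f"{label}: nodes={G.number_of_nodes()}, edges={G.number_of_edges()}, "
          f"mean degree={sum(degrees) / len(degrees):.2f}, "
          f"avg clustering={nx.average_clustering(G):.3f}, "
          f"components={nx.number_connected_components(G)}")

if __name__ == "__main__":
    G = nx.barabasi_albert_graph(1000, 5, seed=1)  # stand-in for a real network
    H = random_edge_sparsify(G, keep_fraction=0.5)
    summarize(G, "original")
    summarize(H, "sparsified")
```

Random sampling tends to distort clustering and can disconnect the graph, which is exactly why I'm looking for methods designed to preserve several such properties at once.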