Preserving sparsity in dynamic network computation

Arrigo, Francesca and Higham, Desmond J. (2016) Preserving sparsity in dynamic network computation. In: Cherifi, Hocine, Gaito, Sabrina, Quattrociocchi, Walter and Sala, Alessandra (eds.) Complex Networks & Their Applications V. Studies in Computational Intelligence, 693. Springer, pp. 147-157. ISBN 9783319509013. https://doi.org/10.1007/978-3-319-50901-3_12


Abstract

Time-sliced networks describing human-human digital interactions are typically large and sparse. This is the case, for example, for pairwise connectivity derived from social media, voice calls or physical proximity, when measured over seconds, minutes or hours. However, if we wish to quantify and compare the overall time-dependent centrality of the network nodes, then we should account for the global flow of information through time. Because the time-dependent edge structure typically allows information to diffuse widely around the network, a natural summary of sparse but dynamic pairwise interactions will generally take the form of a large dense matrix. For this reason, computing nodal centralities for a time-dependent network can be extremely expensive in terms of both computation and storage; much more so than for a single, static network. In this work, we focus on the case of dynamic communicability, which leads to broadcast and receive centrality measures. We derive a new algorithm for computing time-dependent centrality that works with a sparsified version of the dynamic communicability matrix. In this way, the computation and storage requirements are reduced to those of a sparse, static network at each time point. The new algorithm is justified from first principles and then tested on a large scale data set. We find that even with very stringent sparsity requirements (retaining no more than ten times the number of nonzeros in the individual time slices), the algorithm accurately reproduces the list of highly central nodes given by the underlying full system. This allows us to capture centrality over time with a minimal level of storage and with a cost that scales only linearly with the number of time points.
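
To make the idea concrete, the following is a minimal sketch of the kind of iteration the abstract describes. It assumes the standard Grindrod-Higham dynamic communicability recurrence, Q_k = Q_{k-1}(I - a A_k)^{-1}, where A_k is the adjacency matrix of the k-th time slice and a is a downweighting parameter satisfying a < 1/rho(A_k) for every slice. The keep-the-largest-entries sparsification step, the nonzero budget, and all function names are illustrative assumptions based on the abstract (which states only that at most ten times the slice-level nonzeros are retained); they are not taken from the paper itself.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def sparsified_broadcast_receive(slices, a=0.01, budget_factor=10):
    """Sketch of sparsity-preserving dynamic communicability.

    slices: list of sparse (CSR) adjacency matrices A_1, ..., A_M.
    a: downweighting parameter; must satisfy a < 1/rho(A_k) for each slice
       so that each resolvent (I - a A_k)^{-1} exists and is nonnegative.
    budget_factor: retain at most this multiple of the largest slice's
       nonzeros after each update (hypothetical truncation rule).
    Returns (broadcast, receive) centrality vectors: row and column sums
    of the final sparsified communicability matrix.
    """
    n = slices[0].shape[0]
    S = sp.identity(n, format="csr")
    max_nnz = budget_factor * max(A.nnz for A in slices)
    for A in slices:
        # One resolvent step, S <- S (I - a A)^{-1}, obtained by solving
        # the transposed system (I - a A)^T X^T = S^T for X.
        M = (sp.identity(n) - a * A).T.tocsc()
        X = spla.spsolve(M, S.T.tocsc())
        S = sp.csr_matrix(X).T.tocsr()
        # Sparsify: keep only the max_nnz largest-magnitude entries, so the
        # stored matrix never grows beyond a fixed multiple of slice sparsity.
        if S.nnz > max_nnz:
            C = S.tocoo()
            keep = np.argsort(np.abs(C.data))[-max_nnz:]
            S = sp.coo_matrix((C.data[keep], (C.row[keep], C.col[keep])),
                              shape=(n, n)).tocsr()
    broadcast = np.asarray(S.sum(axis=1)).ravel()  # row sums
    receive = np.asarray(S.sum(axis=0)).ravel()    # column sums
    return broadcast, receive

# Example on synthetic time slices:
# slices = [sp.random(500, 500, density=0.005, format="csr",
#                     random_state=k) for k in range(20)]
# b, r = sparsified_broadcast_receive(slices)
```

Because each step involves only sparse linear solves and a truncation, the per-slice cost and storage stay at the level of a sparse static network, which is how the overall cost can scale linearly in the number of time points.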