Rank-driven Markov processes

Grinfeld, Michael and Knight, Philip and Wade, Andrew (2012) Rank-driven Markov processes. Journal of Statistical Physics, 146 (2). pp. 378-407. ISSN 0022-4715 (https://doi.org/10.1007/s10955-011-0368-7)

Preprint (filename: 1106.4194v1). License: Unspecified.

Abstract

We study a class of Markovian systems of N elements taking values in [0,1] that evolve in discrete time t via randomized replacement rules based on the ranks of the elements. These rank-driven processes are inspired by variants of the Bak–Sneppen model of evolution, in which the system represents an evolutionary ‘fitness landscape’ and which is famous as a simple model displaying self-organized criticality. Our main results concern long-time, large-N asymptotics for the general model in which, at each time step, K randomly chosen elements are discarded and replaced by independent U[0,1] variables, where the ranks of the elements to be replaced are chosen, independently at each time step, according to a distribution κ_N on {1,2,…,N}^K. We show that, under appropriate conditions on κ_N, the system exhibits threshold behavior at s* ∈ [0,1], where s* is determined by κ_N, and the marginal distribution of a randomly selected element converges to U[s*,1] as t → ∞ and N → ∞. Previous results in the literature cover only special cases of this class, namely the ‘mean-field’ or ‘random neighbor’ Bak–Sneppen model. Our proofs avoid the heuristic arguments of some of the previous work and use Foster–Lyapunov ideas; our results extend the existing ones and place them in their natural, more general context. We derive some more specialized results for the particular case K = 2. One of our technical tools is a result on convergence of stationary distributions for families of uniformly ergodic Markov chains on increasing state-spaces, which may be of independent interest.
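
To make the dynamics concrete, the following is a minimal simulation sketch of one member of the class of rank-driven processes described above. The specific choice of κ_N used here, always discarding the K lowest-ranked elements, is an illustrative assumption rather than the paper's general setting, and the function and parameter names (simulate, N, K, steps) are hypothetical.

```python
import numpy as np

def simulate(N=1000, K=2, steps=200_000, seed=None):
    """Illustrative rank-driven process: at each step, discard the K
    lowest-ranked elements and replace them by fresh U[0,1] draws.
    (The general model allows the K ranks to be drawn from an
    arbitrary distribution kappa_N on {1,...,N}^K.)"""
    rng = np.random.default_rng(seed)
    x = rng.uniform(size=N)                  # initial values in [0,1]
    for _ in range(steps):
        low = np.argpartition(x, K)[:K]      # indices of the K smallest values
        x[low] = rng.uniform(size=K)         # replace them by independent U[0,1]
    return x

if __name__ == "__main__":
    x = simulate(seed=0)
    # For large N and long times, the surviving values are expected to be
    # approximately uniform on [s*, 1] for a threshold s* depending on the
    # replacement rule; the low quantiles give a rough empirical indication.
    print("empirical 1st percentile:", np.quantile(x, 0.01))
    print("empirical minimum:", x.min())
```

Running this for a particular replacement rule gives only an empirical picture of the threshold behavior; the paper establishes the convergence rigorously for general κ_N satisfying its stated conditions.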