Pinning model in random correlated environment: appearance of an infinite disorder regime

Research paper by Quentin Berger

Indexed on: 09 Oct '13 · Published on: 09 Oct '13 · Published in: Mathematics - Probability


We study the influence of a correlated disorder on the localization phase transition in the pinning model. When correlations are strong enough, a strong disorder regime arises: large attractive regions appear frequently in the environment. We present here a pinning model in a random binary ({-1,+1}-valued) environment. Defining strong disorder by the requirement that the probability of occurrence of a large attractive region is sub-exponential in its size, we prove that strong disorder is equivalent to the critical point being equal to its minimal possible value. We also stress that in the strong disorder regime the phase transition is smoother than in the homogeneous case, whatever the critical exponent of the homogeneous model is: disorder is therefore always relevant. We illustrate these results with the example of an environment based on the sign of a correlated Gaussian sequence, for which we show that the phase transition is of infinite order in the presence of strong disorder. Our results contrast with those known in the literature, in particular in the case of an IID disorder, where the question of the influence of disorder on the critical properties is answered via the so-called Harris criterion, and where a conventional relevance/irrelevance picture holds.
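As an illustration only, and not the paper's exact construction, the sketch below builds a binary {-1,+1} environment as the sign of a correlated Gaussian sequence. A stationary AR(1) chain with correlation coefficient `rho` is an assumption made here for concreteness; the paper considers more general correlated Gaussian sequences. The helper `longest_attractive_run` measures the largest stretch of attractive (+1) sites, the kind of large attractive region whose sub-exponential occurrence probability is used above to define strong disorder.

```python
import numpy as np

def binary_environment(n, rho, seed=0):
    """Binary {-1,+1} environment given by the sign of a stationary
    Gaussian AR(1) sequence; Corr(g_i, g_j) = rho**|i-j| (an assumed
    example of a correlated Gaussian sequence, not the paper's setup)."""
    rng = np.random.default_rng(seed)
    g = np.empty(n)
    g[0] = rng.standard_normal()
    noise = rng.standard_normal(n - 1)
    for i in range(1, n):
        # stationary AR(1) recursion keeps the marginal variance equal to 1
        g[i] = rho * g[i - 1] + np.sqrt(1.0 - rho**2) * noise[i - 1]
    return np.where(g >= 0, 1, -1)

def longest_attractive_run(omega):
    """Length of the longest consecutive run of +1 (attractive) sites."""
    best = cur = 0
    for w in omega:
        cur = cur + 1 if w == 1 else 0
        best = max(best, cur)
    return best

if __name__ == "__main__":
    omega = binary_environment(100_000, rho=0.9)
    print("longest attractive region:", longest_attractive_run(omega))
```

Increasing `rho` toward 1 makes long attractive runs markedly more likely than in the IID case (`rho = 0`), which is the qualitative mechanism behind the strong disorder regime described above.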