Is there any hypothesis test that would allow one to accept or reject the hypothesis that self-similar behaviour (https://en.wikipedia.org/wiki/Self-similarity) is present or absent in a time series?
While not a test for self-similarity per se, a hypothesis test for nonlinearity, typically used to detect chaos in time series, is the BDS (Brock-Dechert-Scheinkman) test. Its null hypothesis is that the series is independently and identically distributed; applied to the residuals of a fitted linear model, a rejection at a given p-value points to remaining nonlinear structure. You can use this first to rule out a purely linear explanation and then use other tools to measure the self-similarity.
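A minimal sketch of running the BDS test in Python, assuming the `bds` function in `statsmodels.tsa.stattools` (its exact signature may vary across statsmodels versions):

```python
import numpy as np
from statsmodels.tsa.stattools import bds

rng = np.random.default_rng(0)
x = rng.standard_normal(1000)         # i.i.d. noise: the test should NOT reject

# Test embedding dimensions 2 and 3; small p-values reject the i.i.d. null,
# which (on linear-model residuals) suggests nonlinear dependence.
bds_stat, pvalue = bds(x, max_dim=3)
print(bds_stat, pvalue)
```

In practice you would run this on the residuals of, say, a fitted ARMA model rather than on the raw series, so that a rejection cannot be explained by linear autocorrelation alone.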
Remember, though, that self-similarity and long-range dependence (LRD) are different things. Self-similarity is a scaling behaviour, while LRD means the autocorrelation function never decays fully to zero, so that the sum (integral) of the autocorrelations diverges.
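One common way to quantify both notions is the Hurst exponent H. Below is a rough sketch of estimating H by rescaled-range (R/S) analysis in plain NumPy; the window sizes and fitting range are arbitrary choices for illustration, not a definitive implementation. H near 0.5 suggests no long-range dependence, H above 0.5 suggests persistence:

```python
import numpy as np

def hurst_rs(x, window_sizes=(16, 32, 64, 128, 256)):
    """Estimate the Hurst exponent via rescaled-range (R/S) analysis."""
    x = np.asarray(x, dtype=float)
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_values = []
        for start in range(0, len(x) - n + 1, n):
            chunk = x[start:start + n]
            dev = np.cumsum(chunk - chunk.mean())   # cumulative deviations from the mean
            r = dev.max() - dev.min()               # range of the cumulative deviations
            s = chunk.std(ddof=1)                   # standard deviation of the chunk
            if s > 0:
                rs_values.append(r / s)
        if rs_values:
            log_n.append(np.log(n))
            log_rs.append(np.log(np.mean(rs_values)))
    # The slope of log(R/S) against log(n) estimates H.
    return np.polyfit(log_n, log_rs, 1)[0]

rng = np.random.default_rng(1)
print(hurst_rs(rng.standard_normal(4096)))   # close to 0.5 for white noise
```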
Imagine your time-series data looks like a curve, e.g., a stock price. You can derive all the bends of the curve X, as shown in Figure 1 of the first paper cited below:
There are far more small bends than large ones, and the large bends are self-similar to the set of bends as a whole. To see this kind of self-similarity clearly, you need to apply so-called head/tail breaks to the bends X and derive the ht-index, which characterizes how many times the scaling pattern of far more small bends than large ones recurs (see the sketch below).
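A minimal sketch of head/tail breaks and the ht-index following the idea in the papers below: repeatedly split the data at its mean, keep the head (values above the mean), and count how often the "far more small things than large things" pattern recurs. The 40% head-proportion limit is a commonly cited rule of thumb, assumed here:

```python
import numpy as np

def ht_index(values, head_limit=0.4):
    """Count how many times the 'far more small than large' pattern recurs."""
    values = np.asarray(values, dtype=float)
    ht = 1                                     # the whole data set is one level
    while len(values) > 1:
        head = values[values > values.mean()]  # split at the mean, keep the head
        # Stop once the head is no longer a clear minority of the values.
        if len(head) == 0 or len(head) / len(values) > head_limit:
            break
        ht += 1                                # the scaling pattern recurs
        values = head
    return ht

rng = np.random.default_rng(2)
print(ht_index(rng.pareto(1.5, 10000)))    # heavy-tailed data: ht-index >> 1
print(ht_index(rng.uniform(0, 1, 10000)))  # uniform data: ht-index stays at 1
```

Applied to the bend sizes of a curve, a high ht-index indicates that the scaling pattern of far more small bends than large ones repeats across many levels, which is the signature of self-similarity in this framework.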
Jiang B. and Yin J. (2014), Ht-index for quantifying the fractal or scaling structure of geographic features, Annals of the Association of American Geographers, 104(3), 530–541.
Jiang B. (2013), Head/tail breaks: A new classification scheme for data with a heavy-tailed distribution, The Professional Geographer, 65(3), 482–494.