To assess the degree of dependence between two variables you can use mutual information (MI) or a statistical correlation measure such as Pearson's r, Spearman's rho, or Kendall's tau.
The most popular measure, Pearson's correlation, is a well-known similarity measure between two random variables. If two variables are perfectly linearly dependent, their correlation coefficient is ±1; if they are uncorrelated, the coefficient is 0. Note that the coefficient is invariant to scaling and translation, and that it only captures linear dependence: a coefficient of 0 does not imply the variables are independent.
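As a minimal sketch of these properties, here is Pearson's r computed from its definition in plain Python (no libraries assumed): the coefficient is ±1 for an exact linear relationship and unchanged when one variable is scaled and shifted.

```python
import math

def pearson(x, y):
    # Pearson's r: covariance of x and y divided by the product
    # of their standard deviations.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.0 * v for v in x]                     # y = 2x: perfectly linear

print(pearson(x, y))                         # ≈ 1.0
print(pearson(x, [-v for v in y]))           # ≈ -1.0 (decreasing linear)
# Invariance to scaling and translation: r is unchanged.
print(pearson([10.0 * v + 3.0 for v in x], y))  # still ≈ 1.0
```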
Mutual information is a more general measure: it quantifies the reduction in uncertainty about Y after observing X. Unlike correlation, MI can capture non-monotonic and other non-linear relationships, and for discrete variables it is zero if and only if the two variables are independent.