I'm currently implementing variance-based sensitivity analysis. For uncorrelated inputs in a simple additive model, I get the expected behavior: the first-order Sobol indices sum to 1. When I add correlation between the inputs of the same additive model, the sum comes out greater than 1. For three inputs, I get something like
S1 = 0.117
S2 = 0.505
S3 = 0.792
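For concreteness, here is a minimal sketch of the kind of computation I mean, using the definition S_i = Var(E[Y | X_i]) / Var(Y). The additive model, the correlation matrix, and the quantile-binning estimator below are illustrative stand-ins rather than my exact code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simple additive test model: Y = X1 + X2 + X3
def model(X):
    return X.sum(axis=1)

# Correlated standard-normal inputs; this correlation matrix is illustrative
corr = np.array([[1.0, 0.5, 0.0],
                 [0.5, 1.0, 0.7],
                 [0.0, 0.7, 1.0]])
n = 200_000
X = rng.multivariate_normal(np.zeros(3), corr, size=n)
Y = model(X)

# First-order index S_i = Var(E[Y | X_i]) / Var(Y), estimated by
# binning X_i on its quantiles and averaging Y within each bin.
def first_order_index(xi, y, bins=50):
    edges = np.quantile(xi, np.linspace(0.0, 1.0, bins + 1))
    idx = np.clip(np.digitize(xi, edges[1:-1]), 0, bins - 1)
    counts = np.bincount(idx, minlength=bins)
    cond_means = np.bincount(idx, weights=y, minlength=bins) / counts
    # Weighted variance of the conditional means estimates Var(E[Y | X_i])
    var_cond_mean = (counts * (cond_means - y.mean()) ** 2).sum() / len(y)
    return var_cond_mean / y.var()

S = [first_order_index(X[:, i], Y) for i in range(3)]
print(S, sum(S))  # with these correlations the sum comes out well above 1
```

Zeroing out the off-diagonal entries of `corr` brings the sum back to roughly 1, which matches the uncorrelated behavior described above.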
My understanding is that the sum of the first-order Sobol indices should always be less than or equal to one, but I am no expert (in fact, I'm a novice). Does anyone know whether the indices exhibit any special behavior for correlated inputs such that their sum can exceed 1? Or is it more likely that I've made an error?
Thank you in advance for your help.