Your goal isn't clear to me, unfortunately, so my answer may not be as helpful as you would like.
The short answer, as to how to get an r or r-squared of zero, is to select an independent variable that has no correlation with the dependent variable. How to do this?
1. Regress the chosen IV on the DV, and save the residual values (X - Xestimated).
2. Those residuals (your "new IV") are, by construction, orthogonal to the DV, so they will have a zero correlation with it.
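The two steps above can be sketched numerically. This is a minimal illustration with made-up data (the variable names and the seed are arbitrary); it regresses the candidate IV on the DV, keeps the residuals, and checks that a regression of the DV on those residuals gives an R-square of essentially zero.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
y = rng.normal(size=n)             # dependent variable
x = 0.6 * y + rng.normal(size=n)   # candidate IV, correlated with y

# Step 1: regress the candidate IV on the DV and save the residuals.
slope, intercept = np.polyfit(y, x, 1)
x_resid = x - (slope * y + intercept)

# Step 2: the residuals are uncorrelated with the DV...
print(np.corrcoef(x_resid, y)[0, 1])   # ~0 (machine precision)

# ...so regressing y on x_resid yields R-square of ~0.
b1, b0 = np.polyfit(x_resid, y, 1)
y_hat = b1 * x_resid + b0
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print(r2)   # ~0
```

The correlation is zero up to floating-point error because OLS residuals are exactly orthogonal to the regressor they were computed against.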
Since you want to add a variable while keeping the same data and the other variables, least squares will not do what you want. (OLS is a special case of weighted least squares, and may not even be the case you want, by the way.) The fit will use the data to find the OLS (or WLS) coefficients with the smallest sum of squared estimated residuals. An additional variable that does not help cannot reduce R-square to zero when useful data and variables are already present: the regression will rely mainly on what helps. At least that's my take on this. R-square is often not a good measure because it can be misinterpreted, but I just don't see a way to totally wreck least squares if you already have something that works. I first thought of forcing a change in sign of a coefficient, but then realized least squares just won't let you do that. So I think the answer is that you cannot do this. Perhaps some more background about what you are trying to do would help.
Try generating two random sequences from very different distributions. Plot one sequence against the other. That will tell you if this idea works. David Morse has an excellent suggestion. Best, David Booth
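The suggestion above can be tried in a few lines. This is a quick sketch (distributions and seed are arbitrary choices): two independently generated sequences, even from very different distributions, should show a sample correlation hovering near zero.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1000
a = rng.exponential(scale=2.0, size=n)   # heavily skewed distribution
b = rng.uniform(-1.0, 1.0, size=n)       # symmetric, bounded distribution

# Independent draws, so the sample correlation should be near zero
# (roughly within a few multiples of 1/sqrt(n)).
r = np.corrcoef(a, b)[0, 1]
print(r)
```

A scatter plot of `a` against `b` (e.g. with matplotlib) would show the same thing visually: a structureless cloud.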
But I don't think that was the question. Esra, as I read this, wants to add another independent variable to his existing OLS regression. Yes, if you "...select an independent variable that has no correlation with the dependent variable," that alone would give an R-square of zero, but I think Esra already has at least one independent variable and a dependent variable, with data. He wants to add yet another independent variable. I think he might lower adjusted R-square, but not make R-square zero.
Or this question could have been intended to be less complicated.
You can't drop R2 to 0 by adding a variable. Say you have one IV correlated with the DV by b. You add a second IV, correlated with the DV by c, and the two IVs are correlated with each other by a. By forcing R2 = 0 you also force b' = c' = 0, where the primes denote the standardized partial coefficients (the effect of each variable given the presence of the other in the model). From path analysis, c = c' + a*b' = (0) + a(0) = 0, and likewise b = b' + a*c' = (0) + a(0) = 0. So b = b' = c = c' = 0. In conclusion, for the model to reach R2 = 0 after adding a second IV, the new variable must be uncorrelated with the DV and the model must already be at R2 = 0.
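The path-analysis algebra can be verified numerically. This sketch (simulated data, arbitrary coefficients) standardizes everything, computes the correlations a, b, c, derives the partials b' = (b - a*c)/(1 - a^2) and c' = (c - a*b)/(1 - a^2), and checks that they match the OLS coefficients and satisfy b = b' + a*c' and c = c' + a*b'.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)
y = 0.4 * x1 + 0.3 * x2 + rng.normal(size=n)

# Standardize so correlations and path coefficients line up.
z = lambda v: (v - v.mean()) / v.std()
x1, x2, y = z(x1), z(x2), z(y)

a = np.corrcoef(x1, x2)[0, 1]   # IV-IV correlation
b = np.corrcoef(x1, y)[0, 1]    # IV1-DV correlation
c = np.corrcoef(x2, y)[0, 1]    # IV2-DV correlation

# Standardized partial (path) coefficients from the correlation algebra:
b_p = (b - a * c) / (1 - a ** 2)
c_p = (c - a * b) / (1 - a ** 2)

# They equal the OLS coefficients of y regressed on [x1, x2]:
X = np.column_stack([x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(b_p, c_p, beta)

# And the decomposition holds: b = b' + a*c',  c = c' + a*b'.
print(b, b_p + a * c_p)
print(c, c_p + a * b_p)
```

With standardized variables these are exact sample identities, so the match is to machine precision, whatever the simulated coefficients.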
Basically, the problem is that the added variable would need to cancel out the correlations between all the previous IVs and the DV while itself having no link to the DV.
Practically, adding useless variables (collinear copies or uncorrelated noise) cannot lower the in-sample R2 at all; at best it leaves R2 essentially unchanged. What drops is the adjusted R2, which penalizes the extra parameter.
Maybe you were more interested in suppressor variables? I can link you my article on mediation, as it addresses most of the equations above.