You get this result because the logarithm of zero diverges to negative infinity: NumPy returns -inf for np.log(0) and emits a RuntimeWarning. Note that what is highlighted in red is a warning, not an error: an error stops execution, while a warning does not.
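A minimal sketch of the behavior described above (the exact warning text may vary by NumPy version):

```python
import numpy as np

# np.log(0) emits a RuntimeWarning ("divide by zero encountered in log")
# but execution continues and the result is -inf:
result = np.log(0.0)
print(result)  # -inf

# An actual error would have stopped the script here;
# the warning did not, so this line still runs:
print("still running")
```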
"What shall I do?"
One solution I've seen to this problem (in the context of training neural networks) is to add a small positive value epsilon. You can use the same trick: just compute something equivalent to np.log(x_trans.var() + epsilon), where epsilon is a very small value (for example epsilon = 10**-5, i.e. 1e-5; note that in Python ^ is bitwise XOR, not exponentiation). Then, even if x_trans.var() == 0, you get np.log(epsilon), which is finite, and the warning disappears.
Khalda Zabel The general rule is to add it at the same place where the zero may occur. In your case, you can declare:
epsilon = 10**-5
then instead of np.log(x_trans.var()), use:
np.log(x_trans.var() + epsilon)
If that doesn't work, try adding epsilon in another way, for example to the values of x_trans themselves before computing the variance.
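Putting the steps above together, a minimal sketch (x_trans here is a hypothetical constant array, chosen so that its variance is exactly zero):

```python
import numpy as np

epsilon = 10**-5

# Hypothetical example: a constant series, so x_trans.var() == 0
x_trans = np.zeros(10)

# Without epsilon, np.log(x_trans.var()) would warn and return -inf.
# With epsilon, the argument is strictly positive and the result is finite:
safe_log_var = np.log(x_trans.var() + epsilon)
print(safe_log_var)  # log(1e-5), roughly -11.51
```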
Please note that the value you choose for epsilon may have a noticeable effect on the interpretation of your results. You know your data better than anyone else. Good luck.