The standard deviation is defined as the square root of the variance. From this definition, the variance will be less than the standard deviation whenever the variance is less than 1. For example, if the variance is 0.01, then the standard deviation is sqrt(0.01) = 0.1, which is greater than the variance.
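If it helps to see this numerically, here is a minimal sketch in Python (assuming NumPy is available; the sample array is just made-up illustration data):

```python
import numpy as np

# Direct check of the example above: variance below 1, so sd > variance
variance = 0.01
std = np.sqrt(variance)
print(std, std > variance)   # 0.1 True

# Same relationship with a small (hypothetical) sample whose variance is below 1
data = np.array([0.1, 0.2, 0.15, 0.25, 0.3])
var = data.var(ddof=1)       # sample variance
sd = data.std(ddof=1)        # sample standard deviation = sqrt(sample variance)
print(var, sd, sd > var)     # variance < 1, so the sd comes out larger
```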
I don't understand your definition of the variance in terms of the standard deviation divided by the mean...
Now I also have this problem. My data is not normally distributed, and I get a variance of 0.45 with a standard deviation of approximately 0.65. What does this mean? How can we interpret this situation?