Standard deviation is one of the most widely used measures of dispersion: it quantifies how much the values in a data set spread around their mean. However, I do not know whether it is also suitable when the data form a time series. So my question is: is standard deviation suitable for measuring the dispersion of time series data?
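For concreteness, here is a minimal Python sketch of what I mean. The series itself is hypothetical (a made-up trend plus noise, not real data); the point is just that I would compute a single sample standard deviation over the whole series:

```python
import numpy as np

# Hypothetical example only: 100 observations with an upward trend plus noise.
rng = np.random.default_rng(0)
t = np.arange(100)
series = 0.5 * t + rng.normal(0, 2, size=t.size)

# Sample standard deviation (ddof=1) computed over the entire series.
sd = np.std(series, ddof=1)
print(f"standard deviation: {sd:.2f}")
```

Part of my doubt is that, in a series like this, the trend itself contributes to the computed standard deviation, so I am unsure whether the number still reflects "dispersion" in the usual sense.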