When you normalize the data, you are adjusting it to a common scale, which gives you a common basis for evaluating the norm and the outliers in the data set. In my research on Complex Cyber Threats I normalized the data before using Emergent Self-Organizing Maps (ESOM), after an initial review with Formal Concept Analysis. That way I knew I was using a standard measure each time I ran the data sets and could easily spot items outside the norm to investigate further.
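To make the idea of a common scale concrete, here is a minimal z-score sketch; the numbers are made up for illustration and are not from my data sets:

    import numpy as np

    # Made-up measurements on one feature, with a single extreme value
    values = np.array([2.0, 3.0, 2.5, 3.1, 2.8, 2.2, 2.9, 3.3, 2.6, 250.0])

    # z-score normalization: subtract the mean and divide by the standard deviation
    z = (values - values.mean()) / values.std()

    # On the common scale the unusual item is easy to spot; 2.5 is a rough
    # cut-off chosen only for this tiny example
    print(values[np.abs(z) > 2.5])   # -> [250.]

Once everything sits on the same scale, "outside the norm" has the same meaning for every feature and every run.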
Most data mining algorithms do not perform well on unscaled data. In KDD Cup 99, attributes such as duration, source bytes, and destination bytes show very high variation, and as a result the performance of the algorithms degrades. I have tested this and published a paper on it.
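As a rough sketch of that preprocessing step, here is min-max scaling applied to a few toy rows; the numbers are invented, and only the attribute names duration, src_bytes and dst_bytes come from KDD Cup 99:

    import numpy as np

    # Toy rows shaped like KDD Cup 99 records: duration, src_bytes, dst_bytes
    X = np.array([
        [0.0,    181.0, 5450.0],
        [2.0,    239.0,  486.0],
        [0.0,    235.0, 1337.0],
        [5057.0,   0.0,    0.0],   # one long-duration connection dwarfs the others
    ])

    # Min-max normalization: rescale each column to [0, 1]
    col_min = X.min(axis=0)
    col_max = X.max(axis=0)
    X_scaled = (X - col_min) / (col_max - col_min)

    print(X_scaled)

After scaling, a high-variance attribute like duration no longer dominates distance-based algorithms simply because of its raw magnitude.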
The URL of the paper is: http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=6779523.
If you require it, I will send you the full text. In this paper I explain the necessity of normalization on the KDD 99 dataset.