I think it depends on the MCDM method and the problem's characteristics. For example, the CoCoSo method uses Weitendorf's linear normalization, TOPSIS uses vector normalization, and the entropy weighting method uses logarithmic normalization.
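To make the distinction concrete, here is a minimal sketch of the three normalizations mentioned above, applied column-wise to a single (hypothetical) benefit criterion; the function names and sample values are my own illustration, not taken from any of the cited methods' papers:

```python
import numpy as np

def linear_minmax(col, benefit=True):
    """Weitendorf's linear (min-max) normalization, as used in CoCoSo."""
    lo, hi = col.min(), col.max()
    if benefit:
        return (col - lo) / (hi - lo)
    return (hi - col) / (hi - lo)

def vector_norm(col):
    """Vector normalization, as used in TOPSIS."""
    return col / np.sqrt((col ** 2).sum())

def log_norm(col):
    """Logarithmic normalization, as used with entropy weighting.
    Requires strictly positive values; the normalized column sums to 1."""
    return np.log(col) / np.log(col.prod())

x = np.array([10.0, 20.0, 40.0])  # hypothetical criterion values
print(linear_minmax(x))  # spans [0, 1]
print(vector_norm(x))    # unit Euclidean norm
print(log_norm(x))       # sums to 1
```

Note that each scheme preserves the ordering of the alternatives on a single criterion; the differences only matter once several normalized criteria are aggregated.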
I think the choice of normalization is a problem in its own right, while applying normalization is itself the solution to another problem: making criteria with different units commensurable. If the units differ, normalization is needed; but financial data, for example, is often unitless, and in that case a non-normalization methodology can be adopted. Moreover, outranking methods that are largely unaffected by normalization, such as FUCA and PROMETHEE, can also be tried. Whether one normalization is "better" is debatable, but we can make the discussion more objective with measurable criteria. Rank reversal, for example, is a distortion that normalization can cause: can we say that whichever normalization produces less rank reversal (RR) is better? I think that is worth investigating, though it is just an idea. Incidentally, I do not think we need to look for a harmony between existing methods and the normalization they adopt; they may even have been "married" by chance.
Mahmut Baydas Thank you for your answer. In future research, I will test several different normalizations on a single MCDM method and share my conclusions with you.
Duško Tešić You can refer to this article for a comparison among normalization procedures, i.e., the Min-Max method, the Maximum method, the Sum method, and the Vector method.
Article Efficiency of Methods for Determining the Relevance of Crite...