There are generally three approaches to "binning" (discretizing) a continuous variable for Bayesian analysis: 1) Frequency, 2) Quantile, and 3) Entropy. Frequency divides the variable's range into bins of equal width (equal amplitude span). Quantile creates bins holding an equal number of the variable's observations, so the binned variable is effectively uniform across the bins, and the bin widths differ from segment to segment. Entropy creates bin sizes based on (what I believe is) equal entropy, or information, in each bin. Shannon entropy is H = −Σ pᵢ log₂ pᵢ, i.e. the negative sum, over bins, of each bin's probability times the log base 2 of that probability.
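To illustrate the first two approaches (a minimal sketch using numpy on made-up skewed data; the variable names are my own): equal-width binning fixes the bin widths and lets the counts vary, while quantile binning fixes the counts and lets the widths vary.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=1000)  # skewed sample data

n_bins = 5

# Frequency / equal-width: same amplitude span per bin
width_edges = np.linspace(x.min(), x.max(), n_bins + 1)

# Quantile / equal-count: same number of observations per bin
quant_edges = np.quantile(x, np.linspace(0, 1, n_bins + 1))

width_counts, _ = np.histogram(x, bins=width_edges)
quant_counts, _ = np.histogram(x, bins=quant_edges)

print(width_counts)  # very uneven counts on skewed data
print(quant_counts)  # ~200 per bin, but the bin widths differ
```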
Therefore, I believe Quantile bins on equal probability per bin, while Entropy bins on equal −p log₂ p per bin. If that is true, how can the two methods produce different bin edges with the same number of partitions?
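To make the puzzle concrete (a small worked example, not an answer): if every one of k bins holds probability 1/k, then each bin contributes the identical term −(1/k) log₂(1/k), so equal-probability bins would seem to be equal-entropy bins automatically.

```python
import numpy as np

k = 5                       # number of partitions
p = np.full(k, 1.0 / k)     # equal probability in every bin
terms = -p * np.log2(p)     # per-bin Shannon entropy contributions

# Every bin contributes the same amount; the total is log2(k).
print(terms)
print(terms.sum())  # ≈ 2.32 for k = 5
```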
Teach me oh Sensei...