Information gain is the criterion used to select which variable to split on when building a classification decision tree.
When computing information gain, is the quantity of information the more important factor, or is the quality of the information obtained more important?
IMO, the quality of the information (which determines entropy, and in turn information gain) matters more than the quantity of input information, but I just wanted to check my understanding.
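For reference, this is how I understand the computation: entropy of the parent node minus the weighted entropy of the children. Below is a minimal Python sketch with a made-up toy split; the function names and data are mine, just for illustration.

```python
# Minimal sketch of entropy and information gain for a candidate split
# (toy example, not taken from any particular library).
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    total = len(labels)
    return -sum((count / total) * log2(count / total)
                for count in Counter(labels).values())

def information_gain(parent_labels, child_label_groups):
    """Parent entropy minus the size-weighted entropy of the child groups."""
    total = len(parent_labels)
    weighted_child_entropy = sum(
        (len(child) / total) * entropy(child) for child in child_label_groups
    )
    return entropy(parent_labels) - weighted_child_entropy

# Toy split: 10 examples, a candidate attribute separates them into two groups.
parent = ["yes"] * 5 + ["no"] * 5
left = ["yes", "yes", "yes", "yes", "no"]
right = ["yes", "no", "no", "no", "no"]
print(information_gain(parent, [left, right]))  # about 0.278 bits
```

As I understand it, the gain here depends entirely on how well the split separates the classes (the "quality"), while the number of examples only enters through the weighting, which is what prompted my question.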