AdaBoost uses iterations to train and build base classifiers that are combined to form the final output. How does the number of iterations relate to the number of base classifiers formed in the model?
On every iteration the AdaBoost algorithm trains a new base classifier, so the number of base classifiers equals the number of iterations.
Note that "base classifier" is sometimes also used to mean the underlying weak learner (e.g., a decision stump) that gets trained at each step. In that sense there is typically just one base classifier, no matter how many iterations you run.
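A quick way to see the one-classifier-per-iteration behaviour is with scikit-learn's AdaBoostClassifier, where `n_estimators` is the number of boosting iterations (the toy dataset here is just for illustration):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier

X, y = make_classification(n_samples=200, random_state=0)

clf = AdaBoostClassifier(n_estimators=25, random_state=0)  # 25 iterations
clf.fit(X, y)

# One fitted base classifier per iteration
print(len(clf.estimators_))  # -> 25
```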
The output of the 'weak learners' is combined into a weighted sum that represents the final output of the boosted classifier. AdaBoost is adaptive in the sense that subsequent weak learners are tweaked in favor of those instances misclassified by previous classifiers.
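To make the weighted sum and the reweighting of misclassified instances concrete, here is a rough sketch of discrete AdaBoost, assuming labels in {-1, +1} and decision stumps as the weak learner (function names are my own, not from any library):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=10):
    """Train one decision stump per boosting round; y must be in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)                    # instance weights, start uniform
    stumps, alphas = [], []
    for _ in range(n_rounds):                  # one new base classifier per iteration
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.clip(w[pred != y].sum(), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)  # weight of this classifier in the sum
        w *= np.exp(-alpha * y * pred)         # boost weights of misclassified points
        w /= w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(stumps, alphas, X):
    """Final output: sign of the weighted sum over all weak learners."""
    agg = sum(a * s.predict(X) for s, a in zip(stumps, alphas))
    return np.sign(agg)
```

Each round up-weights the points the previous stumps got wrong, which is exactly the "adaptive" part; the final prediction is just the sign of the alpha-weighted vote.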