When using Bayesian methods to estimate mixture models, such as latent class analysis (LCA) and growth mixture models (GMMs), researchers seem to use the same priors for the class-specific parameters in every class. Of course, this makes sense when using so-called "noninformative" priors, yet Monte Carlo studies often indicate that such priors yield biased and variable estimates, even with relatively large samples.
Confirmatory approaches to mixture modeling, which use (weakly) informative priors, perform much better than approaches based on "noninformative" priors. However, care must be taken when selecting the prior distributions (e.g., through careful elicitation from experts or from previous research findings).
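For context, what I have in mind by a confirmatory setup is something like the rough PyMC sketch below: a two-class Gaussian mixture with weakly informative priors centred on hypothesised class-specific means. All of the numbers (and the choice of PyMC itself) are purely illustrative.

import numpy as np
import pymc as pm

rng = np.random.default_rng(1)
# Simulated data: 80/20 class split around class means of roughly -1 and 2
y = np.concatenate([rng.normal(-1.0, 1.0, 160), rng.normal(2.0, 1.0, 40)])

with pm.Model() as confirmatory_model:
    w = pm.Dirichlet("w", a=np.ones(2))                    # class proportions
    # Class-specific priors centred on hypothesised values, equally informative
    # in both classes (label switching is ignored here for brevity)
    mu = pm.Normal("mu", mu=[-1.0, 2.0], sigma=1.0, shape=2)
    sigma = pm.HalfNormal("sigma", sigma=2.0, shape=2)
    pm.NormalMixture("y", w=w, mu=mu, sigma=sigma, observed=y)
    idata = pm.sample(1000, tune=1000, target_accept=0.9, random_seed=1)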
But consider a scenario in which the latent classes are unbalanced (e.g., a two-class model with .80/.20 class proportions). To my knowledge, most researchers use the same priors for the parameters in each class, regardless of differences in relative class size. Does anybody know of research in which the priors for class-specific parameters have been adjusted to equate their informativeness, depending on the (expected) number of observations in each class? I would be happy to hear of any research where such an approach has been used.
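To make the question concrete, here is one rough sketch of what such size-adjusted priors might look like, again in PyMC. The scaling rule (giving each class a prior worth the same fraction of that class's expected number of observations) and all of the numbers are my own assumptions for the sake of illustration, not something I have taken from the literature.

import numpy as np
import pymc as pm

rng = np.random.default_rng(1)
y = np.concatenate([rng.normal(-1.0, 1.0, 160), rng.normal(2.0, 1.0, 40)])

n = y.size
expected_props = np.array([0.80, 0.20])  # hypothesised class proportions
frac = 0.10                              # prior "worth" 10% of each class's expected data
resid_sd = 1.0                           # assumed within-class SD
# Prior SD for the mean of class k: resid_sd / sqrt(frac * n * pi_k),
# so the prior-to-expected-data information ratio is the same in both classes
mu_prior_sd = resid_sd / np.sqrt(frac * n * expected_props)

with pm.Model() as size_adjusted_model:
    w = pm.Dirichlet("w", a=frac * n * expected_props)    # mildly informative on proportions
    mu = pm.Normal("mu", mu=[-1.0, 2.0], sigma=mu_prior_sd, shape=2)
    sigma = pm.HalfNormal("sigma", sigma=2.0, shape=2)
    pm.NormalMixture("y", w=w, mu=mu, sigma=sigma, observed=y)
    idata = pm.sample(1000, tune=1000, target_accept=0.9, random_seed=1)

With an .80/.20 split and n = 200, this gives prior SDs of 0.25 and 0.50 for the means of the large and small class, respectively, so the smaller class gets a relatively wider prior rather than the same fixed scale in both classes.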