As far as I know, and as Yaroslav said, two studies are the minimum.
I have seen meta-analyses with "only" something like 4-5 studies, but those were almost identical replications of the same experiment, so heterogeneity was intentionally low and the goal was simply to pool the effect and increase power. It also depends on the field; see this investigation of the Cochrane database [1].
However, in most cases a meta-analysis of such a small number of studies might not make much sense. Besides estimating a weighted average of the reported effects, a meta-analysis is a tool for investigating other important things, such as between-study heterogeneity, the role of potential moderators, whether the effect differs across subgroups of studies, and the presence of bias in the literature. With a very small number of studies, you may lack the power to conduct these analyses.
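To make this concrete, here is a minimal sketch in plain Python/NumPy of the pooling step of a DerSimonian-Laird random-effects meta-analysis, using made-up effect sizes and variances for three hypothetical studies. Note that the between-study variance estimate rests on Cochran's Q with only k - 1 degrees of freedom, so with two or three studies the Q test has almost no power and the tau^2 estimate is extremely noisy, which is exactly why heterogeneity, moderator, and bias analyses become unreliable at that scale.

```python
import numpy as np

# Hypothetical effect sizes (e.g., standardized mean differences) and their
# within-study variances for k = 3 studies -- purely illustrative numbers.
y = np.array([0.30, 0.45, 0.10])   # observed effects
v = np.array([0.04, 0.06, 0.05])   # within-study variances

# Fixed-effect (inverse-variance) weights and pooled estimate
w = 1.0 / v
pooled_fe = np.sum(w * y) / np.sum(w)

# Cochran's Q and the DerSimonian-Laird estimate of between-study variance tau^2
q = np.sum(w * (y - pooled_fe) ** 2)
df = len(y) - 1
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (q - df) / c)

# Random-effects weights, pooled estimate, and its standard error
w_re = 1.0 / (v + tau2)
pooled_re = np.sum(w_re * y) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))

print(f"Q = {q:.2f} on {df} df, tau^2 = {tau2:.3f}")
print(f"Random-effects pooled estimate: {pooled_re:.3f} (SE {se_re:.3f})")
```

The pooled estimate itself is computable with any k >= 2, but everything beyond it (tau^2, subgroup contrasts, meta-regression, funnel-plot asymmetry tests) depends on the number of studies, not on the total sample size within them.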
--------------
[1] Davey, J., Turner, R. M., Clarke, M. J., & Higgins, J. P. (2011). Characteristics of meta-analyses and their component studies in the Cochrane Database of Systematic Reviews: a cross-sectional, descriptive analysis. BMC Medical Research Methodology, 11, 160. https://doi.org/10.1186/1471-2288-11-160
I would guess at least two studies are needed to summarize something, but there is no established minimum number. First, it is more important to make sure that you are not missing any studies on the subject. Second, you need to be sure that it is appropriate to combine the studies in a meta-analysis at all.