The main citation I found is Nelson (2017), "Computational Grounded Theory: A Methodological Framework," Sociological Methods & Research, 49(1), 3-42.
Here is the abstract:
This article proposes a three-step methodological framework called computational grounded theory, which combines expert human knowledge and hermeneutic skills with the processing power and pattern recognition of computers, producing a more methodologically rigorous but interpretive approach to content analysis. The first, pattern detection step, involves inductive computational exploration of text, using techniques such as unsupervised machine learning and word scores to help researchers to see novel patterns in their data. The second, pattern refinement step, returns to an interpretive engagement with the data through qualitative deep reading or further exploration of the data. The third, pattern confirmation step, assesses the inductively identified patterns using further computational and natural language processing techniques. The result is an efficient, rigorous, and fully reproducible computational grounded theory. This framework can be applied to any qualitative text as data, including transcribed speeches, interviews, open-ended survey data, or ethnographic field notes, and can address many potential research questions.
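To make the first step a bit more concrete: the following is not from the paper, just a minimal sketch of what the computational "pattern detection" step might look like in practice, assuming scikit-learn, an NMF topic model, and a toy corpus of my own invention. Nelson's actual techniques (her particular topic models and word-score measures) may well differ.

# Illustrative sketch only, not Nelson's code: unsupervised topic modeling
# over a toy corpus as one way to surface candidate patterns for later
# qualitative deep reading (step 2) and confirmation (step 3).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF

documents = [
    "the committee debated funding for public schools and teachers",
    "protesters marched downtown demanding higher wages for workers",
    "the school board approved a new curriculum for science teachers",
    "union leaders negotiated wages and benefits with factory owners",
]

# Convert raw text into a TF-IDF document-term matrix, dropping common stop words.
vectorizer = TfidfVectorizer(stop_words="english")
dtm = vectorizer.fit_transform(documents)

# Factorize into two latent "topics" -- candidate patterns, not final findings.
model = NMF(n_components=2, random_state=0)
doc_topic = model.fit_transform(dtm)

# Print the top words per topic so the researcher can inspect and interpret them.
terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(model.components_):
    top = [terms[i] for i in weights.argsort()[::-1][:5]]
    print(f"Topic {k}: {', '.join(top)}")

The point, as I read the abstract, is that output like this is only a starting point: the researcher then returns to the texts to interpret and refine the patterns, and finally tests them with further computational checks.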
Note that this approach uses procedures based on older versions of AI (i.e., pre-generative AI), but it could probably be updated without much trouble. It also has 600+ citations in Google Scholar, so you could follow those up to find out more.
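Purely as a hypothetical illustration of that kind of update (not anything Nelson proposes), the pattern-detection step could be handed to a large language model instead of a topic model. The query_llm function below is a placeholder, not a real API call.

# Hypothetical sketch: ask an LLM to propose candidate themes inductively.
def query_llm(prompt: str) -> str:
    """Placeholder: send `prompt` to whatever model API you use and return the reply."""
    raise NotImplementedError("Wire this up to an actual LLM client.")

def propose_themes(documents: list[str], n_themes: int = 5) -> str:
    """Build a single prompt from the corpus and request candidate themes."""
    corpus = "\n\n".join(documents)
    prompt = (
        f"Read the following texts and propose {n_themes} candidate themes, "
        "each with a short label and representative quotes:\n\n" + corpus
    )
    return query_llm(prompt)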
Personally, I don't see much here that has anything to do with Grounded Theory. Instead, it just seems like a generic version of inductive analysis.