Articles I have written mostly use data from other people. One possible exception was my calculation of lexical growth rates (lexical scaling, on arXiv and ResearchGate), which, outside of glottochronology, no one else seems to have much bothered with. To calculate lexical growth rates I used historical dictionaries of the English language. Collecting, adjudging, and organizing the words of the English lexicon is a data project far vaster in scope than merely taking word counts and calculating rates from them. It seems to me that in this era there are all kinds of data sources a person can use as a basis for theory. I performed no experiments (unless spreadsheet calculations and forming equations as theoretical experiments count); instead I found an abundance of existing data, and that data proved amenable to new theoretical investigation. So I wonder: if data in this computer age is increasing by vast amounts, can theory keep up? Will AI remedy that possible deficiency?
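For concreteness, here is a minimal sketch of one way such a rate can be computed, assuming exponential growth between two dictionary snapshots; the headword counts and dates below are hypothetical placeholders, not the figures from my actual calculation:

import math

# Hypothetical lexicon sizes from two historical dictionary snapshots.
# These numbers are illustrative placeholders, not measured values.
n_early = 40_000    # headword count, earlier dictionary (hypothetical, 1755)
n_late = 300_000    # headword count, later dictionary (hypothetical, 1900)
years = 1900 - 1755

# Under an exponential-growth assumption N(t) = N0 * exp(r * t),
# the average annual growth rate is r = ln(N2 / N1) / (t2 - t1).
r = math.log(n_late / n_early) / years
print(f"average lexical growth rate: {r:.4%} per year")

A continuously compounded rate is only one convention; a simple percentage change over the interval, or a fit across many snapshots, would be others.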
