In my current research I use corpus data, data elicited from native speakers, and lexicographic sources (e.g. definitions from dictionaries).

I work this way because corpora may not include tokens of certain constructions (they may be vanishingly rare), whereas speakers can judge possible examples of such constructions quickly and efficiently. I believe the two methods complement one another nicely.

I also use dictionaries (and grammars) to guide the design of possible constructions. For instance, I have been testing the polysemy of prepositions in Italian and French by verifying whether candidate definitions can be used to describe their senses in corpus data.
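To make the procedure a bit more concrete, here is a minimal sketch of the kind of sense-coverage check I have in mind. The sentences, sense labels, and glosses below are invented purely for illustration (they are not taken from my corpus or from any specific dictionary); the point is simply to show how candidate definitions can be matched against attested tokens and how uncovered tokens surface.

```python
# Hypothetical sketch: checking whether candidate dictionary senses
# "cover" attested corpus tokens of a preposition (Italian "su" here).
# All examples and sense labels are invented for illustration.

from collections import Counter

# Candidate senses, paraphrased as rough dictionary-style glosses.
CANDIDATE_SENSES = {
    "support": "location on the upper surface of something",
    "topic": "about, concerning a subject",
    "approximation": "roughly, around a quantity",
}

# Toy corpus tokens, each manually matched (or not) to a candidate sense.
# In practice this annotation step is done by the analyst or via elicitation.
toy_tokens = [
    ("Il libro è sul tavolo.", "support"),
    ("Un saggio su Dante.", "topic"),
    ("Costa sui dieci euro.", "approximation"),
    ("Contare su un amico.", None),  # no candidate definition fits
]

def coverage(tokens, senses):
    """Count how many tokens each candidate sense accounts for,
    and collect the tokens left uncovered by any definition."""
    counts = Counter(label for _, label in tokens if label in senses)
    uncovered = [sent for sent, label in tokens if label not in senses]
    return counts, uncovered

counts, uncovered = coverage(toy_tokens, CANDIDATE_SENSES)
for sense, n in counts.items():
    print(f"{sense}: {n} token(s) covered")
print("Uncovered tokens:", uncovered)
```

The uncovered tokens are exactly the cases where corpus and lexicographic evidence alone are not enough, and where elicited judgements become useful.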

I was thus wondering whether the joint use of these methods could be considered a form of methodological triangulation of the kind found in the social sciences.

I would like to thank in advance any colleagues who answer, and who have the patience to read through this not-so-clear description (!).

Francesco
