“Machine learning tells us nothing!!!!  It [human language] is too rich. … It [language] works as well with impossible languages as with regular languages. … [AI] is as useful as a bulldozer is useful. …  It tells you nothing about mind and thoughts. …  Computational complexity is what accounts for language. …  [AI] will never provide [an] explanation to language. …  Syntax gives language meaning…and syntax is totally independent of the external world [and is controlled by the brain].” (Chomsky, N., 2022, On Theories of Linguistics (Part 2), Dec. 30, 42.32 minutes, YouTube).

If Noam Chomsky is correct that his Universal Grammar, as controlled by the brain, is independent of the external world, then ChatGPT, which is based on vacuuming large amounts of information from the external world, will never be a suitable metaphor for human language. Furthermore, AI was not designed to provide meaning to its output; it is merely a transmitter of an output that underwent a previous reconfiguration (i.e., a factor analysis acting on its input). Therefore, the creativity attributed to AI, which comes at a high energy cost compared to that of the biological brain, will at best remain a tool in the hands of humans. This, of course, is good news, since humans need to be held responsible for all actions (good and bad) generated by AI (Harari 2024).
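The claim that a language model merely transmits a statistical reconfiguration of its input can be illustrated with a toy sketch (far simpler than a real transformer, and not a description of ChatGPT's actual architecture): a bigram model that "predicts" the next word purely from co-occurrence counts harvested from its training text. The corpus below is an invented example.

```python
from collections import Counter, defaultdict

# Toy training corpus (invented for illustration).
corpus = "the brain controls language and the brain shapes thought".split()

# Count bigram statistics: a crude "reconfiguration" of the input text.
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def predict(word):
    """Return the most frequent continuation seen in the corpus, or None."""
    if word not in bigrams:
        return None
    return bigrams[word].most_common(1)[0][0]

print(predict("the"))  # "brain": the model can only echo corpus statistics
```

The point of the sketch is that nothing in the model's output goes beyond a rearrangement of what was vacuumed in; whether this analogy extends to large language models is precisely the question under debate.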
