I'll start by repeating the title, above: what psychologists have not yet realized is that eye-tracking and related technologies ALLOW FOR AN _OVERALL_ MORE EMPIRICAL APPROACH!

The new technologies are not just tools for the kind of "empiricism" psychologists already practice!

I have described and formalized concrete, now-testable hypotheses that would establish the most empirical grounding yet for "abstract" concepts: more empirically grounded and founded than anything heretofore, without a doubt. The view/approach is biologically LIKELY, and this approach to research (on some new CONTENT it is suited for) has not yet been tried. It involves "believing" nothing (actually, believing MUCH less "on faith"); it really involves simply more empiricism and more direct observation. Specifically: discovering the DIRECTLY OBSERVABLE, OVERT behavioral foundations of the qualitatively different levels/stages of cognitive development, and HOW LEARNING ITSELF (presently often ill-defined) changes with this newly observable phenomenon, along with the consequences.

I have tried to clearly outline (ending with maximally empirical and testable hypotheses) the inception of abstract concepts with "perceptual shifts," thus providing those concepts a concrete, in-the-world foundation. Again, the theory has to do with "perceptual shifts" which are NOW, at this point in history, able to be SEEN with new technologies: seeing what subtle overt behaviors likely occur at the inception of each higher level of thinking during ontogeny. The outlook and approach is a cognitive-developmental theory, i.e. a theory of human child development, and a way of finding more of the major parts of who we all are.

You might well want to take a look:

The perspective and approach especially and specifically has to do with perception and, quickly/soon after that, with the attentional and gaze changes which likely occur at the inception of new qualitative cognitive developments (with ontogeny) and which literally, sensibly, set them off.
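To make clearer what this kind of direct observation could look like in practice, here is a minimal, purely illustrative sketch (not taken from the writings below) of how abrupt gaze/attentional shifts might be flagged in raw eye-tracking samples. The sampling rate, pixel-to-degree conversion, velocity threshold, and function name are all assumptions for the example, not parameters from the theory:

```python
# Purely illustrative sketch: flagging abrupt gaze/attentional shifts in
# eye-tracking samples via a simple velocity threshold. Sampling rate,
# pixel-to-degree conversion, and threshold are assumed example values.
import numpy as np

def detect_gaze_shifts(x_px, y_px, sample_hz=250, deg_per_px=0.03,
                       velocity_thresh_deg_s=100.0):
    """Return indices of samples where gaze velocity exceeds a
    saccade-like threshold (candidate overt attentional shifts)."""
    dx = np.diff(x_px) * deg_per_px          # horizontal step, degrees
    dy = np.diff(y_px) * deg_per_px          # vertical step, degrees
    velocity = np.hypot(dx, dy) * sample_hz  # degrees of visual angle / s
    return np.flatnonzero(velocity > velocity_thresh_deg_s)

# Synthetic demonstration: a steady fixation, then a jump to a new target.
rng = np.random.default_rng(0)
n = 500
x = 400.0 + rng.normal(0.0, 1.0, n)  # gaze x in pixels, small jitter
y = 300.0 + rng.normal(0.0, 1.0, n)  # gaze y in pixels
x[250:] += 200.0                     # abrupt relocation at sample 250
print(detect_gaze_shifts(x, y))      # expect an index near 249
```

Shift times flagged this way could then be time-locked to task events or to the child's developmental point: exactly the sort of directly observable, overt datum the hypotheses concern.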

The following theory, with its maximally empirical and testable hypotheses, indicates (clearly, with as good and as fully empirical "guesses" as are now possible) the nature of these perceptual/attentional shifts accompanying (actually: "starting off") major qualitative changes in cognition:

Here it is. At a minimum, read both of the major writings: not only the article

"A Human Ethogram: Its Scientific Acceptability and Importanc..."

BUT ALSO the much, much more recent book:

"Essentially, all Recent Essays on Ethogram Theory"

(These much later, recent essays fill in some aspects of the treatise not originally provided, as stated directly in "A Human Ethogram ..." itself.)

This theory does a LOT else correctly (unlike other theories): it abides by the necessarily applicable principles and seriously tries to offer a perspective and approach with ALL the features and dimensions a scientific theory should have. It is parsimonious. It uses the well-developed vocabulary of CLASSIC ethology (not the bastardized 'ethology' of today).

Psychologists may ignore this, but that would mean ignoring a maximally empirical way of study (and ignoring some of the most empirical, most testable hypotheses). In short, it is scientifically WRONGFUL for psychologists to ignore this.

P.S. Also: because all of this is so much more concrete, this theory of development and of changes in learning should be THE theory of greatest interest to those pursuing general artificial intelligence.
