I will give my answer:

Metacognition is like a view of the 'self' in the sense that the only evidence we have is that you absolutely need it for conventional social interactions (e.g. communication).  Otherwise you may well need neither, and much or most of the time you likely neither "have" nor use either.

It is highly likely that if you think there is always a guiding 'self', a guiding "metacognition", or operational 'higher' executive processes at work, you have a homunculus (an artificial, unreal person-within-the-person) on your hands, at least in some major instances.  Neither is any kind of necessary, foundational basis for a decent general theory of cognition.  (And, on a personal note, constantly and actively believing in or having either of these things is not good for you; it is maladaptive.  You may be able to get yourself into such a state, but in some circumstances you will be "messed up", and it would generally never be an improvement.)

If you think as many of my personal-belief-system persuasion do, then thinking about thinking comes RATHER THAN using all you've got to think about the subject matter itself, and that is OFTEN NOT adaptive (in short, because it is irrelevant and distracting).

If you cannot overcome these criticisms, you should abandon any central (or required) role for metacognition, executive processes, "mind-reading", "time travel", and the like.  And these criticisms are more than plausible.

Discovering the nature of innate guidance mechanisms is much better than positing a homunculus (note the word 'discovering' early in this sentence, NOT 'positing').  Worry not, cognitive scientists: my Project is here for you, "Human Ethology and Development".  If you would like to approach the problem I have attributed to the metacognition people from a personal direction/perspective, you could try the "Core Buddhism" Project.
