The fact that AI relies on humans to provide the data critical to its intelligence growth should make us more skeptical when speculating about singularity scenarios in which AI becomes superior to humans. To what degree do you think such skepticism is justified?