Assume that humans will be able to build a superintelligent machine that surpasses human intelligence in every respect, reaching the point of singularity, where this superintelligent machine recursively self-improves in an uncontrollable and irreversible way that might pose a threat to human civilization.
The opinion presented here is that AI singularity will pose no real threat to human civilization, for the following reasons:
Is intelligence alone enough to adapt to and survive in this world?
Human civilization is built on a long history of survival techniques: seeking shelter, securing food, developing financial systems to govern daily transactions, and building healthcare systems for human welfare.
Unlike humans, machines cannot survive on their own. Data is the fuel that feeds AI's intelligence; a machine therefore depends on human civilization to provide that data, or its knowledge will become outdated.
A lag in human civilization during this anticipated post-singularity phase would degrade the data available to AI and therefore delay its development.
According to this argument, the singularity is self-limiting: even if it happens, it is only a matter of time before machine intelligence becomes obsolete.
Based on the above argument, do you think AI singularity is a threat to human civilization?