Human consciousness could be the result of the brain's abstraction of the multitude of external signals. If so, how could such a condition be emulated in an AI?
While I am usually in agreement with Joachim, I will offer a qualified dissent from his statement:
" AI systems are not overwhelmed by the wealth of possible input channels that could be chosen in the present moment, as opposed to humans. Currently. "
While it is true that channel independence is a given in a computer system, it is not a given for data and information aggregation/fusion. At some point the different channels have to be joined in a system bus or storage system that is limited by a single access point. Also, while we have made strides in parallel processing, there is still a long road ahead, and while there is progress in pattern recognition, higher-order reasoning is still not there.
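To make the bottleneck concrete, here is a minimal, purely illustrative sketch (the channel names and timings are made up): several input channels run independently and in parallel, yet all of their readings must pass through one shared queue before fusion, i.e. a single access point that serialises the aggregation step.

```python
import queue
import threading
import time
import random

# Hypothetical setup: independent channels feed one shared "bus".
fusion_bus = queue.Queue()

def channel(name: str, n_samples: int) -> None:
    """Simulate one independent input channel producing readings."""
    for i in range(n_samples):
        time.sleep(random.uniform(0.001, 0.01))    # each channel runs on its own schedule
        fusion_bus.put((name, i, random.random())) # but must share a single access point

def fuse(n_expected: int) -> list:
    """Aggregate readings one at a time; the queue is the serial bottleneck."""
    fused = []
    for _ in range(n_expected):
        fused.append(fusion_bus.get())
    return fused

channels = [threading.Thread(target=channel, args=(f"ch{i}", 5)) for i in range(4)]
for t in channels:
    t.start()
merged = fuse(n_expected=4 * 5)
for t in channels:
    t.join()
print(f"fused {len(merged)} readings through a single access point")
```

However parallel the producers are, the fusion step still consumes readings one at a time, which is the point of the dissent above.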
My opinion is that the field of higher-order processing (metaprogramming) for AI needs to be initiated. This takes several forms:
Evaluating the processing of data as a metaprocess, which in ML is currently done manually (learning-parameter tweaking, data selection and preprocessing, as well as experimental setup design and implementation, etc.); a toy sketch of this idea follows the list below.
Mixed symbolic/subsymbolic systems that can run metaprogramming routines to change their behavior based on the processed data.
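The following sketch is only an illustration of the first form, with all names and numbers invented for the example: an inner loop does ordinary gradient-descent learning, and an outer meta-loop evaluates whole training runs and selects the learning rate, a step that in ML practice is usually tweaked by hand.

```python
import random

def train(learning_rate: float, steps: int = 100) -> float:
    """Inner process: fit w to minimise (w - 3)^2 and return the final loss."""
    w = 0.0
    for _ in range(steps):
        grad = 2 * (w - 3.0)
        w -= learning_rate * grad
    return (w - 3.0) ** 2

def meta_search(trials: int = 20) -> tuple:
    """Metaprocess: treat each training run as data and pick its parameters."""
    best_lr, best_loss = None, float("inf")
    for _ in range(trials):
        lr = 10 ** random.uniform(-4, 0)   # sample a candidate learning rate
        loss = train(lr)                   # the training run itself is evaluated
        if loss < best_loss:
            best_lr, best_loss = lr, loss
    return best_lr, best_loss

lr, loss = meta_search()
print(f"meta-selected learning rate {lr:.4f} with final loss {loss:.6f}")
```

The design choice is simply that the learning process is itself the object being processed; richer versions of the same idea would also choose the data, the preprocessing, and the experimental setup.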
This addresses raw data processing, not the actual higher-level thinking and abstraction that can be associated with consciousness, which requires other areas, such as NLP, to be developed further.
I recall Igor Aleksander unveiling what he announced to be a conscious machine to the computer science department in Manchester in 1996. He showed how it could dream...
Consciousness seems not to be algorithmic. For example, we perceive time, but in a computer time does not exist; there is just a clock that synchronizes the states of the system. Because of this, the states can be printed out, and we get a book that describes the evaluation of the algorithm. Of course, the algorithm does not read its own book. But we do.
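A small, purely illustrative sketch of that picture (the machine and its states are invented for the example): the computer has no experienced time, only a tick counter that advances the system from one state to the next, and the whole run can be dumped as a "book" of states that the algorithm itself never reads.

```python
def step(state: dict) -> dict:
    """Advance the system by one clock tick (a toy update rule stands in for the program)."""
    return {"tick": state["tick"] + 1, "value": (state["value"] * 2 + 1) % 97}

state = {"tick": 0, "value": 1}
book = [state]                    # the printed trace of every state
for _ in range(10):
    state = step(state)
    book.append(state)

for page in book:                 # we read the book; the algorithm does not
    print(f"tick {page['tick']:2d}: value = {page['value']}")
```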