Ilya Sutskever, chief scientist of the artificial intelligence research company OpenAI, posted on his Twitter account on Thursday that "it may be that today's large neural networks are slightly conscious".

Sutskever did not specify which intelligent systems he was referring to, but he likely meant GPT-3 (Generative Pre-trained Transformer 3), an autoregressive language model that OpenAI released in 2020 to generate human-like text using deep learning techniques.

Sutskever’s tweet quickly sparked a debate among experts in the field, and most were sceptical about the idea.

Artificial intelligence systems researcher Joscha Bach asked Sutskever what functionality he associates with consciousness. In addition, he wondered whether consciousness is “a matter of continuous degree, a roughly binary property or a set of discrete capabilities that allow for distinct degrees”.

Thomas G. Dietterich, Distinguished Professor at Oregon State University, remarked that "if consciousness is the ability to reflect and model itself", he has not seen that ability in current neural networks, suggesting that the chief scientist was just joking. Valentino Zocca, an expert in deep learning technology, agreed with Dietterich, saying that AI "is not conscious, but apparently hype is more important than anything else".

Meanwhile, Toby Walsh, professor of artificial intelligence at the University of New South Wales in Sydney, said that “every time such speculative commentary is aired it takes months of effort to bring the conversation back to the more realistic opportunities and threats posed by AI”.

In September 2021, OpenAI cut off a developer's access to GPT-3, claiming his chatbot violated the rules for using its system, after a user employed it to virtually "resurrect" his deceased fiancée.
