How the definition of strong AI has changed over time, and how this may be creating a fertile ground for an AI winter
Most people understand strong AI as a term interchangeable with Artificial General Intelligence (AGI). That is, strong AI is today generally used to refer to machines that are as intelligent as humans. Historically, however, the term strong AI was coined to emphasize something else: the idea that intelligence, as achieved by biological systems, is fundamentally different from the approaches used by today's machines. In its original version, the definition of strong AI presumed that dogs and bees were also strongly intelligent, and that machines, in contrast, were not designed in a way that would enable them to match up to biology. Machines were thus presumed to be weakly intelligent.

Over time, the definition of strong AI has changed, and with it, the arguments about the principal limitations inherent in today's AI have been forgotten. Today's definition of strong AI presumes that there are no problems with the existing technology, and that no fundamental obstacle stands in its way toward AGI. In contrast, I will argue that this ignorance of known problems has been the main reason we experienced AI winters in the past. And if we continue to ignore the message originally meant to be conveyed through the concept of strong AI, we may be heading toward an AI winter all over again.