In 1950, Alan Turing, revered as the "father of artificial intelligence", proposed the famous "Turing test". Turing offered an operational definition of machine "intelligence": if a machine can engage in dialogue with humans without being identified as a machine, it can be said to possess intelligence.
In the summer of 1956, Dartmouth College in the United States hosted the first ever workshop on artificial intelligence, at which John McCarthy proposed the term "artificial intelligence"; the field is widely considered to have been born that year. McCarthy held that every aspect of learning, or any other feature of intelligence, can in principle be so precisely described that a machine can be made to simulate it.
However, due to various factors, "artificial intelligence" remained largely at the theoretical level for a long time. Indeed, in the decades following that conference, the field experienced several "cold winters" that left countless researchers disillusioned.
In the 1960s, some advocates of artificial intelligence remained optimistic and encouraged government agencies in the United States and Britain to pour funds "recklessly" into research. Marvin Minsky, one of the founding figures of the field, even claimed that "within a generation, the problem of creating 'artificial intelligence' will be substantially solved".
Soon after, however, government funding began to dry up, and more and more people came to believe that AI research had made no substantial progress beyond the hype. The 1970s thus ushered in an unprecedented "cold winter" for the development of artificial intelligence.
In the early 1980s, researchers in symbolic AI brought about a new "boom" in artificial intelligence: expert systems focused on specific disciplines won recognition from many legal and medical experts. But in the late 1980s, research on symbolic artificial intelligence once again fell into a slump. These systems required specialized hardware whose enormous costs were difficult to justify against their returns, leaving them at a growing disadvantage compared with increasingly capable ordinary desktop computers.
In this way, the development of artificial intelligence has been marked by dramatic swings, yet thanks to its steadfast supporters it never truly disappeared. By the end of the last century, with the large-scale development of computers, chips, software, the internet, and other fields, artificial intelligence was awakened from its long slumber into a new "spring".