The idea of “a machine that thinks” dates back to ancient Greece. But since the advent of electronic computing (and relative to some of the topics discussed in this article), important events and milestones in the evolution of AI include the following:
1950
Alan Turing publishes Computing Machinery and Intelligence (link resides outside ibm.com). In this paper, Turing, famous for breaking the German ENIGMA code during WWII and often referred to as the “father of computer science,” asks the following question: “Can machines think?”
From there, he offers a test, now famously known as the “Turing Test,” where a human interrogator would try to distinguish between a computer's and a human's text responses. While this test has undergone much scrutiny since it was published, it remains an important part of the history of AI, and an ongoing concept within philosophy, as it uses ideas around linguistics.
1956
John McCarthy coins the term “artificial intelligence” at the first-ever AI conference at Dartmouth College. (McCarthy went on to invent the Lisp language.) Later that year, Allen Newell, J.C. Shaw and Herbert Simon create the Logic Theorist, the first-ever running AI computer program.
1958
Frank Rosenblatt builds the Mark 1 Perceptron, the first computer based on a neural network that “learned” through trial and error. Just over a decade later, in 1969, Marvin Minsky and Seymour Papert publish a book titled Perceptrons, which becomes both the landmark work on neural networks and, at least for a while, an argument against future neural network research initiatives.
1980s
Neural networks, which use a backpropagation algorithm to train themselves, become widely used in AI applications.
1995
Stuart Russell and Peter Norvig publish Artificial Intelligence: A Modern Approach (link resides outside ibm.com), which becomes one of the leading textbooks in the study of AI. In it, they delve into four potential goals or definitions of AI, which differentiate computer systems based on rationality and on thinking versus acting.
1997
IBM's Deep Blue beats then world chess champion Garry Kasparov in a chess match (and rematch).
2004
John McCarthy writes a paper, What Is Artificial Intelligence? (link resides outside ibm.com), and proposes an often-cited definition of AI. By this time, the era of big data and cloud computing is underway, enabling organizations to manage ever-larger data estates, which will one day be used to train AI models.
2011
IBM Watson® beats champions Ken Jennings and Brad Rutter at Jeopardy! Also, around this time, data science begins to emerge as a popular discipline.
2015
Baidu’s Minwa supercomputer uses a special deep neural network called a convolutional neural network to identify and categorize images with a higher rate of accuracy than the average human.
2016
DeepMind's AlphaGo program, powered by a deep neural network, beats Lee Sedol, the world champion Go player, in a five-game match. The victory is significant given the huge number of possible moves as the game progresses (over 14.5 trillion after just four moves). Google purchased DeepMind in 2014 for a reported USD 400 million.
2022
A rise in large language models, or LLMs, such as OpenAI’s ChatGPT, creates an enormous change in the performance of AI and its potential to drive enterprise value. With these new generative AI practices, deep-learning models can be pretrained on large amounts of data.
2024
The latest AI trends point to a continuing AI renaissance. Multimodal models that can take multiple types of data as input are providing richer, more robust experiences. These models bring together computer vision image recognition and NLP speech recognition capabilities. Smaller models are also making strides in an age of diminishing returns from massive models with large parameter counts.