Modern-day AI is the culmination of ideas from visionaries spread over centuries. The year 2021 is especially significant in this regard, as it marks the 375th birth anniversary of Gottfried Wilhelm Leibniz, the 90th anniversary of Kurt Gödel’s groundbreaking 1931 paper, and the 80th anniversary of Konrad Zuse’s seminal work. These works laid the foundations for modern-day AI and its algorithms. The significance of this year was first brought to light by Prof. Juergen Schmidhuber, who has himself been responsible for many groundbreaking works in the field of AI.
Leibniz: World’s First Computer Scientist?
Often described as the world’s first computer scientist, Leibniz had a great impact on the field of computing. According to Prof. Schmidhuber, many aspects of “modern” computer science can be traced back to Leibniz. Leibniz’s step reckoner was the first machine that could add, multiply and memorise numbers. In 1679, writes Prof. Schmidhuber, Leibniz described the very principles of a binary computer: “His description describes precisely how electronic computers function. Gravity and movement of marbles are replaced by electrical circuits, but the principle functions in the same way.”
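The binary principle Leibniz described, representing numbers with only the symbols 0 and 1 and adding them by simple carry rules, can be sketched in a few lines. This is a hypothetical illustration of the idea, not Leibniz’s own notation:

```python
# A minimal sketch of binary addition using only the two symbols
# Leibniz worked with (0 and 1). Illustrative only; not Leibniz's
# original scheme.

def binary_add(a: str, b: str) -> str:
    """Add two binary strings digit by digit with a carry bit."""
    # Pad both numbers to the same length.
    a, b = a.zfill(len(b)), b.zfill(len(a))
    result, carry = [], 0
    # Walk from the least-significant digit to the most-significant.
    for da, db in zip(reversed(a), reversed(b)):
        total = int(da) + int(db) + carry
        result.append(str(total % 2))  # digit written down
        carry = total // 2             # digit carried over
    if carry:
        result.append("1")
    return "".join(reversed(result))

print(binary_add("101", "11"))  # 5 + 3 = 8, i.e. "1000"
```

The same carry mechanics, realised with relays or transistors instead of pen and paper, is exactly the principle the quote above refers to.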
Leibniz was also one of the first thinkers who tried to understand the fundamental concepts behind human thought and language. In a way, Leibniz laid the foundations for modern-day natural language processing. In one of his essays, Leibniz opined that languages mirror the human mind, and that by analysing the significance of words, one can gain more insight into the foundations of understanding. The earliest works on symbolic representation of human cognitive processes and human reasoning can be traced back to Leibniz. The intricacies of human reasoning still elude current AI researchers as the pursuit of artificial general intelligence continues.
Talking about Leibniz’s contributions, computer scientist Stephen Wolfram, in one of his talks on AI ethics, said that traces of many of his ideas can still be found in theorem-proving systems and modern machine learning. “It’s humbling to see how many concepts it effectively forms — that we haven’t yet absorbed in our culture,” mulled Wolfram.
Kurt Gödel’s Breakthrough
Kurt Gödel was known for his work on the fundamental limits of theorem proving, computing, AI, logic, and mathematics itself, which had an enormous impact on the science and philosophy of the 20th century. Gödel’s incompleteness result is widely regarded as the most remarkable achievement of 20th-century mathematics. The significance of Gödel’s theorems has only grown with time: from being largely overlooked by mathematicians to being recognised as relevant to many computing endeavours, Gödel’s work has had a great impact on theoretical computer science.
Gödel’s work on universal formal languages and the limits of proof and computation laid the foundations of theoretical computer science. Gödel’s incompleteness theorems are results in mathematical logic that demonstrate the inherent limitations of every formal axiomatic system capable of modelling basic arithmetic. In proving them, Gödel also introduced the first universal coding language, foreshadowing the idea of a universal computer that can perform any computation.
Wolfram also wrote in 2006: “In the seventy-five years since then, what became known as Gödel’s theorem has been ascribed almost mystical significance, sowing the seeds for the computer revolution.” Back in 2003, Prof. Schmidhuber even made an attempt to bring superintelligent machines closer to reality by proposing a Gödel machine, which can rewrite its own code as soon as it has found proof that the rewrite is useful. “Given the tremendous impact of Gödel’s results on AI theory, it does make sense to date AI’s beginnings back to his 1931 publication 75 years ago,” wrote Prof. Schmidhuber in a 2006 paper.
Konrad Zuse’s Breakthrough
In 1941, Konrad Zuse completed the world’s first practical, working, programmable, general-purpose computer, the Z3, which was destroyed in a WW2 aerial bombing, though Zuse’s ideas survived. The Z3 used 2,300 relays, performed floating-point binary arithmetic, and had a 22-bit word length. “The physical hardware of Z3 was indeed universal in the ‘modern’ sense of Gödel, Church, Turing, and Post — simple arithmetic tricks can compensate for Z3’s lack of an explicit conditional jump instruction,” explained Prof. Schmidhuber.
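The kind of “arithmetic trick” alluded to above can be illustrated by selecting between two values using only multiplication and addition, with no conditional jump at all. This is a hypothetical sketch of the principle, not the Z3’s actual instruction set:

```python
# Branchless selection: both "branches" are always computed, and
# plain arithmetic picks the result. A hypothetical illustration of
# how arithmetic can substitute for a conditional jump; not Z3 code.

def branchless_select(flag: int, if_true: float, if_false: float) -> float:
    """flag must be 0 or 1; returns if_true when flag == 1, else if_false."""
    return flag * if_true + (1 - flag) * if_false

print(branchless_select(1, 3.5, 2.0))  # 3.5
print(branchless_select(0, 3.5, 2.0))  # 2.0
```

Because every instruction executes regardless of the data, a machine with no conditional jump can still realise data-dependent behaviour, which is what makes the Z3 universal in the modern sense.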
The last century or so has witnessed tremendous innovation in the field of mathematics. New theories have been postulated, and traditional theorems have been immortalised through the advancement of fields such as computer science. We are still reaping the benefits of the exhaustive endeavours of the pioneers who unknowingly enabled us to build the intelligent machines of the future. “It seems incredible that within less than a century, something that once lived only in the minds of titans has become something so inalienable from modern society. The world owes these scientists a great debt,” writes Prof. Schmidhuber.