Lisp and Prolog for Artificial Intelligence? [closed]


The question has already been answered for Lisp, so I'll just comment on Prolog.

Prolog was designed for two things: natural language processing and logical reasoning. In the GOFAI paradigm of the early 1970s, when Prolog was invented, this meant:

  1. constructing symbolic grammars for natural language that would be used to construct logical representations of sentences/utterances;
  2. using these representations and logical axioms (not necessarily those of classical logic) to infer new facts (see the sketch after this list);
  3. using similar grammars to translate the logical representations back into language.
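
To make step 2 concrete, here is a minimal sketch of that kind of unification-driven inference. Since Prolog-style engines also exist embedded in Lisps, it is written with Clojure's core.logic; the `parento`/`grandparento` relations and the family facts are invented for the illustration, and the equivalent Prolog program is just a few clauses:

```clojure
(require '[clojure.core.logic :as l])

;; Invented facts: tom is a parent of bob, bob is a parent of ann.
(defn parento [p c]
  (l/conde
    [(l/== [p c] '[tom bob])]
    [(l/== [p c] '[bob ann])]))

;; An axiom that lets the engine infer a new fact by unification:
;; g is a grandparent of c if some p links them.
(defn grandparento [g c]
  (l/fresh [p]
    (parento g p)
    (parento p c)))

(l/run* [q] (grandparento 'tom q))
;; => (ann)
```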

Prolog is very good at this and is used aboard the ISS for exactly such a task. The approach got discredited, though, because

  1. "all grammars leak": no grammar can catch all the rules and exceptions in a language;
  2. the more detailed the grammar, the higher the complexity (both big O and practical) of parsing;
  3. logical reasoning is both inadequate and unnecessary for many practical tasks;
  4. statistical approaches to NLP, i.e. "word counting", have proven much more robust. With the rise of the Internet, adequate datasets have become available to give NLP developers the statistics they need. At the same time, memory and disk costs have declined while processing power is still relatively expensive.

Only recently have NLP researchers developed somewhat practical combined symbolic-statistical approaches, sometimes using Prolog. The rest of the world uses Java, C++ or Python, for which you can more easily find libraries, tools and non-PhD programmers. The fact that I/O and arithmetic are unwieldy in Prolog doesn't help its acceptance.

Prolog is now mostly confined to domain-specific applications involving NLP and constraint reasoning, where it does seem to fare quite well. Still, few software companies will advertise with "built on Prolog technology" since the language got a bad name for not living up to the promise of "making AI easy."

(I'd like to add that I'm a great fan of Prolog, but even I only use it for prototyping.)


Can't really speak to Prolog, but here's why Lisp:

  • Lisp is a homoiconic language, which means that code is expressed in the same form (s-expressions) as data structures in the language, i.e. "code is data". This has big advantages if you are writing code that modifies/manipulates other code, e.g. genetic algorithms or symbolic manipulation (see the sketch after this list).

  • Lisp's macro system makes it well suited for defining problem-specific DSLs. Most Lisp developers effectively "extend the language" to do what they need. Again the fact that Lisp is homoiconic helps enormously here.

  • There is some historical connection, in that Lisp became popular at about the same time as a lot of the early AI research. Some interesting facts in this thread.

  • Lisp works pretty well as a functional programming language. This is quite a good domain fit for AI (where you are often just trying to get the machine to learn how to produce the correct output for a given input).

  • Subjective view: Lisp seems to appeal to people with a mathematical mindset, which happens to be exactly what you need for a lot of modern AI. This is possibly because Lisp is pretty closely related to the untyped lambda calculus.
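
To ground the first two bullets, here is a tiny, hypothetical Clojure sketch: first a quoted expression rewritten with ordinary list functions (code as data), then a one-line macro of the kind used to grow DSLs. The `unless` macro is the classic textbook example and is not part of clojure.core:

```clojure
;; Code is data: a quoted expression is just a nested list
;; that ordinary list functions can inspect and rewrite.
(def rule '(if (> temp 100) :alarm :ok))

(first rule)   ;; => if
(last rule)    ;; => :ok

;; Rewrite the code as data: swap the two branches.
(concat (take 2 rule) [(last rule) (nth rule 2)])
;; => (if (> temp 100) :ok :alarm)

;; Macros take this further: new syntax, expanded at compile time.
(defmacro unless [test then]
  `(if ~test nil ~then))

(unless (> 3 4) :fire-missiles)  ;; => :fire-missiles
```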

I'm doing some AI/machine learning work at the moment, and chose Clojure (a modern Lisp on the JVM) pretty much for the above reasons.


Lisp had an advantage when we believed AI was symbol manipulation and things like ontologies. Prolog had an advantage when we believed AI was logic, and unification was the tricky operation. But neither of these provides any advantage for any of the current contenders for "AI": statistical AI is about sparse arrays; neural networks of all kinds, including deep learning, are about oceans of nodes connected with links; model-free methods (many kinds of machine learning, evolutionary methods, etc.) are also very simple. The complexity is emergent, so you don't have to worry about it: write a simple base that can learn what it needs to learn (a toy example follows below). In any of these cases, any general-purpose language will do. Arguments can even be made that most neural network approaches are so simple that C++ would be overkill.
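
For a sense of how simple such a base can be, here is a toy Clojure sketch: a single perceptron learning logical AND with the classic update rule. Everything here (names, data, learning rate) is invented for the example:

```clojure
(defn step [x] (if (pos? x) 1 0))

;; One neuron: weighted sum of two inputs plus a bias, then a threshold.
(defn predict [[w1 w2 b] [x1 x2]]
  (step (+ (* w1 x1) (* w2 x2) b)))

;; Classic perceptron rule: nudge each weight by (rate * error * input).
(defn train-once [[w1 w2 b :as weights] [[x1 x2 :as inputs] target] rate]
  (let [err (- target (predict weights inputs))]
    [(+ w1 (* rate err x1))
     (+ w2 (* rate err x2))
     (+ b  (* rate err))]))

;; Truth table for AND.
(def data [[[0 0] 0] [[0 1] 0] [[1 0] 0] [[1 1] 1]])

;; 25 passes over the data are plenty for this toy problem.
(def trained
  (reduce #(train-once %1 %2 0.1)
          [0.0 0.0 0.0]
          (apply concat (repeat 25 data))))

(map #(predict trained (first %)) data)  ;; => (0 0 0 1)
```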

Use the language that allows you to most easily hire the best programmers for the task.


There have been some good and informative responses here, but the point of Lisp and Prolog has either been missed, marginalized, or not emphasized enough.

Lisp and then later Prolog emerged in an era when the main AI research revolved around symbolic processing. A simple example of symbolic processing is how we humans do algebra, calculus, or integrals by hand. We symbolically manipulate the variables and constants to derive equivalent relationships. Lisp and Prolog were designed for this purpose.
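
As a small, hypothetical sketch of what that looks like in code, here is a toy symbolic differentiator in Clojure; it handles only + and *, does no simplification, and the `deriv` function is invented for this illustration:

```clojure
;; Differentiate an s-expression with respect to variable v,
;; using the sum and product rules.
(defn deriv [expr v]
  (cond
    (number? expr) 0
    (symbol? expr) (if (= expr v) 1 0)
    :else (let [[op a b] expr]
            (case op
              + (list '+ (deriv a v) (deriv b v))
              * (list '+ (list '* (deriv a v) b)
                         (list '* a (deriv b v)))))))

(deriv '(+ x (* 3 x)) 'x)
;; => (+ 1 (+ (* 0 x) (* 3 1)))   ; i.e. 1 + 3 before simplification
```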

Symbolic manipulation is not trivially implemented in C++ or Java, for they were not designed with this purpose in mind. However, C++, Java, and similar languages may be the buzzword languages in AI nowadays because there now exist several variations of AI research that do not deal with symbolic processing.

One form of AI uses statistical methods as the basis of knowledge, and this requires much leaner languages to reduce computation time. Also, many so-called AI systems are nothing more than specialized systems serving a particular niche purpose. Of course these systems may be best programmed in a non-Lisp/Prolog language, relying less on 'reasoning' or common-sense knowledge acquisition and more on processing data from inputs.

Even Watson (which is programmed in Java, C++, and a little Prolog) is arguably a highly specialized system. It appears Watson was designed to acquire a vast number of facts, which it then sorts through using sophisticated search algorithms (not sure though, and IBM would probably resent me for saying that). Future AI implementations will likely combine AI paradigms and implement various languages for each specialized part. Even Lisp and Prolog may one day make a comeback.