In Prolog, problems are solved using backtracking. It's a declarative paradigm rather than an imperative one (as in C, PHP, or Python). In this kind of language, is it worth thinking in terms of complexity?
The natural way of solving problems in it often seems to be O(N^2), as someone pointed out in this question.
A Prolog program consists of a database of facts and rules. There is no structure imposed on a Prolog program: there is no main procedure, and there is no nesting of definitions. All facts and rules are global in scope, and the scope of a variable is the fact or rule in which it appears.
There are only three basic constructs in Prolog: facts, rules, and queries. A collection of facts and rules is called a knowledge base (or a database) and Prolog programming is all about writing knowledge bases.
Prolog clauses are normally of three types: facts, which declare things that are always true; rules, which declare things that are true depending on a given condition; and questions (queries), used to find out whether a particular goal is true.
A clause in Prolog is a unit of information in a Prolog program, ending with a full stop ("."). A clause may be a fact, like: likes(mary, pizza). or food(pizza).
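To make the three clause types concrete, here is a minimal sketch of a knowledge base (the predicate names are invented for illustration):

```prolog
% Facts: unconditionally true.
likes(mary, pizza).
food(pizza).
food(salad).

% Rule: mary eats X if X is a food and she likes it.
eats(mary, X) :- food(X), likes(mary, X).
```

A query is then posed at the interpreter's `?-` prompt, e.g. `?- eats(mary, pizza).`, and Prolog answers by searching the knowledge base with backtracking.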
You can definitely analyze the complexity of Prolog programs, just like in any other language. The particular problem you linked might be O(n^2), but not all Prolog programs have that complexity. For example, you can easily write a SAT solver in Prolog, and that problem is NP-complete.
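A classic illustration that complexity in Prolog depends on how you write the program, not on the language, is list reversal: the naive version is quadratic because `append/3` is linear in its first argument, while an accumulator version is linear. A sketch:

```prolog
% Naive reverse: O(n^2), because append/3 re-walks
% the partial result on every recursive step.
naive_reverse([], []).
naive_reverse([H|T], R) :-
    naive_reverse(T, RT),
    append(RT, [H], R).

% Accumulator reverse: O(n), a single pass that
% conses each head onto the accumulator.
acc_reverse(L, R) :- acc_reverse(L, [], R).
acc_reverse([], Acc, Acc).
acc_reverse([H|T], Acc, R) :- acc_reverse(T, [H|Acc], R).
```

Both define the same relation, so analyzing the number of inference steps (as you would count operations in an imperative language) is exactly how complexity is reasoned about in Prolog.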