I am not familiar with logic programming, but I would like to know whether the following setting has been studied and whether it corresponds to a known system in logic programming.

- I work with first-order resolution, where clauses have the form $c = \{p_1(t_1), …, p_k(t_k), \lnot p_{k+1}(t_{k+1}), …, \lnot p_n(t_n)\}$: a clause is a disjunction of positive and negative literals, i.e. first-order atoms such as $p(f(x, y))$, possibly negated.
- Given a program $P = \{c_1, …, c_n\}$, using Robinson's resolution between two clauses $c_i$ and $c_j$, we would like to compute all the atoms that can be inferred from $P$. Depending on how we connect the atoms we can obtain different sets of inferred atoms, and we would like to enumerate all such sets.
- We would like all these connections to be *maximal*, in the sense that we connect atoms of $P$ until no more atoms of $P$ can be added. Such a maximal connection should represent a "full computation".

For instance, let $$P = \{add(0,y,y)\} \quad \{\lnot add(s(x),y,s(z)), add(x,y,z)\} \quad \{add(s^2(0),s^2(0),s^4(0))\}$$ be a program, where $s^n(0)$ denotes $n$ applications of the unary function symbol $s$ to $0$. If the clauses are labelled $c_1, c_2, c_3$, the only way to construct such a "maximal connection" is $c_1 - c_2^m - c_3$ for some $m$, but only one of them is correct: $c_1 - c_2^2 - c_3$, corresponding to checking that $2+2=4$.
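To make my reading of the example concrete, here is a minimal Python sketch (all names, `num`, `step`, `chain`, are mine). It reads the chain from the $c_3$ end: each link resolves the current unit clause against a fresh copy of $c_2$ (unifying it with $c_2$'s negative literal and keeping the instantiated positive literal), and the final link checks that the resulting unit is an instance of $c_1$ — since $c_1$ and that unit are both positive, I am assuming the last "connection" is a matching step rather than a resolution step.

```python
# Terms: variables are plain strings ('x', 'y0'); constants and compound
# terms are tuples, e.g. ('0',) and ('s', ('0',)).

def is_var(t):
    return isinstance(t, str)

def walk(t, s):
    # Follow variable bindings in substitution s.
    while is_var(t) and t in s:
        t = s[t]
    return t

def unify(a, b, s):
    # Syntactic unification (no occurs check, enough for this example).
    a, b = walk(a, s), walk(b, s)
    if a == b:
        return s
    if is_var(a):
        return {**s, a: b}
    if is_var(b):
        return {**s, b: a}
    if a[0] == b[0] and len(a) == len(b):
        for x, y in zip(a[1:], b[1:]):
            s = unify(x, y, s)
            if s is None:
                return None
        return s
    return None

def subst(t, s):
    # Apply substitution s to term t.
    t = walk(t, s)
    if is_var(t):
        return t
    return (t[0],) + tuple(subst(a, s) for a in t[1:])

def num(n):
    # Build s^n(0).
    t = ('0',)
    for _ in range(n):
        t = ('s', t)
    return t

def step(atom, i):
    # Resolve the unit {atom} against a fresh copy of
    # c2 = {~add(s(x),y,s(z)), add(x,y,z)}: unify atom with the negative
    # literal's atom, return the instantiated positive literal (or None).
    x, y, z = f'x{i}', f'y{i}', f'z{i}'
    s = unify(('add', ('s', x), y, ('s', z)), atom, {})
    if s is None:
        return None
    return subst(('add', x, y, z), s)

def chain(m):
    # Try the connection c3 - c2^m - c1.
    atom = ('add', num(2), num(2), num(4))            # c3
    for i in range(m):
        atom = step(atom, i)
        if atom is None:
            return False
    return unify(('add', ('0',), 'y', 'y'), atom, {}) is not None  # c1

print([m for m in range(5) if chain(m)])  # only m = 2 survives
```

Under this reading, $m < 2$ leaves a unit that does not match $c_1$, and $m > 2$ fails already at the resolution step, so $m = 2$ is indeed the unique full computation.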

To give more context: I work in another field, on a system that at first had no connection to logic programming but later turned out to have strong similarities with it (for instance with answer sets), so I would like to relate it to known concepts in logic programming.