Termination of logic programs depends critically on the selection rule, i.e. the rule that determines
which atom is selected in each resolution step. In this article, we classify programs
(and queries) according to the selection rules for which they terminate. This is a survey and
unified view of different approaches in the literature. For each class, we present a sufficient,
and for most classes also necessary, criterion for determining whether a program is in that class.
We study six classes: a program strongly terminates if it terminates for all selection rules; a
program input terminates if it terminates for selection rules that select only atoms that are
sufficiently instantiated in their input positions, so that these arguments are not instantiated
any further by unification; a program local delay terminates if it terminates for local
any further by the unification; a program local delay terminates if it terminates for local
selection rules which only select atoms that are bounded w.r.t. an appropriate level mapping;
a program left-terminates if it terminates for the usual left-to-right selection rule; a program
∃-terminates if there exists a selection rule for which it terminates; finally, a program
has bounded nondeterminism if it has only finitely many refutations. We propose a semantics-preserving
transformation from programs with bounded nondeterminism into strongly terminating programs.
Moreover, by unifying different formalisms and making appropriate assumptions, we
are able to establish a formal hierarchy between the different classes.