What seem to be Kurt Gödel’s first notes on logic, an exercise notebook of 84 pages, contain formal proofs in higher-order arithmetic and set theory. The choice of these topics is clearly suggested by their inclusion in Hilbert and Ackermann’s logic book of 1928, the Grundzüge der theoretischen Logik. Such proofs are notoriously hard to construct within axiomatic logic. Without further ado, Gödel puts into use a linear system of natural deduction for the full language of higher-order logic, with formal derivations closer to one hundred steps in length and up to four nested temporary assumptions, their scope indicated by intermittent vertical lines.
Frege explained the notion of generality by stating that each of its instances is a fact, and added only later the crucial observation that a generality can be inferred from an arbitrary instance. The reception of Frege’s quantifiers was a fifty-year struggle over a conceptual priority: truth or provability. With the former as the basic notion, generality had to be faced as an infinite collection of facts, whereas with the latter, generality was based on a uniformity with a finitary sense: the provability of an arbitrary instance.
Some of our earliest experiences of the conclusive force of an argument come from school mathematics: Faced with a mathematical proof, however we try to twist the matter, there is no possibility of denying the conclusion once the premisses have been accepted.
Behind the examples from mathematics, there is a more general pattern of ‘demonstrative arguments’ that is studied in the science of logic. Logical reasoning is applied at all levels, from everyday life to the most advanced sciences. As an example of the former, assume that under some specific conditions, call them A, something, call it B, necessarily follows. Assume further that the conditions A are fulfilled. To deny B under these circumstances would lead to a contradiction, so that either B has to be accepted or at least one of the assumptions revised – or at least that is what the fittest thinker would do to survive.
A remarkable level of complexity is achieved in everyday logical reasoning, even if the principles behind it remain intuitive. We begin our analysis of logical reasoning by the observation that the forms of such reasoning are connected to the forms of linguistic expression used and that these forms have to be made specific and precise in each situation. When this is done, it turns out that a rather limited set of first principles is sufficient for the representation of any logical argument.
When I was little and Christmas time was approaching, we children knew that there would be two kinds of presents: the soft packages that contained useful but unexciting clothes, and the hard boxes that contained gorgeous new toys. I learned later that the same formula repeats itself often in life, and even in logic. There are the discussions about first principles: what rests on what, what comes first at the end of all analyses, and what it all means – and these are the useful but relatively unexciting soft packages. Then there is the box that is really interesting to open, and that is what I call the deductive machinery of logic – how it all actually works. Others have called it the inferential engine. I believe that logic should not be presented to us just in those soft packages – the hard box has to be there to be opened as well, so that we can find out how logical arguments function. It is a hands-on kind of learning in which one tries and retries things by oneself until the machinery runs smoothly. Then it is time to discuss the nature of the first principles.
The book begins with a linear form of proofs that I learned from Dag Prawitz' Swedish compendium ABC i Symbolisk Logik. Little did I think, back in 1973 when using that text for the first time, that my teaching of elementary logic would one day grow into a comprehensive presentation in the form of a book.
Similarly to classical propositional logic, the classical form of predicate logic has a simple semantics. Kripke semantics for intuitionistic predicate logic instead has the complication that the domain of individual objects is not given once and for all.
(a) The semantics of classical predicate logic. The semantics of classical propositional logic presented in Section 7.1 was based on the idea that in each concrete situation, the truth values of atomic formulas are determined. The formal presentation was in terms of valuations, i.e., assignments of truth values to the atomic formulas. In predicate logic we have a domain of individuals and the atomic formulas make statements about the properties of individuals and relations among them. The basic ideas of the semantics of classical predicate logic were already given in Section 8.2: As explained there, the schematic atomic formulas get interpreted in a given domain, and a universal formula ∀x A(x) is true under an interpretation if each of its instances A(a), A(b), A(c),… is true, and an existential formula ∃x A(x) is similarly true if there is some instance A(a) that is true under an interpretation.
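Over a finite domain, these truth conditions can be evaluated mechanically by going through the instances. The following is a minimal sketch, assuming the domain is given as a list and the formula A(x) as a Boolean function; the helper names `forall` and `exists` are illustrative, not from the text:

```python
def forall(domain, A):
    """forall x A(x) is true iff every instance A(a) is true."""
    return all(A(a) for a in domain)

def exists(domain, A):
    """exists x A(x) is true iff some instance A(a) is true."""
    return any(A(a) for a in domain)

# Interpret A(x) as 'x is even' over the domain {0, 1, 2, 3}:
D = [0, 1, 2, 3]
even = lambda x: x % 2 == 0

print(forall(D, even))  # False: the instance A(1) is false
print(exists(D, even))  # True: the instance A(0) is true
```

The next paragraph explains why this exhaustive procedure is unavailable for an infinite domain such as the natural numbers.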
If a domain is infinite, as in the case of the natural numbers, it is not possible to go through all the instances of a formula A(x) with a free variable. In the Frege–Gentzen explanation of the universal quantifier in Section 8.2, provability of ∀x A(x) required a proof of A(y) for an arbitrary y. This condition is stronger than the truth condition for a universal formula.
The linear variety of natural deduction makes it possible to construct derivations in steps, one after the other. On the other hand, we have not treated disjunction yet, and we have noticed that the normal form of derivations would not be transparent and simple in a linear arrangement of formulas. Both of these defects are corrected when we now turn to a study of Gentzen's original system of natural deduction for propositional logic. Formulas in derivations are arranged in a tree form, such that each formula is either an assumption or the conclusion of exactly one logical rule, and each formula except the endformula of the whole derivation is a premiss of exactly one logical rule. When we here talk about ‘each formula’, we mean more precisely each single formula occurrence in a rule instance in a derivation, but don't repeat that each time.
Tree derivations were in practice a novelty with Gentzen, and their widespread use in logic derives from his doctoral thesis (1934–5). He took the idea over from the work of Paul Hertz in the 1920s. The tree form shows ‘what depends on what’ in a derivation and makes it possible to transform the order of application of rules; this was the most central methodological novelty in Gentzen and soon led to spectacular results about the structure of proofs.
Predicate logic starts from propositional logic and adds to it two things: The atomic formulas receive an inner structure, and quantifiers are added, one for expressing generality, another for expressing existence.
The structure of the atomic formulas is as follows: We have some given collection of individuals, denoted as a whole by D and called the domain, and individuals in the domain, called objects and denoted by a, b, c,…, a1, a2,… etc. Each atomic formula gives a property of the objects in D, or a relation between several such objects. The notation and reading of atomic formulas are exemplified by the following:
P(a), object a has the property P(a)
Q(a, b), objects a and b stand in the relation Q(a, b) to each other
R(a, b, c), objects a, b, and c stand in the relation R(a, b, c) to each other
For a concrete example, let D consist of the natural numbers 0, 1, 2,…, and let P be the property of being a prime number. We can form atomic formulas by writing numbers in the argument place of P, say P(17), which is the proposition 17 is a prime number. Let Q be the order relation < between two natural numbers. Then Q(7, 5) is the proposition 7 is smaller than 5.
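This concrete interpretation can be sketched as executable predicates over the natural numbers; the naive primality test below is an illustration, not part of the text:

```python
def P(n):
    """P(n): n is a prime number (naive trial division)."""
    return n >= 2 and all(n % d != 0 for d in range(2, n))

def Q(m, n):
    """Q(m, n): m is smaller than n, the order relation <."""
    return m < n

print(P(17))    # True: 17 is a prime number
print(Q(7, 5))  # False: 7 is not smaller than 5
```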
We shall review the development of logic, beginning with Aristotle's syllogistic logic. Next the tradition of algebraic logic is described, through the work of George Boole, Ernst Schröder, and Thoralf Skolem. There follows a section on axiomatic logic with two phases, the early work of Gottlob Frege, Giuseppe Peano, and Bertrand Russell, and a second phase with David Hilbert and Paul Bernays, and up to Heyting's intuitionistic logic in 1930. That is the point right before Gentzen's development of natural deduction and sequent calculus. The emphasis is on how the logical systems work, i.e., on their deductive machinery.
Aristotle's deductive logic
Aristotle's system of deductive logic, also known as the ‘theory of syllogisms’, has been interpreted in various ways in the long time since it was conceived. The situation is not different from the reading of other chapters of the formal sciences of antiquity, such as Euclid's geometry and works of Archimedes. When Frege invented predicate logic, he finished the presentation proudly with a reconstruction of the Aristotelian forms of propositions, such as Every A is B that is interpreted as ∀x (A(x) ⊃ B(x)), with a universal quantification over some domain and the predicates A and B. Frege reproduced similarly Aristotelian inferences, such as the conclusion Every A is C obtained from the premisses Every A is B and Every B is C, in the way shown in Section 9.1.
We showed in Section 3.7(a) that the law of double negation, ¬¬ A ⊃ A, is not derivable in intuitionistic logic. The proof of underivability was done with formal detail in Section 4.4, example (d). The difference between A and ¬¬ A was explained in Section 3.7(c): The former is a direct proposition, the latter expresses the impossibility of something negative, the best example being direct existence against the impossibility of non-existence. The former can be established by showing an object with a required property, the latter by showing that it is impossible that no object has the property.
One reason for the natural tendency to accept the law of double negation, or the related law of excluded middle, is as follows. If there is only a finite number of alternatives, the question of A or ¬ A can be decided by going through all of these. Say, if we claim, for natural numbers less than 100, that there are three and only three successive odd numbers that are all prime, we can go through all possible cases and find that 3, 5, and 7 are precisely those three numbers. More generally, the constructive interpretation of A ∨ ¬ A is that it expresses the decidability of A.
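The finite search that decides this claim can be carried out directly; the following Python check is an illustration, not part of the text:

```python
def is_prime(n):
    """Naive primality test by trial division."""
    return n >= 2 and all(n % d != 0 for d in range(2, n))

# Triples of successive odd numbers n, n+2, n+4 below 100 in which
# all three members are prime:
triples = [(n, n + 2, n + 4)
           for n in range(1, 100, 2)
           if n + 4 < 100 and all(is_prime(k) for k in (n, n + 2, n + 4))]

print(triples)  # [(3, 5, 7)]: the unique such triple
```

Going through all cases in this way is exactly the decision procedure that the finiteness of the domain makes possible.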
Logical reasoning proceeds from given assumptions to some sought conclusion. The essence of assumptions is that they are hypothetical so that it is not determined if they hold, and the point with the steps of reasoning is that they produce correct conclusions whenever the assumptions are correct. These steps are twofold: In one direction, we analyse the assumptions into their simpler parts, in another direction, we look at the conditions from which the sought-for conclusion can be synthesized. The aim is to make these ends meet. Some examples lead us to a small collection of basic steps and it turns out that all logical arguments based on the connectives can be reproduced as combinations of the basic steps.
Steps in proofs
Consider our bather in Cap Breton. The argument was: We have assumptions of the forms A ⊃ B and ¬ B. Now A is added to these assumptions, and a contradiction follows. The argument can be presented as a succession of steps each one of which is in itself hard to doubt. We write the steps one after another together with a justification at right:
Example argument 2.1. Proof of a contradiction from A ⊃ B, ¬ B, and A.
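The steps might be laid out as follows; this is a reconstruction in the linear style just described, with illustrative numbering and justification wording:

```latex
\begin{array}{lll}
1. & A \supset B & \text{assumption} \\
2. & \neg B      & \text{assumption} \\
3. & A           & \text{assumption} \\
4. & B           & \text{from 1 and 3} \\
5. & \bot        & \text{contradiction from 2 and 4}
\end{array}
```

Each step either records an assumption or draws from earlier lines a conclusion that is in itself hard to doubt.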
The explanation of the notion of a proposition in Section 1.3 required that such propositions be complete declarative sentences that state a possible state of affairs. The notion of logical truth tries to capture the relation between such sentences and states of affairs, and to be formulated relative to classical and intuitionistic ways of reasoning, respectively. We start with the former because it is simpler, then explain the Kripke semantics of intuitionistic propositional logic. In the final section, a completeness proof is given for classical propositional logic that ties closely together the proof system of Chapter 6 and the standard ‘truth-table’ semantics of this chapter.
The semantics of classical propositional logic is based on a notion of absolute truth, whatever that may be. Specifically, each atomic proposition will be either true or false. The concept of truth in classical propositional logic is built on such an assumption:
Basic assumption about truth. The truth and falsity of atomic propositions in specific circumstances is determined in itself.
How this determination takes place, whether truth and falsity can be actually determined and known, etc., are questions from which this notion of truth abstracts away: The different possible states of affairs are represented abstractly, so that each of the given atomic formulas P1,…, Pn is assigned a truth value, either the value true that is abbreviated as t or the value false that is abbreviated as f.
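Such assignments of truth values can be sketched concretely; representing a valuation as a dictionary from atomic formulas to Boolean values is an illustrative encoding, not the text's notation:

```python
from itertools import product

# One valuation: an assignment of a truth value (t or f, here True or
# False) to each atomic formula.
valuation = {"P1": True, "P2": False}

# With n atomic formulas there are 2**n possible valuations, one for
# each abstractly represented state of affairs:
atoms = ["P1", "P2"]
valuations = [dict(zip(atoms, values))
              for values in product([True, False], repeat=len(atoms))]

print(len(valuations))  # 4, i.e. 2**2
```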