As an undergraduate I was taught to multiply two numbers with the help of log tables, using the formula
$$\log xy = \log x + \log y.$$
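The log-table procedure can be sketched in a few lines of Python (a hypothetical illustration; the table lookups are replaced here by `math.log` and `math.exp`):

```python
import math

def multiply_via_logs(a, b):
    """Multiply two positive numbers the way log tables did:
    look up the logarithms, add them, and take the antilogarithm."""
    # log(a*b) = log(a) + log(b), so a*b = exp(log(a) + log(b))
    return math.exp(math.log(a) + math.log(b))

# Example: 6 * 7; rounding absorbs the small floating-point error,
# much as one rounded to the precision of the printed table.
print(round(multiply_via_logs(6, 7)))
```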
Having graduated to teach calculus to Engineers, I learned that log tables were to be replaced by slide rules. It was then that I made the fateful decision that there was no need for me to learn how to use this tedious device, as I could always rely on the students to perform the necessary computations. In the course of time, slide rules were replaced by pocket calculators and personal computers, but I stuck to my original decision.
My computer phobia did not prevent me from taking an interest in the theoretical question: what is a computation? This question goes back to Hilbert's 10th problem (see Browder), which asks for an algorithm, or computational procedure, to decide whether any given polynomial equation is solvable in integers. It quickly leads to the related question: which numerical functions f : ℕⁿ → ℕ are computable?
While Hilbert's 10th problem was only resolved in 1970, this related question had some earlier answers, of which I shall single out the following three:
(1) f is recursive (Gödel, Kleene),
(2) f is computable on an abstract machine (Turing, Post),
(3) f is definable in the untyped λ-calculus (Church, Kleene).
These tentative answers were shown to be equivalent by Church and Turing [1936–7].
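Answer (3), definability in the untyped λ-calculus, can be illustrated with Church numerals, here transcribed into Python's lambda notation as a stand-in for the pure λ-calculus (the names `ZERO`, `SUCC`, `MULT`, and the conversion helpers are my own, for illustration only):

```python
# A Church numeral n is the function that applies its first argument
# n times to its second: n = λf.λx. f(f(...f(x)...)).
ZERO = lambda f: lambda x: x
SUCC = lambda n: lambda f: lambda x: f(n(f)(x))
ADD  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))
MULT = lambda m: lambda n: lambda f: m(n(f))

def church(k):
    """Encode an ordinary integer k as a Church numeral."""
    n = ZERO
    for _ in range(k):
        n = SUCC(n)
    return n

def to_int(n):
    """Decode a Church numeral back to an ordinary integer."""
    return n(lambda i: i + 1)(0)

# 3 * 4, computed entirely by function application:
print(to_int(MULT(church(3))(church(4))))
```

In this sense a numerical function is λ-definable when some λ-term carries the Church numerals for its arguments to the Church numeral for its value, which is the notion Church and Kleene showed to coincide with (1) and (2).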
I shall discuss here some of the more recent developments of these notions of computability and their relevance to linguistics and logic. I hope to be forgiven for dwelling on some of the work I have been involved with personally, with greater emphasis than is justified by its historical significance.