Many of my recent posts centre around the concept of computability so I thought I would step back today and give a review of the topic for the uninitiated. Obviously, there are multiple textbooks on the topic so I won’t be bringing anything new. However, I would like to focus on one small aspect that is generally glossed over in books. For me, the main thing to take away from computability is that it involves functions of integers. Essentially, a computation is something that maps an integer to another integer. The Church-Turing thesis states that all forms of computation are equivalent to a Turing machine. Hence, the lambda calculus, certain cellular automata, and your MacBook have the same computational capabilities as a Turing machine (actually, your MacBook is finite so it has less capability, but it could be extended arbitrarily to be Turing complete). The thesis cannot be formally proven but it seems to hold.
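To make the "integers to integers" point concrete, here is a small illustrative sketch of my own (not from any textbook): the Cantor pairing function reversibly encodes a pair of natural numbers as a single natural number, so even a function of two integer arguments can be repackaged as a map from one integer to another. The function names are my own choices.

```python
def pair(x, y):
    """Cantor pairing function: a bijection from N x N to N."""
    s = x + y
    return s * (s + 1) // 2 + y

def unpair(z):
    """Inverse of `pair`: recover (x, y) from a single natural number."""
    s = 0
    while (s + 1) * (s + 2) // 2 <= z:  # largest s with s*(s+1)/2 <= z
        s += 1
    y = z - s * (s + 1) // 2
    return (s - y, y)

# A two-argument function viewed as a function of a single integer:
def add_encoded(z):
    x, y = unpair(z)
    return x + y
```

Iterating this trick encodes any finite tuple, and hence any finite data structure, as one integer, which is why computation on structured data is still computation on integers.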
The fact that computation is about the manipulation of integers has profound consequences, the main one being that it cannot deal directly with real numbers. Or to put it another way, computation is constrained to countable processes. If anything requires an uncountable number of operations then it is uncomputable. However, uncomputability, or undecidability as it is often called, is generally not presented in such a simple way. In many popular books like Gödel, Escher, Bach, the emphasis is on the magical aspect of it. The reason is that the proof of uncomputability, which is similar to Gödel’s proof of the First Incompleteness Theorem, relies on demonstrating that a certain self-referential function or program cannot exist, by use of Cantor’s diagonal slash argument that the reals are uncountable. In very simple nonrigorous terms, the proof works by considering a list of all possible computable functions f_i on all the integers j. This is the same as saying you have a list of all possible Turing machines T_i, on all possible initial states j. Now you suppose that one of the functions f_k takes the output of function f_j given input j and puts a negative sign in front, so f_k(j) = -f_j(j). The problem then comes if you suppose that the function acts on itself, because then f_k(k) = -f_k(k), which is a contradiction, and thus such a computable function cannot exist.
You can see that it is quite a simple argument showing that there exists at least one function that is not computable. There is nothing mystical about it. People seem to get wrapped up in the self-referential part, which at first blush seems to have some magical quasi-religious quality. However, the basic fact of computation is that there are only a countable number of computable functions but an uncountable number of possible functions. Although this result is simple, it is incomparably profound. It puts real limitations on what you can do. For example, whether or not a polynomial equation with integer coefficients has an integer solution (i.e. Hilbert’s tenth problem) is undecidable. That doesn’t mean you can’t solve particular examples; it just means there does not exist a general algorithm to solve all examples. The philosophical implications are also extremely deep. I’ve touched on them in previous posts and will undoubtedly return to them in future ones.
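The distinction between solving particular examples and having a general algorithm can be illustrated with a sketch of my own: a brute-force search settles particular Diophantine equations within a bound, but no algorithm decides all of them (that is Matiyasevich's theorem, which resolved Hilbert's tenth problem negatively). The function name and the bound are my own choices.

```python
from itertools import product

def find_integer_solution(poly, bound):
    """Search for an integer root of the two-variable polynomial `poly`
    with both variables in [-bound, bound]. This can settle particular
    instances, but no algorithm decides ALL instances (Matiyasevich)."""
    for x, y in product(range(-bound, bound + 1), repeat=2):
        if poly(x, y) == 0:
            return (x, y)
    return None  # inconclusive: a solution may still exist outside the bound

# The Pell-type equation x^2 - 2*y^2 - 1 = 0 has small solutions, e.g. (3, 2).
sol = find_integer_solution(lambda x, y: x * x - 2 * y * y - 1, 10)
```

The `None` branch is the crux: for a particular equation a failed bounded search tells you nothing, and undecidability says no clever replacement for brute force works in general.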
One thing you might ask is “don’t we use the real numbers in computations that involve calculus?” The answer is that one can still talk about real numbers and uncountable infinities in a countable or even finite world. For example, even if you lived in a finite world you could still discover that real numbers exist and are uncountable. This is what Cantor did. Essentially, the set of all subsets (i.e. the power set) of a countably infinite set (e.g. the integers) is uncountable. The set of all countably infinite strings of digits is also uncountable. You can also assume the existence of real numbers and ask what happens when you manipulate them. For example, when you solve a differential equation on the real numbers you are mapping the initial data to some final state. In most cases, we assume that the initial data is smooth or regular enough that it can be approximated by integers. However, we could still talk abstractly about functions of real numbers that cannot be approximated by integers. The classic example is the use of the Axiom of Choice, which I will get to in the future.
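As a minimal sketch of how "real number" computation reduces to countable manipulations, here is a fixed-step Euler integrator of my own: floating-point numbers are finite binary rationals, so every step of the "continuous" calculation is really a map between integer encodings.

```python
def euler(f, y0, t0, t1, steps):
    """Fixed-step Euler integration of dy/dt = f(t, y) from t0 to t1."""
    h = (t1 - t0) / steps
    t, y = t0, y0
    for _ in range(steps):
        y += h * f(t, y)  # each float operation acts on finite binary
        t += h            # rationals, i.e. countable (indeed finite) data
    return y

# dy/dt = y with y(0) = 1: the result approximates e = 2.71828... at t = 1.
approx_e = euler(lambda t, y: y, 1.0, 0.0, 1.0, 100_000)
```

The approximation works precisely because smooth initial data and smooth dynamics are well captured by countable approximants; it is the functions that escape such approximation that lie beyond computation.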
2011-9-2. Typos corrected.