Gödel’s Mistake

This book connects Gödel’s and Turing’s theorems to differences between ordinary language and mathematics. Ordinary language allows distinctions between things, names, concepts, programs, algorithms, and problems, but mathematics does not. Gödel’s proof arises from category mistakes between things, names, and concepts, while Turing’s proof results from a categorical confusion between programs and descriptions.
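The program–description duality that Turing’s argument exploits can be shown in a short sketch (the example is mine, not the book’s): the very same string of symbols can be treated as an inert description or executed as a program.

```python
# Sketch (illustrative, not from the book): a program and its description
# are the same kind of entity -- a string of symbols.
source = "def double(x):\n    return 2 * x\n"

# Treated as a description: the string is inert data we can inspect.
description_length = len(source)

# Treated as a program: exec() turns that same string into running code.
namespace = {}
exec(source, namespace)
result = namespace["double"](21)

print(result)  # the string, now acting as a program, computes 42
```

Turing’s diagonal argument turns on exactly this: because a program is also a description, it can be handed to itself (or to a hypothetical halting decider) as input.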

If mathematics and computing could distinguish between things, names, concepts, programs, algorithms, and problems, the paradoxes of logic, mathematics, and computing would not exist. To make mathematics consistent and complete, mathematics must be redefined as a description of symbols rather than of objects. Symbols are things too, although things are not symbols: a symbol, by definition, has contextual and intentional properties, while an object only has possessed properties. The shift from objects to symbols therefore introduces the distinction between names, concepts, and things.
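The kind of paradox at stake can be sketched in executable form. If a predicate (a concept) can be named and so handed to itself as an argument (a thing) — my illustration, not the book’s — a Russell-style self-application never settles on a truth value:

```python
def russell(pred):
    # The predicate that holds of exactly those predicates that do
    # not hold of themselves; applying it to itself is the paradox.
    return not pred(pred)

try:
    verdict = russell(russell)   # asks: not russell(russell)?
except RecursionError:
    # The self-application never terminates; Python's stack gives out,
    # mirroring the fact that no consistent truth value exists.
    verdict = "undecidable"

print(verdict)
```

The runtime failure here stands in for the logical failure: the question `russell(russell)` is equivalent to its own negation.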

In this process, many common mathematical constructs, such as numbers, sets, logic, and space-time, undergo a radical revision. Space and time, for instance, have to be treated as domains of symbols with contextual and intentional properties rather than as containers of objects. Numbers, similarly, have to be redefined as representations of meanings (or types) rather than of quantities. Logic needs to institute a distinction between proof and truth. And set theory needs to define collections in a manner that is logically prior to (and independent of) the objects they collect. The book discusses the revisions to logic, set theory, space-time, and number theory required to build a consistent and complete mathematics.
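The proof/truth distinction mentioned above is already explicit in type-theoretic proof assistants, where a proposition and the evidence for it are objects of different kinds. A minimal Lean sketch of the general idea (my illustration, not the book’s formalism):

```lean
-- `P : Prop` is a statement; `hp : P` is a proof of it.
-- The statement and its proof live at different levels of the type
-- hierarchy, so "having a proof" is distinct from the proposition itself.
example (P Q : Prop) (hp : P) (hq : Q) : P ∧ Q := ⟨hp, hq⟩
```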

The book then relates the necessity of types to the role that consciousness plays in nature, arguing that the types used in distinguishing, ordering, and relating are properties of consciousness. On this view, the paradoxes of mathematics become stepping stones toward a new type-based foundation, one that reflects the observer’s type-perception capabilities in a formal language.