Anyone who has lectured on the construction of the number systems has faced a problem with definition. For the most obvious example, do you define real numbers as Dedekind cuts, or as Cauchy sequences, or more simple-mindedly as infinite decimals, or in some other way? And when you have defined them, how do you go about ensuring, as painlessly as possible, that the set of rational numbers is a subset of the set of real numbers?

To take a simpler example, consider constructing the integers from the natural numbers. The best-known approach is to construct first the set of ordered pairs of natural numbers, where the pair (*a,b*) will eventually stand for the integer *a*−*b*. Since each integer has many labels, we have to define an equivalence relation on the set of ordered pairs, where two pairs are equivalent if they label the same integer (and some calculation on the back of an envelope says that this means, explicitly, that (*a,b*) and (*c,d*) are equivalent if *a*+*d* = *b*+*c* as natural numbers). Then we have to define addition, subtraction and multiplication of ordered pairs, and show that these operations respect equivalence, so they induce operations on the set of equivalence classes; this set is then to be taken as the set of integers. The original natural numbers have vanished, and a small amount of violence is required to re-insert them in the integers.
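The pairs construction can be sketched in a few lines of Python (a toy of my own devising; the class name `PairInt` is made up, and equality implements the equivalence relation rather than component-wise identity):

```python
# An "integer" is a pair (a, b) of natural numbers, standing for a - b.
# Two pairs (a, b) and (c, d) are equivalent iff a + d == b + c.

class PairInt:
    def __init__(self, a, b):
        self.a, self.b = a, b  # stands for the integer a - b

    def __eq__(self, other):
        # the equivalence relation, not literal equality of components
        return self.a + other.b == self.b + other.a

    def __add__(self, other):
        return PairInt(self.a + other.a, self.b + other.b)

    def __sub__(self, other):
        # (a - b) - (c - d) = (a + d) - (b + c)
        return PairInt(self.a + other.b, self.b + other.a)

    def __mul__(self, other):
        # (a - b)(c - d) = (ac + bd) - (ad + bc)
        return PairInt(self.a * other.a + self.b * other.b,
                       self.a * other.b + self.b * other.a)

# (3, 0) and (5, 2) both label the integer 3:
assert PairInt(3, 0) == PairInt(5, 2)
# (2, 5) stands for -3, and 3 + (-3) = 0:
assert PairInt(3, 0) + PairInt(2, 5) == PairInt(0, 0)
```

That the operations are well defined on equivalence classes is exactly the fact that they can be computed from any representative pair; the sketch silently relies on it.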

A more simple-minded approach is to say that an integer is of one of three forms: a natural number, zero, or of the form −*n*, where *n* is a natural number. This is certainly closer to how we think about integers! Now the natural numbers obviously form a subset of the integers. But the definitions are a pain. Addition, by my reckoning, requires a definition divided into thirteen separate cases. (Each of the summands may be positive, zero, or negative; and if they have opposite signs, then a further case division is required.) I do not know how many different cases would be required to verify the associative law!
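To see the case explosion concretely, here is a sketch (my own, with a made-up tagged representation) of just the addition operation for the three-form definition; the split on the signs of the two summands, and the further comparison of magnitudes when the signs are opposite, are already visible:

```python
# An integer is ('pos', n), 'zero', or ('neg', n), with n a positive natural.

def add(x, y):
    if x == 'zero':
        return y
    if y == 'zero':
        return x
    sx, m = x
    sy, n = y
    if sx == sy:                 # same sign: add the magnitudes
        return (sx, m + n)
    # opposite signs: three further sub-cases on the magnitudes
    if m > n:
        return (sx, m - n)
    if m < n:
        return (sy, n - m)
    return 'zero'

assert add(('pos', 2), ('neg', 5)) == ('neg', 3)
assert add(('neg', 4), ('pos', 4)) == 'zero'
```

Verifying the associative law for this definition would mean checking it case by case across all these branches, which is the pain the paragraph above complains of.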

One person who faced a more serious version of this difficulty was John Conway, in his definition of what he originally called “numbers”, but which, since Donald Knuth’s account of them, have become known as “surreal numbers”. This redefinition process, which occurs finitely many times in the standard construction of the number systems, occurs infinitely often in Conway’s definition. In the book *On Numbers and Games*, he chafes against this difficulty. Indeed, he added an appendix to the first part (on Numbers) in the silver jubilee edition, which is worth quoting here in some detail.

… mathematics has now reached the stage where formalisation within some particular axiomatic set theory is irrelevant, even for foundational studies. It should be possible to specify conditions on a mathematical theory which would suffice for embeddability within ZF [Zermelo–Fraenkel set theory] (supplemented by additional axioms of infinity if necessary), but which do not otherwise restrict the possible constructions in the theory. … This appendix is in fact a cry for a Mathematicians’ Liberation Movement!

Among the permissible kinds of construction we should have:

- Objects may be created from earlier objects in any reasonably constructive fashion.
- Equality among the created objects can be any desired equivalence relation.

He says that set theory would be such a theory, and (for example) ordered pairs would be described by their rules of equality (that is, that (*x,y*) = (*z,t*) if and only if *x* = *z* and *y* = *t*), rather than requiring set-theoretical cleverness such as defining (*x,y*) to be {{*x*},{*x,y*}}.
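Kuratowski’s trick can be rendered directly in Python (a sketch of my own, using frozensets so that sets can be elements of sets; the name `kpair` is made up). The point is that the defining property of ordered pairs then falls out of ordinary set equality:

```python
# Kuratowski's definition of the ordered pair: (x, y) := {{x}, {x, y}}.
def kpair(x, y):
    return frozenset({frozenset({x}), frozenset({x, y})})

# The characteristic property — (x, y) = (z, t) iff x = z and y = t —
# is now a consequence of set equality rather than a postulate:
assert kpair(1, 2) == kpair(1, 2)
assert kpair(1, 2) != kpair(2, 1)
# The degenerate case (x, x) collapses to {{x}}, as the definition dictates:
assert kpair(3, 3) == frozenset({frozenset({3})})
```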

He goes on:

I hope it is clear that this proposal is not of any particular theory as an alternative to ZF (such as a theory of categories, or of the numbers or games considered in this book). What is proposed is instead that we give ourselves the freedom to create arbitrary mathematical theories of these kinds, but prove a metatheorem which ensures once and for all that any such theory could be formalised in terms of any of the standard foundational theories.

The situation is analogous to the theory of vector spaces. Once upon a time these were collections of *n*-tuples of numbers, and the interesting theorems were those that remained invariant under linear transformations of these numbers. Now even the initial definitions are invariant, and vector spaces are defined by axioms rather than as particular objects. However, it is proved that every vector space has a base, so that the new theory is much the same as the old. But now no particular base is distinguished, and usually arguments which use particular bases are cumbrous and inelegant compared to arguments directly in terms of the axioms.

We believe that mathematics itself could be founded in an invariant way … No particular axiomatic theory … would be needed, and indeed attempts to force arbitrary theories into a single formal straitjacket will probably continue to produce unnecessarily cumbrous and inelegant contortions.

If I understand him aright, David Eppstein, in his comment on the post on the definition of a graph, is saying something similar, and would probably agree with Conway’s viewpoint. He says, “A graph is a thing that behaves like a graph — implementation details aren’t relevant, in the same way that Chrome and Safari and Firefox are all web browsers even though their underlying code is completely different from each other, or in the same way that a permutation group and a group given by a presentation are both groups”.

Although there is a beguiling similarity between an axiom and a definition, there is a difference in the kind of authority exerted. Take the example given of an ordered pair defined as an unordered pair of sets (introduced, I believe, by Kuratowski). As an axiom it does little more than rob the ordered pair of its Zuhandenheit (its readiness-to-hand). However, as a definition, or rather a re-definition, it leads us up and away, providing a first stepping stone (a flagstone?) onto a flag of subsets as an ordered set. It isn’t that a permutation is just or only such a chain, but that it is also such a chain. And though you can always throw away the ladder, it is a necessary preliminary for making the climb.

As I see it, Conway is trying to separate these two functions by something a bit like a “compiler compiler”. You could tell it what foundations to use (ZF or whatever), and what you want to construct (ordered pairs, surreal numbers or whatever), and it would produce the definitions and proofs making up a valid construction for your object. Of course nobody would then read that; they would simply use the properties of the object (which is what Conway thinks preferable).

I mentioned before my picture of mathematics as a building on a hillside; you enter at the ground floor and you can go down to the foundations or up to the upper floors. Once a secure stairway up from the basement has been installed, you have the option of simply trusting it and going upstairs (as probably many applied mathematicians do).

Eppstein’s remark echoes the principle of pragmatic definition that C.S. Peirce enunciated in his maxim of pragmatism, or pragmatic maxim, variations on which theme I enumerated here. My own favorite variant runs as follows:

Consider what effects that might conceivably have practical bearings you conceive the objects of your conception to have. Then, your conception of those effects is the whole of your conception of the object.

From a mathematical point of view the pragmatic maxim has the character of a closure principle combined with a representation principle. Here is a rough sketch I wrote a while back on the closure aspect:

☞ Pragmatic Maxim as Closure Principle
