Gracious Living

Ordinal Arithmetic
December 7, 2010, 21:03
Filed under: Math, Set Theory

Freeing yourself of the bounds of Peano arithmetic, you come upon a number of different routes to take.  You want the addition and multiplication operations you defined to make sense in a larger scope, but what this scope is varies depending on your needs.  The standard thing to do is to start “closing” the ordered semiring $\mathbb{N}$ in a number of different ways: group closure by adding negatives and reciprocals, metric closure by adding limits, or algebraic closure by adding solutions to polynomials.  But instead, we could just model the elements of $\mathbb{N}$ as the finite ordinals and use transfinite recursion to extend the operations to operations on ordinals.  Transfinite recursion is like induction on cagefighting elephant steroids.

Once you’ve done this, you lose some of your semiring properties, but you retain the order and gain the exciting new property of closure under limits.  This is ordinal arithmetic.  Building it will let us understand the structure of the ordinals more rigorously than we had before: there was an ordinal we called $\omega^2+1$, but what does this actually mean?

The axiom of induction in PA lets you prove something for every element of $\mathbb{N}$ if you know it holds for $0$ and it’s preserved by the successor function.  Another way of looking at this is that we can define a function on $\mathbb{N}$ just by defining it for $0$ and showing what happens when we take the successor.  This is how we defined addition: as a function $+_a:b\mapsto a+b$ for each number $a$, defined only by the rules $+_a(0)=a,+_a(Sb)=S(+_a(b))$.  This is usually called recursion.  Now, the extension of induction to the class of ordinals, $\mathbf{On}$, says that you can prove a statement for every ordinal if it holds for $0$ and is preserved by successors and limits, that is, if it holds for all ordinals $\beta$ less than some limit ordinal $\gamma$, then it holds for $\gamma$.
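
Restricted to the finite ordinals, this recursion is something you can actually run.  Here’s a minimal sketch in Python (the names `succ` and `add` are mine), mirroring the two rules $+_a(0)=a$ and $+_a(Sb)=S(+_a(b))$:

```python
def succ(n):
    """The successor function S."""
    return n + 1

def add(a, b):
    """+_a(b), defined only by the two recursion rules."""
    if b == 0:                      # +_a(0) = a
        return a
    return succ(add(a, b - 1))      # +_a(Sb) = S(+_a(b))
```

Nothing here “knows” what addition is; the recursion on the second argument does all the work, which is exactly why the definition extends once we say what to do at limits.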

In terms of defining things, we then have transfinite recursion: a function can be defined on the ordinals by defining $f(0)$, $f(S\alpha)$ in terms of $\alpha$, and $f(\gamma)$ in terms of $\{\beta:\beta<\gamma\}$ for $\gamma$ a limit ordinal.  What we want to do is extend arithmetic using transfinite recursion in the definitions rather than normal recursion.

(First a word about notation: we’d defined successors as $S\alpha=\alpha\cup\{\alpha\}$ and limits as $\gamma=\bigcup_{\beta<\gamma}\beta$.  But just like we don’t have to use the set model for natural numbers, we don’t need it for ordinals either.  We could axiomatize zero and the successor function with the first four Peano axioms, and then axiomatize limits by, for instance, defining $\lim_{\gamma\in A} \beta_\gamma$ to be the least upper bound of $\{\beta_\gamma:\gamma\in A\}$.  Henceforth I’ll be using “lim” for limit rather than the union symbol, for this reason.  Because the ordinals are well-ordered, it makes sense to take the limit of a set rather than a sequence, and indeed, most such sets will be uncountable.  In the following, let $\beta_\gamma$ be a set of ordinals indexed by $\gamma\in A$.)

So for addition we already have the first two pieces: $\alpha+0=\alpha,\alpha+S\beta=S(\alpha+\beta)$, and since the successor function is everywhere defined, this works on ordinals just like on naturals.  (We even have the nice theorem $\alpha+1=S\alpha$.)  Since the successor function “commutes” with the addition, it makes sense to have the limit function do the same: put $\alpha+\lim \beta_\gamma=\lim (\alpha+\beta_\gamma)$.  Now we can add limit ordinals: for example, $1+\omega=\lim_{n\in\mathbb{N}}(1+n)=\omega$.  $\omega+\omega$ will similarly be the limit of $\{\omega,\omega+1,\omega+2,\dotsc\}$.  Note that this is no longer commutative: $1+\omega=\omega$ but $\omega+1=S\omega>\omega$.  Likewise, $\lim\{\omega,1+\omega,\dotsc\}=\lim\{\omega\}=\omega\ne\omega+\omega$.  This is a consequence of the following statement:
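
For ordinals below $\varepsilon_0$ you can even compute this addition mechanically, using Cantor normal form (a representation this post hasn’t introduced: every such ordinal is a finite descending sum $\omega^{\beta_1}c_1+\dotsb+\omega^{\beta_k}c_k$).  Here’s a sketch in Python, with all names mine — an ordinal is a tuple of (exponent, coefficient) pairs, the exponents being ordinals themselves:

```python
ZERO = ()                  # the ordinal 0: an empty sum
ONE = ((ZERO, 1),)         # omega^0 * 1
OMEGA = ((ONE, 1),)        # omega^1 * 1

def less(a, b):
    """Compare two Cantor normal forms lexicographically, term by term."""
    for (ea, ca), (eb, cb) in zip(a, b):
        if ea != eb:
            return less(ea, eb)    # compare exponents recursively
        if ca != cb:
            return ca < cb
    return len(a) < len(b)         # a proper prefix is smaller

def add(a, b):
    """a + b: the leading term of b absorbs every smaller term of a."""
    if not b:
        return a
    lead = b[0][0]
    # exponents of a are strictly descending, so this keeps a prefix of a
    keep = [t for t in a if not less(t[0], lead)]
    if keep and keep[-1][0] == lead:    # merge terms with equal exponents
        return tuple(keep[:-1]) + ((lead, keep[-1][1] + b[0][1]),) + b[1:]
    return tuple(keep) + b
```

With this, `add(ONE, OMEGA)` comes out as `OMEGA` (the finite term is absorbed), while `add(OMEGA, ONE)` keeps both terms — exactly the non-commutativity above.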

Theorem (Fixed Point Theorem).  If $f$ is a function on the ordinals that preserves order and limits (so that $f(\lim \beta_\gamma)=\lim f(\beta_\gamma)$), and $\alpha$ is an ordinal, then there is a $\delta>\alpha$ such that $f(\delta)=\delta$.

Proof.  First note that:

• $f(0)\ge 0$ since everything is
• if $f(\alpha)\ge\alpha$ then $f(S\alpha)>f(\alpha)\ge\alpha$ since $f$ is order-preserving, so $f(S\alpha)\ge S\alpha$
• and if $f(\beta_\gamma)\ge\beta_\gamma$ for each $\gamma\in A$, then $f(\lim\beta_\gamma)=\lim f(\beta_\gamma)\ge\lim\beta_\gamma$ since $f$ is limit-preserving.

By transfinite induction, $f(\alpha)\ge\alpha$ for all $\alpha$.  In particular, this holds for the $\alpha$ we’re given, so assume that $f(\alpha)>\alpha$.  (If instead $f(\alpha)=\alpha$, run the argument starting from $S\alpha$, which satisfies $f(S\alpha)\ge S\alpha>\alpha$: either $S\alpha$ is already the fixed point we want, or the strict inequality holds and the argument below applies to it.)  Define a function $g:\mathbb{N}\rightarrow\mathbf{On}$ by ordinary recursion, with $g(0)=\alpha,g(Sx)=f(g(x))$.  Let $\beta=\lim_{x\in\mathbb{N}}g(x)$.  Then $f(\beta)=f(\lim g(x))=\lim f(g(x))=\lim g(x+1)=\beta$.  But $\beta\ge g(1)=f(\alpha)>\alpha$, as desired.  $\square$
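
For a concrete instance of the construction, take the normal function $f:\beta\mapsto 1+\beta$ and start at $\alpha=0$ (here $f(0)=1>0$, so the assumption holds):

```latex
g(0)=0,\quad g(1)=1+0=1,\quad g(2)=1+1=2,\quad\dotsc
\qquad \beta=\lim_{x\in\mathbb{N}}g(x)=\omega,
\qquad f(\beta)=1+\omega=\omega=\beta.
```

So the very first fixed point the theorem hands us for $1+{}$ is $\omega$ itself.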

We call a function that’s order- and limit-preserving normal; note that the limit-preserving condition is equivalent to being continuous in the order topology.  This theorem doesn’t only prove that normal functions have fixed points larger than any given ordinal; it shows us how to get them!

As a corollary, if we let $+_\alpha:\beta\mapsto\alpha+\beta$, then $+_\alpha$ is normal by definition, so it must have fixed points.  The theorem shows us that $\lim\{\beta,\alpha+\beta,\alpha+\alpha+\beta,\dotsc\}$ is such a fixed point, and letting $\beta=0$, we get $\lim\{0,\alpha,\alpha+\alpha,\dotsc\}$ as the first fixed point.

To describe this further, let’s introduce multiplication.  Again we have $\alpha\cdot 0=0,\alpha\cdot S\beta=\alpha\beta+\alpha$.  Now just let $\alpha(\lim\beta_\gamma)=\lim \alpha\beta_\gamma$.  Check that this is still normal as a function of $\beta$; indeed, we have $2\omega=\lim\{2,4,6,8,\dotsc\}=\omega$, but $\omega\cdot 2=\omega+\omega$.  I called this number “$2\omega$” in my first post on the ordinals just because it looks nicer — technically it’s $\omega 2$.  Likewise, we have $\omega\cdot n=\omega+\dotsb+\omega$ $n$ times, for $n<\omega$.  By normality, it follows that $\lim\{0,\omega,\omega+\omega,\dotsc\}=\omega\cdot(\lim\{0,1,2,\dotsc\})=\omega\cdot\omega$.  This is the first fixed point for addition by $\omega$, so $\omega+(\omega\cdot\omega)=\omega\cdot\omega$.
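
On the finite ordinals, the multiplication clauses are again runnable; a quick sketch (name mine), mirroring $\alpha\cdot 0=0$ and $\alpha\cdot S\beta=\alpha\beta+\alpha$:

```python
def mul(a, b):
    """a * b, by recursion on the second argument only."""
    if b == 0:                   # alpha * 0 = 0
        return 0
    return mul(a, b - 1) + a     # alpha * S(beta) = alpha*beta + alpha
```

The recursion runs entirely on the right argument — exactly the asymmetry that makes $\omega\cdot 2\ne 2\cdot\omega$ once limits enter the picture.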

If we want to name this, we’d have to define exponentiation.  Let $\alpha^0=1,\alpha^{S\beta}=\alpha^\beta\cdot\alpha,$ and $\alpha^{\lim\beta_\gamma}=\lim\alpha^{\beta_\gamma}$.  This doesn’t really deserve as much detail, but for example, $\omega^n=\omega\cdot\dotsb\cdot\omega$ $n$ times, for $n$ natural, and $n^\omega=\lim_{m\in\mathbb{N}}n^m=\omega$ for natural $n\ge 2$.

By now you probably notice a pattern.  Given a function $f_0:\mathbf{On}\times\mathbf{On}\rightarrow\mathbf{On}$, we define $f_1(\alpha,S\beta)=f_0(f_1(\alpha,\beta),\alpha)$, $f_1(\alpha,\lim\beta_\gamma)=\lim f_1(\alpha,\beta_\gamma)$, and $f_1(\alpha,0)=$ usually $1$.  We can use recursion to define $f_n$ for all $n\in\mathbb{N}$, and indeed, for all ordinals (see how?).  For example, if $f_0=+,f_1=\times,f_2=$ exponentiation, then $f_3(\alpha,\beta)$ will be $\alpha^{\alpha^{\dotsb^\alpha}}$ $\beta$ times, and so on.  All of these functions have $f_n(\alpha,\beta)$ a normal function of $\beta$ if $\alpha$ is fixed.  For $n\in\mathbb{N}$, they’re also “primitive recursive,” which means you can calculate them by using recursion, zero, and the successor function.  The fixed point of $f_\omega(\alpha,\cdot)$ is then fixed by every primitive recursive function.
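
On the finite ordinals the first few levels of this chain are computable; here is a sketch (the name `hyper` is mine) following the schema $f_{n+1}(\alpha,S\beta)=f_n(f_{n+1}(\alpha,\beta),\alpha)$, shown only for the levels up through exponentiation:

```python
def hyper(n, a, b):
    """f_n(a, b): f_0 is addition, f_1 multiplication, f_2 exponentiation."""
    if n == 0:
        return a + b                    # f_0 = +
    if b == 0:
        return 0 if n == 1 else 1      # base case: 0 for multiplication, "usually 1" above
    return hyper(n - 1, hyper(n, a, b - 1), a)   # f_n(a, Sb) = f_{n-1}(f_n(a, b), a)
```

The recursion depth grows very quickly, so this is only meant to exhibit the uniform schema, not to compute anything large.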

Let’s step away from the craziness and look at a few easy properties of addition and multiplication, which you can check with transfinite induction.  Namely:

• Both are associative.
• $0$ is the two-sided identity for addition, and $1$ for multiplication.
• Multiplication distributes over addition on the right: $\alpha(\beta+\gamma)=\alpha\beta+\alpha\gamma$, but $(\beta+\gamma)\alpha$ is not, in general, equal to $\beta\alpha+\gamma\alpha$.
• For exponentiation, associativity no longer holds: instead, we have $(\alpha^\beta)^\gamma=\alpha^{\beta\gamma}$.  When we write something like $\alpha^{\beta^\gamma}$, we mean that you evaluate the exponents from right to left, since otherwise the chain collapses into a single exponent, which is boring.
• BUT $\alpha^{\beta+\gamma}=\alpha^\beta\alpha^\gamma$.  Can you think of other properties of exponentiation that hold on $\mathbb{N}$?  Do they still hold here?
• Finally, $\alpha^1=\alpha,1^\alpha=1,\alpha^0=1$.  Also, $0^\alpha=0$ except for $0^0=1$.  (On $\mathbb{R}$, this usually isn’t defined, but for ordinals this is an automatic consequence of the definition.)
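
The law $\alpha^{\beta+\gamma}=\alpha^\beta\alpha^\gamma$ already produces nontrivial computations; for instance:

```latex
2^{\omega+1}=2^{\omega}\cdot 2^{1}=\omega\cdot 2,
\qquad\text{since}\qquad
2^{\omega}=\lim_{n\in\mathbb{N}}2^{n}=\omega.
```

And by the power-of-a-power rule above, $(2^\omega)^\omega=2^{\omega\cdot\omega}$.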

Lest I forget, if we accept the set model, there’s a reason for these operations (well, the first few) beyond extending PA.  Ordinals represent the order-isomorphism classes of well-ordered sets, remember?  So if $A$ and $B$ are well-ordered, we can well-order $A\sqcup B=(A\times\{0\})\cup (B\times\{1\})$ by using the usual orderings on each component and saying $(a,0)<(b,1)$ for all $a\in A,b\in B$.  Using brackets to indicate the corresponding ordinal, we have $[A]+[B]=[A\sqcup B]$.  So $1+\omega=[\{-1,0,1,2,\dotsc\}]$, which is clearly just $\omega$.  Likewise, $[A][B]=[A\times B]$ with the reverse dictionary ordering, which has $(a,b)\le (a^\prime,b^\prime)$ if $b<b^\prime$, or $b=b^\prime$ and $a\le a^\prime$.  This is really just repeated addition — it’s $[A\sqcup A\sqcup\dotsb\sqcup A]$ where you do the union $B$ times!  You can prove these if you’d like — just prove that these functions satisfy the addition and multiplication axioms about successors and limits.  By transfinite induction, there’s a unique function satisfying each set.

Some research for this post came from Keith Devlin’s The Joy of Sets, which I’m actually really enjoying, though the subject is pretty abstract.  I guess I’ll have to start listening to his radio show.  I’m going to do another post on cardinal arithmetic, and maybe some more on more advanced material.  Let me just warn you — this shit is BANANAS.  If you thought an ordinal-indexed chain of functions starting with addition and multiplication was weird, that’s just the tip of the iceberg, my friend.  Even basic operations on cardinals work so weirdly you can’t help laughing.