Exercise 9.3

Suppose that A is a set and {f_n}_{n ∈ ℤ₊} is a given indexed family of injective functions

f_n : {1, …, n} → A.

Show that A is infinite. Can you define an injective function f : ℤ₊ → A without using the choice axiom?

Answers

We defer the proof that A is infinite until we define an injective f : ℤ₊ → A, which we can do without using the choice axiom by using the principle of recursive definition.

Proof. First, let a_0 = f_1(1) ∈ A. Now consider any function g : S_n → A. If g = ∅ (which happens exactly when n = 1, since S_1 = ∅), set ρ(∅) = ρ(g) = a_0. Otherwise let I_g = {i ∈ S_{n+1} : f_n(i) ∉ g(S_n)}.

Suppose for the moment that I_g = ∅. Consider any x ∈ f_n(S_{n+1}), so that there is a k ∈ S_{n+1} where f_n(k) = x. Then it has to be that x = f_n(k) ∈ g(S_n), since otherwise we would have k ∈ I_g. Since x was arbitrary, this shows that f_n(S_{n+1}) ⊂ g(S_n). Thus the inclusion map h_1 : f_n(S_{n+1}) → g(S_n) (the identity on elements) is an injection. Clearly, g is a surjection from S_n onto its image g(S_n), so we can construct a particular injection h_2 : g(S_n) → S_n by Corollary 6.7. Lastly, f_n is an injection from S_{n+1} to f_n(S_{n+1}). Therefore h = h_2 ∘ h_1 ∘ f_n is an injection from S_{n+1} to S_n. Hence h is a bijection from S_{n+1} to h(S_{n+1}), which is a subset of S_n since S_n is the range of h. But S_n is a proper subset of S_{n+1} (as n ∈ S_{n+1} but n ∉ S_n), so h(S_{n+1}) is a proper subset of S_{n+1} as well, and h is a bijection from S_{n+1} onto a proper subset of itself. As S_{n+1} is finite, this violates Corollary 6.3, so we have a contradiction.

So it must be that I_g ≠ ∅, so that it is a non-empty set of positive integers and hence has a smallest element i. Simply set ρ(g) = f_n(i). It then follows from the principle of recursive definition that there is a unique f : ℤ₊ → A such that

f(1) = a_0,  f(n) = ρ(f|S_n) for n > 1.
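For concreteness, the following is a minimal computational sketch of this construction in Python, assuming each f_n is supplied as an ordinary function on {1, …, n} with hashable values; the names rho, build_f, and f_family are ours and only illustrate the recursion rule, they are not part of the proof.

    def rho(g_values, f_family, a0):
        # g_values lists g(1), ..., g(n-1), i.e. the function g on S_n.
        # Return a0 if g is the empty function; otherwise return f_n(i) for
        # the smallest i in {1, ..., n} with f_n(i) not already in g(S_n).
        n = len(g_values) + 1
        if n == 1:
            return a0
        f_n = f_family(n)             # the injection f_n : {1, ..., n} -> A
        seen = set(g_values)
        for i in range(1, n + 1):     # I_g is nonempty, so this always returns
            if f_n(i) not in seen:
                return f_n(i)
        raise AssertionError("I_g was empty; some f_n is not injective")

    def build_f(f_family, N):
        # Compute f(1), ..., f(N) via f(1) = a0 = f_1(1), f(n) = rho(f|S_n).
        a0 = f_family(1)(1)
        values = []                   # values[k-1] = f(k)
        for _ in range(N):
            values.append(rho(values, f_family, a0))
        return values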

We claim that this f is injective.

To see this, we first show that f(n) ∉ f(S_n) for all n ∈ ℤ₊. If n = 1, we have f(n) = f(1) = a_0 = f_1(1) and f(S_n) = f(∅) = ∅, so that clearly the result holds. If n > 1, then f(n) = ρ(f|S_n) = f_n(i) for some i ∈ I_{f|S_n}, since clearly S_n ≠ ∅ so that f|S_n ≠ ∅. Since i ∈ I_{f|S_n}, we have f(n) = f_n(i) ∉ (f|S_n)(S_n) = f(S_n), as desired.

From this it follows that f is injective. For consider any n, m ∈ ℤ₊ where n ≠ m. Without loss of generality, we can assume that n < m. Then clearly f(n) ∈ f(S_m) since n ∈ S_m (because n < m). However, by what was just shown, we have f(m) ∉ f(S_m), so it has to be that f(n) ≠ f(m). This shows f to be injective since n and m were arbitrary.

Lastly, since f : ℤ₊ → A is injective, it follows that f is a bijection from ℤ₊ to f(ℤ₊) ⊂ A. Hence f(ℤ₊) is infinite since ℤ₊ is, and since it is a subset of A, it has to be that A is infinite as well. □
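As a quick sanity check of the sketch above (not part of the proof), one can run build_f on a toy family of injections, say f_n(i) = 2i restricted to {1, …, n}, and confirm that the first several values of f are pairwise distinct:

    # Toy family: every f_n is i -> 2*i, injective on {1, ..., n}.
    f_family = lambda n: (lambda i: 2 * i)
    values = build_f(f_family, 6)
    print(values)                              # [2, 4, 6, 8, 10, 12]
    assert len(set(values)) == len(values)     # pairwise distinct, as proved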
